Compare commits

..

373 Commits

Author SHA1 Message Date
Niq Dudfield
fa524d91b0 fix: remove .mise.toml that conflicts with CI mise-action (#689) 2026-02-19 08:55:35 +07:00
tequ
e06b48d144 Support new STIs for sto_* HookAPI (#657) 2026-02-18 11:48:13 +10:00
tequ
445f28ed30 Add new keylets to util_keylet (#533) 2026-02-18 11:18:59 +10:00
Niq Dudfield
36a99d4abc Merge dev (309e517e7) into sync-2.4.0: docs + guard checker CI (#683)
* Hook API Refactor2: Amendment Guards (#621)

* Hook API Refactor3: Consolidate the Hook API definitions from Enum.h and ApplyHook.h into a single file. (#622)

* Hook API Refactoring / Unit Testing (#581)

* fix `Xahau Ledger` to `Xahau Network` (#651)

* Add GitHub Actions workflow for Guard Checker Build (#658)

* fix: update guard checker build path for directory restructure

* fix: update stale ripple include paths in hook headers

* fix(test): avoid deleted PublicKey default ctor in HookAPI test

* chore(levelization): update ordering after hook/protocol dependency change

---------

Co-authored-by: tequ <git@tequ.dev>
2026-02-18 11:17:27 +10:00
Niq Dudfield
a7900a7f36 Merge dev (d20927237) into sync-2.4.0: HookAPI refactor (#681)
* Hook API Refactor2: Amendment Guards (#621)

* Hook API Refactor3: Consolidate the Hook API definitions from Enum.h and ApplyHook.h into a single file. (#622)

* Hook API Refactoring / Unit Testing (#581)

* fix: update clang-format to v18 and fix include ordering

- Update verify-generated-headers CI to use clang-format 18 (matching
  clang-format.yml) instead of stale v10 which can't parse .clang-format
- Add .mise.toml for local clang-format 18 tooling
- Fix include ordering in cherry-picked files per clang-format 18

* chore: update levelization results for HookAPI changes

New loop: xrpl.hook <-> xrpld.app due to HookAPI.h including
Transaction.h from xrpld.app.

---------

Co-authored-by: tequ <git@tequ.dev>
2026-02-16 18:51:04 +10:00
tequ
94e5325ba9 Update CMake version to 3.25.3 in macOS workflow 2025-12-17 12:53:20 +09:00
tequ
1a3a072f30 Fix differences such as LedgerHash that occurred due to NetworkID in ltFeeSettings 2025-12-01 19:28:23 +09:00
tequ
7a8ff63064 fix InvalidTxFlags Amendment to default Yes 2025-10-29 16:49:37 +09:00
tequ
3a52ac66e4 Add ltORACLE for Remarks target (#562) 2025-08-18 16:17:49 +09:00
tequ
3391390af4 Conan Release Builder (2.4.0 sync) (#528) 2025-08-14 15:29:04 +09:00
tequ
92f80ecc78 Add Remit test to AMM Account 2025-07-19 19:18:03 +09:00
tequ
22710f7604 Optimize AccountDelete and Credentials tests, update tests priority 2025-07-19 04:49:25 +09:00
tequ
cc62b3cab0 Update Hook headers 2025-07-16 19:38:51 +09:00
tequ
5f7887f058 VoteBehavior::DefaultYes for new fix Amendments
- NFToken related fix Amendments remain as `DefaultNo`.
2025-07-14 19:27:09 +09:00
tequ
1556f35203 Add TSH processing for AMM, AMMClawback, Clawback, Oracle (#532)
* Add TSH processing for `AMM`, `AMMClawback`, `Oracle`, `Clawback`

* Add empty TSH processing for other transaction types

* Add AMMTsh tests
2025-07-10 23:26:32 +09:00
tequ
81ac5b1b97 Support Hook execution in simulate RPC (#531) 2025-07-05 14:32:42 +09:00
tequ
4bdf1a7024 add tests for DeepFreeze 2025-07-01 19:35:35 +09:00
tequ
ac8bfb5e33 Supported::No for featurePermissionedDomains 2025-07-01 16:08:01 +09:00
tequ
aceea296fe Supported::No for featureDynamicNFT 2025-07-01 16:01:27 +09:00
tequ
4d474bc6a3 Supported::No for featureCredentials 2025-07-01 15:54:07 +09:00
tequ
ea51512712 Supported::No for featureMPTokensV1 2025-07-01 15:22:52 +09:00
tequ
03a74aff6d Supported::No for featureNFTokenMintOffer 2025-07-01 15:14:08 +09:00
tequ
5b4659ea26 Supported::No for featureDID 2025-07-01 15:09:00 +09:00
tequ
03d9647179 Supported::No for featureXChainBridge 2025-07-01 14:52:24 +09:00
tequ
2fcf541e2c Combine AMM Amendments (#521)
* fixAMMv1_2
* fixAMMv1_1
* fixAMMOverflowOffer
* fixLPTokenTransfer
* suppress AMM test logs
* exclude `ltAMM` from `fixPreviousTxnID` Amendment
    - make `sfPreviousTxnID` and `sfPreviousTxnLgrSeq` required for ltAMM
2025-07-01 13:51:32 +09:00
tequ
e04fd61041 Combine XChainBridge Amendments (#523) 2025-06-30 19:12:33 +09:00
tequ
8368603760 Combine fixInnerObjTemplate Amendments (#524) 2025-06-30 18:14:04 +09:00
tequ
f237d515b0 fix to DefaultNo for featureDeletableAccounts 2025-06-30 17:15:46 +09:00
tequ
943192071c Disable instrumentation-build workflow (#530) 2025-06-30 16:54:11 +09:00
Niq Dudfield
a555a3efbc fix: prevent SOCI from linking ALL boost libraries (#529)
SOCI's vendored conanfile was using boost::boost which links against
every single Boost library (40+ libraries) when only boost::headers
is needed for SOCI's template specializations (boost::optional and
boost::gregorian::date support).

This was causing excessive linking and potential symbol conflicts,
particularly on Linux CI where boost_stacktrace_from_exception was
causing multiple definition errors with libstdc++.

Changed SOCI's boost dependency from boost::boost to boost::headers
since SOCI only needs Boost headers for its template specializations,
not the compiled libraries. The project already provides all necessary
Boost libraries through the ripple_boost target.

This reduces the linked libraries from 40+ down to just the ~14 that
the project actually uses, fixing the Linux CI build failures and
reducing binary size.

Note: The SOCI Conan recipe for Conan 2.0 already implements this
fix correctly.
2025-06-30 13:16:10 +09:00
tequ
bfdcc22f6e Reduce numFeatures for DID Amendments combine 2025-06-28 21:36:25 +09:00
Niq Dudfield
2cf4b3ec4c fix: remove vestigial -DBOOST_ASIO_DISABLE_CONCEPTS usage (#526) 2025-06-27 16:41:30 +09:00
Denis Angell
b4b6e4261b fix cmake & xrpl_core 2025-06-25 10:30:30 +02:00
Denis Angell
f1a1812e6e Update build-full.sh 2025-06-25 10:03:13 +02:00
Denis Angell
e43ea4962e fix cmake 2025-06-25 09:52:53 +02:00
Denis Angell
8e8e42046a Update build-full.sh 2025-06-25 09:26:48 +02:00
Denis Angell
901fec93a7 add DeepFreeze to trustTransferAllowed 2025-06-25 09:15:05 +02:00
tequ
75a4a72b3f fix release-builder, workflow building 2025-06-24 21:27:20 +09:00
tequ
17346c588d remove checkpatterns workflow 2025-06-24 19:57:41 +09:00
tequ
2e8e1b3940 Combine DID Amendments (#522)
fixEmptyDID -> featureDID
2025-06-23 21:13:52 +09:00
tequ
1997a53b77 Additional support for HookDefinition, HookState, ImportVLSequence at fixPreviousTxnID Amendment 2025-06-23 17:59:40 +09:00
Mark Travis
bfdaa3a5aa Log detailed correlated consensus data together (#5302)
Combine multiple related debug log data points into a single
message. Allows quick correlation of events that
previously were either not logged or, if logged, strewn
across multiple lines, making correlation difficult.
The Heartbeat Timer and consensus ledger accept processing
each have this capability.

Also guarantees that log entries will be written if the
node is a validator, regardless of log severity level.
Otherwise, the level of these messages is at INFO severity.
2025-06-20 15:30:36 +09:00
Mark Travis
be33576c65 fix: Acquire previously failed transaction set from network as new proposal arrives (#5318)
Reset the failure variable.
2025-06-20 15:12:58 +09:00
Bronek Kozicki
51dc38d77a Fix Replace assert with XRPL_ASSERT (#5312) 2025-06-20 15:12:20 +09:00
Bronek Kozicki
299b31154e fix: Remove 'new parent hash' assert (#5313)
This assert is known to occasionally trigger, without causing errors
downstream. It is replaced with a log message.
2025-06-20 14:58:59 +09:00
Ed Hennis
87bad9f0f6 Add logging and improve counting of amendment votes from UNL (#5173)
* Add logging for amendment voting decision process
* When counting "received validations" to determine quorum, count the number of validators actually voting, not the total number of possible votes.
2025-06-20 14:58:48 +09:00
Bart
3036bfb3aa docs: Revert peer port to 51235 (#5299)
Reverts the [port_peer] back to the legacy port 51235 rather than to the default port 2459, to avoid potentially inconveniencing existing operators.
2025-06-20 14:58:32 +09:00
Olek
370139bcc9 fix: Switch Permissioned Domain to Supported::yes (#5287)
Switch Permissioned Domain feature's supported flag from Supported::no to Supported::yes for it to be votable.
2025-06-20 14:58:20 +09:00
Bart
ec3c280ecb docs: Clarifies default port of hosts (#5290)
The current comment in the example cfg file incorrectly mentions both "may" and "must". This change fixes this comment to clarify that the default port of hosts is 2459 and that specifying it is therefore optional. It further sets the default port to 2459 instead of the legacy 51235.
2025-06-20 14:58:07 +09:00
Mark Travis
0bc83ed8f1 Log proposals and validations (#5291)
Adds detailed log messages for each validation and proposal received from the network.
2025-06-20 14:57:59 +09:00
Bart
a7438eccc5 Support canonical ledger entry names (#5271)
This change enhances the filtering in the ledger, ledger_data, and account_objects methods by also supporting filtering by the canonical name of the LedgerEntryType using case-insensitive matching.
2025-06-20 14:18:08 +09:00
Ed Hennis
9aa039e70b refactor: Change recursive_mutex to mutex in DatabaseRotatingImp (#5276)
Rewrites the code so that the lock is not held during the callback. Instead it locks twice, once before, and once after. This is safe due to the structure of the code, but is checked after the second lock. This allows mutex_ to be changed back to a regular mutex.
2025-06-20 14:17:06 +09:00
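
A minimal sketch of the lock-twice pattern this DatabaseRotatingImp change describes, in plain standard C++ with placeholder names (`Rotator`, `writableName_`) rather than the real class:

```cpp
#include <functional>
#include <mutex>
#include <string>

// Illustrative stand-in for DatabaseRotatingImp: lock to read state, release
// the lock while the callback runs, then lock again to write the result back.
// Because the mutex is never held across the callback, a plain std::mutex
// suffices (no recursion needed).
class Rotator
{
    std::mutex mutex_;
    std::string writableName_ = "db0";

public:
    void
    rotate(std::function<std::string(std::string const&)> const& callback)
    {
        std::string current;
        {
            std::lock_guard lock(mutex_);  // first lock: snapshot state
            current = writableName_;
        }                                  // unlocked while callback runs

        std::string const next = callback(current);

        std::lock_guard lock(mutex_);      // second lock: write state back
        if (writableName_ == current)      // re-check the assumption, as noted
            writableName_ = next;
    }
};
```
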
Bart
839dd92ee3 fix: Replace charge() by fee_.update() in OnMessage functions (#5269)
In PeerImp.cpp, if the function is a message handler (onMessage) or called directly from a message handler, then it should use fee_, since the charge function is called when the handler returns (OnMessageEnd). If the function is not a message handler, such as a job queue item, it should keep using charge.
2025-06-20 13:44:30 +09:00
Elliot Lee
3539537d19 docs: ensure build_type and CMAKE_BUILD_TYPE match (#5274) 2025-06-20 13:43:59 +09:00
code0xff
3ab975fcba chore: Fix small typos in protocol files (#5279) 2025-06-20 13:43:42 +09:00
Ed Hennis
cef8364d4a docs: Add a summary of the git commit message rules (#5283) 2025-06-20 13:43:03 +09:00
Olek
551653faf0 fix: Amendment to add transaction flag checking functionality for Credentials (#5250)
CredentialCreate / CredentialAccept / CredentialDelete transactions will check sfFlags field in preflight() when the amendment is enabled.
2025-06-20 13:42:18 +09:00
Donovan Hide
e846356de4 fix: Omit superfluous setCurrentThreadName call in GRPCServer.cpp (#5280) 2025-06-20 11:21:55 +09:00
Bronek Kozicki
cf9bc2b562 fix: Do not allow creating Permissioned Domains if credentials are not enabled (#5275)
If the permissioned domains amendment XLS-80 is enabled before credentials XLS-70, then the permissioned domain users will not be able to match any credentials. The changes here prevent the creation of any permissioned domain objects if credentials are not enabled.
2025-06-20 11:21:45 +09:00
Mayukha Vadari
246ab7ec95 fix: issues in simulate RPC (#5265)
Make `simulate` RPC easier to use:
* Prevent the use of `seed`, `secret`, `seed_hex`, and `passphrase` fields (to avoid confusing with the signing methods).
* Add autofilling of the `NetworkID` field.
2025-06-20 11:21:33 +09:00
Bart
1f168ddf50 Updates Conan dependencies (#5256)
This PR updates several Conan dependencies:
* boost
* date
* libarchive
* libmysqlclient
* libpq
* lz4
* onetbb
* openssl
* sqlite3
* zlib
* zstd
2025-06-20 11:21:21 +09:00
Shawn Xie
e17a2a6dc2 Amendment fixFrozenLPTokenTransfer (#5227)
Prohibits LPToken holders from sending LPToken to others if they have been frozen by one of the assets in AMM pool.
2025-06-20 11:04:00 +09:00
Ed Hennis
fbcbfc7647 Improve git commit hash lookup (#5225)
- Also get the branch name.
- Use rev-parse instead of describe to get a clean hash.
- Return the git hash and branch name in server_info for admin
  connections.
- Include git hash and branch name on separate lines in --version.
2025-06-20 10:58:40 +09:00
Vlad
5ca6214e68 Add deep freeze feature (XLS-77d) (#5187)
- spec: XRPLF/XRPL-Standards#220
- amendment: "DeepFreeze"
- implemented deep freeze spec to allow token issuers to prevent currency holders from being able to acquire more of these tokens.
- in combination with normal freeze, deep freeze effectively prevents any trust line balance change of a currency holder (except direct issuer <-> holder payments).
- added 2 new invariant checks to verify that deep freeze cannot be enacted without normal freeze and transfer is not frozen.
- made some fixes to existing freeze handling.

Co-authored-by: Ed Hennis <ed@ripple.com>
Co-authored-by: Howard Hinnant <howard.hinnant@gmail.com>
2025-06-20 10:52:31 +09:00
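
A minimal sketch of the first new invariant described above (deep freeze cannot be enacted without normal freeze), using placeholder flag values rather than the real ledger flag names:

```cpp
#include <cstdint>

// Placeholder trust-line flag bits for illustration only; the real flag names
// and values live in the protocol headers.
constexpr std::uint32_t kFreeze = 0x01;
constexpr std::uint32_t kDeepFreeze = 0x02;

// Invariant sketched from the description above: a side of a trust line may
// only be deep-frozen if it is also normally frozen.
bool
deepFreezeInvariantHolds(std::uint32_t flags)
{
    if (flags & kDeepFreeze)
        return (flags & kFreeze) != 0;
    return true;
}
```
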
Mayukha Vadari
2205228fa4 Add RPC "simulate" to execute a dry run of a transaction (#5069)
- Spec: https://github.com/XRPLF/XRPL-Standards/tree/master/XLS-0069d-simulate
- Also update signing methods to autofill fees better and properly handle transactions that require a non-standard fee.
2025-06-20 10:03:49 +09:00
Olek
da95739560 Fix CI unit tests (#5196)
- Add retries for rpc client
- Add dynamic port allocation for rpc servers
2025-06-20 02:28:17 +09:00
Michael Legleux
26058536dd Update secp256k1 library to 0.6.0 (#5254) 2025-06-20 01:35:15 +09:00
Bronek Kozicki
cf73b07574 Add [validator_list_threshold] to validators.txt to improve UNL security (#5112) 2025-06-20 01:34:38 +09:00
Bronek Kozicki
a9bd9b5f81 Switch from assert to XRPL_ASSERT (#5245) 2025-06-20 01:29:51 +09:00
tequ
606ca86627 Add missing space character to a log message (#5251) 2025-06-20 01:29:43 +09:00
Bronek Kozicki
6775cd59a6 Cleanup API-CHANGELOG.md (#5207) 2025-06-20 01:29:24 +09:00
Ed Hennis
4d36144b4b test: Unit tests to recreate invalid index logic error (#5242)
* One hits the global cache, one does not.
* Also some extra checking.

Co-authored-by: Bronek Kozicki <brok@incorrekt.com>
2025-06-20 01:29:02 +09:00
Sergey Kuznetsov
1d29747513 fix: Error consistency in LedgerEntry::parsePermissionedDomains() (#5252)
Update errors for parsing permissioned domains in the LedgerEntry handler to make them consistent with other parsers.
2025-06-20 00:56:42 +09:00
Ed Hennis
8a953dd215 fix: Use consistent CMake settings for all modules (#5228)
* Resolves an issue introduced in #5111, which inadvertently removed the
  -Wno-maybe-uninitialized compiler option from some xrpl.libxrpl
  modules. This resulted in new "may be used uninitialized" build
  warnings, first noticed in the "protocol" module. When compiling with
  derr=TRUE, those warnings became errors, which made the build fail.
* Github CI actions will build with the assert and werr options turned
  on. This will cause CI jobs to fail if a developer introduces a new
  compiler warning, or causes an assert to fail in release builds.
* Includes the OS and compiler version in the linux dependencies jobs in
  the "check environment" step.
* Translates the `unity` build option into `CMAKE_UNITY_BUILD` setting.
2025-06-20 00:56:18 +09:00
Valentin Balaschenko
8382cb8c5d Fix levelization script to ignore commented includes (#5194)
Check to ignore single-line comments during dependency analysis.
2025-06-20 00:45:59 +09:00
tequ
1ba8d426d3 Fix the flag processing of NFTokenModify (#5246)
Adds checks for invalid flags.
2025-06-20 00:45:51 +09:00
Mayukha Vadari
c268f745d5 Fix failing assert in connect RPC (#5235) 2025-06-20 00:45:43 +09:00
Olek
34745e1231 Permissioned Domains (XLS-80d) (#5161) 2025-06-20 00:45:28 +09:00
tequ
7095542e5a XLS-46: DynamicNFT (#5048)
This Amendment adds functionality to update the URI of NFToken objects as described in the XLS-46d: Dynamic Non Fungible Tokens (dNFTs) spec.
2025-06-20 00:27:50 +09:00
Shawn Xie
cb02cbbf2b prefix Uint384 and Uint512 with Hash in server_definitions (#5231) 2025-06-20 00:10:23 +09:00
Mayukha Vadari
bde601a8ce refactor: add rpcName to LEDGER_ENTRY macro (#5202)
The LEDGER_ENTRY macro now takes an additional parameter, which makes it easier to avoid missing including the new field in jss.h and to the list of account_objects/ledger_data filters.
2025-06-20 00:08:47 +09:00
Michael Legleux
8094dda803 fix: Add header for set_difference (#5197)
Fix `error C2039: 'set_difference': is not a member of 'std'`
2025-06-19 23:42:21 +09:00
Mayukha Vadari
bd2c5f2c6e fix: allow overlapping types in Expected (#5218)
For example, Expected<std::uint32_t, Json::Value> will now build even though there is an implicit conversion from unsigned int to Json::Value.
2025-06-19 23:42:12 +09:00
Gregory Tsipenyuk
a0348fb99d Add MPTIssue to STIssue (#5200)
Replace Issue in STIssue with Asset. STIssue with MPTIssue is only used in MPT tests.
Will be used in Vault and in transactions with STIssue fields once MPT is integrated into DEX.
2025-06-19 23:41:59 +09:00
Bronek Kozicki
c4c519ec53 Antithesis instrumentation improvements (#5213)
* Rename ASSERT to XRPL_ASSERT
* Upgrade to Anthithesis SDK 0.4.4, and use new 0.4.4 features
  * automatic cast to bool, like assert
* Add instrumentation workflow to verify build with instrumentation enabled
2025-06-19 23:27:49 +09:00
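
A rough illustration of the assertion style referenced in this and the earlier Antithesis entry (#5042); the macro here is a local stand-in that degrades to plain `assert`, not the project's actual definition:

```cpp
#include <cassert>

// Local stand-in for the instrumentation macro: the condition is contextually
// converted to bool (like assert) and the name string is what the tooling
// would report. Without the `voidstar` build option this is just assert.
#define XRPL_ASSERT(cond, name) assert((cond) && (name))

int
withdraw(int balance, int amount)
{
    XRPL_ASSERT(amount >= 0, "withdraw : non-negative amount");
    XRPL_ASSERT(balance >= amount, "withdraw : sufficient balance");
    return balance - amount;
}
```
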
John Freeman
a390b10045 Enforce levelization in libxrpl with CMake (#5111)
Adds two CMake functions:

* add_module(library subdirectory): Declares an OBJECT "library" (a CMake abstraction for a collection of object files) with sources from the given subdirectory of the given library, representing a module. Isolates the module's headers by creating a subdirectory in the build directory, e.g. .build/tmp123, that contains just a symlink, e.g. .build/tmp123/basics, to the module's header directory, e.g. include/xrpl/basics, in the source directory, and putting .build/tmp123 (but not include/xrpl) on the include path of the module sources. This prevents the module sources from including headers not explicitly linked to the module in CMake with target_link_libraries.
* target_link_modules(library scope modules...): Links the library target to each of the module targets, and removes their sources from its source list (so they are not compiled and linked twice).

Uses these functions to separate and explicitly link modules in libxrpl:

    Level 01: beast
    Level 02: basics
    Level 03: json, crypto
    Level 04: protocol
    Level 05: resource, server
2025-06-19 23:06:46 +09:00
Mayukha Vadari
dd15993ce8 refactor: clean up LedgerEntry.cpp (#5199)
Refactors LedgerEntry to make it easier to read and understand.
2025-06-19 21:49:47 +09:00
Ed Hennis
45c762c7d5 test: Add more test cases for Base58 parser (#5174)
---------
Co-authored-by: John Freeman <jfreeman08@gmail.com>
2025-06-19 19:57:12 +09:00
Ed Hennis
90154e923d test: Check for some unlikely null dereferences in tests (#5004) 2025-06-19 19:57:04 +09:00
Bronek Kozicki
1db3181add Add Antithesis intrumentation (#5042)
* Copy Antithesis SDK version 0.4.0 to directory external/
* Add build option `voidstar` to enable instrumentation with Antithesis SDK
* Define instrumentation macros ASSERT and UNREACHABLE in terms of regular C assert
* Replace asserts with named ASSERT or UNREACHABLE
* Add UNREACHABLE to LogicError
* Document instrumentation macros in CONTRIBUTING.md
2025-06-19 19:56:21 +09:00
Valentin Balaschenko
0719f6f1ef Reduce the peer charges for well-behaved peers:
- Fix an erroneous high fee penalty that peers could incur for sending
  older transactions.
- Update to the fees charged for imposing a load on the server.
- Prevent the relaying of internal pseudo-transactions.
  - Before: Pseudo-transactions received from a peer will fail the signature
    check, even if they were requested (using TMGetObjectByHash), because
    they have no signature. This causes the peer to be charged for an
    invalid signature.
  - After: Pseudo-transactions are put into the global cache
    (TransactionMaster) only. If the transaction is not part of
    a TMTransactions batch, the peer is charged an unwanted data fee.
    These fees will not be a problem in the normal course of operations,
    but should dissuade peers from behaving badly by sending a bunch of
    junk.
- Improve logging: include the reason for fees charged to a peer.

Co-authored-by: Ed Hennis <ed@ripple.com>
2025-06-19 17:05:23 +09:00
tequ
1bc34cb7cd update actions/upload-artifact to v4
https://github.blog/changelog/2024-04-16-deprecation-notice-v3-of-the-artifact-actions/
2025-06-19 16:11:34 +09:00
tequ
22ba71ebce levelization 2025-06-19 16:05:02 +09:00
tequ
06edda714f clang-format, ignore magic_enum.h 2025-06-19 15:56:33 +09:00
tequ
0151603701 fix ltDID type ID 2025-06-19 15:50:17 +09:00
tequ
91d64d5a61 Update ServerDefinition 2025-06-19 15:05:42 +09:00
Elliot Lee
782f828c37 refactor(AMMClawback): move tfClawTwoAssets check (#5201)
Move tfClawTwoAssets check to preflight and return
error temINVALID_FLAG

---------

Co-authored-by: yinyiqian1 <yqian@ripple.com>
2025-06-19 10:21:18 +09:00
Elliot Lee
996f5538d1 Add a new serialized type: STNumber (#5121)
`STNumber` lets objects and transactions contain multiple fields for
quantities of XRP, IOU, or MPT without duplicating information about the
"issue" (represented by `STIssue`). It is a straightforward serialization of
the `Number` type that uniformly represents those quantities.

---------

Co-authored-by: John Freeman <jfreeman08@gmail.com>
Co-authored-by: Howard Hinnant <howard.hinnant@gmail.com>
2025-06-19 10:21:07 +09:00
Olek
ff80878a3e fix: check for valid ammID field in amm_info RPC (#5188) 2025-06-19 10:20:59 +09:00
Mayukha Vadari
bdadfbfdc2 fix: include index in server_definitions RPC (#5190) 2025-06-19 10:20:44 +09:00
Bronek Kozicki
eff3d52820 Fix ledger_entry crash on invalid credentials request (#5189) 2025-06-19 10:19:13 +09:00
Shawn Xie
8a95efdc73 Replace Uint192 with Hash192 in server_definitions response (#5177) 2025-06-19 10:17:50 +09:00
Bronek Kozicki
4ba1987605 Fix potential deadlock (#5124)
* 2.2.2 changed functions acquireAsync and NetworkOPsImp::recvValidation to add an item to a collection under lock, unlock, do some work, then lock again to remove the item. It will deadlock if an exception is thrown while adding the item - before unlocking.
* Replace ScopedUnlock with scope_unlock.
2025-06-19 10:14:43 +09:00
Olek
2cdbda93f3 Introduce Credentials support (XLS-70d): (#5103)
Amendment:
    - Credentials

    New Transactions:
    - CredentialCreate
    - CredentialAccept
    - CredentialDelete

    Modified Transactions:
    - DepositPreauth
    - Payment
    - EscrowFinish
    - PaymentChannelClaim
    - AccountDelete

    New Object:
    - Credential

    Modified Object:
    - DepositPreauth

    API updates:
    - ledger_entry
    - account_objects
    - ledger_data
    - deposit_authorized

    Read full spec: https://github.com/XRPLF/XRPL-Standards/tree/master/XLS-0070d-credentials
2025-06-19 10:14:24 +09:00
Gregory Tsipenyuk
a6c499bd66 Fix token comparison in Payment (#5172)
* Checks only Currency or MPT Issuance ID part of the Asset object.
* Resolves temREDUNDANT regression detected in testing.
2025-06-19 00:21:21 +09:00
Gregory Tsipenyuk
ca79080fbc Add fixAMMv1_2 amendment (#5176)
* Add reserve check on AMM Withdraw
* Try AMM max offer if changeSpotPriceQuality() fails
2025-06-19 00:21:00 +09:00
Gregory Tsipenyuk
4ac26cd934 Fix unity build (#5179) 2025-06-19 00:00:06 +09:00
yinyiqian1
835e7b4905 Add AMMClawback Transaction (XLS-0073d) (#5142)
Amendment:
- AMMClawback

New Transactions:
- AMMClawback

Modified Transactions:
- AMMCreate
- AMMDeposit
2025-06-18 23:59:44 +09:00
Valentin Balaschenko
b3b00f66e4 docs: Add protobuf dependencies to linux setup instructions (#5156) 2025-06-18 23:48:28 +09:00
yinyiqian1
850dcc46d8 fix: reject invalid markers in account_objects RPC calls (#5046) 2025-06-18 23:48:09 +09:00
Bob Conan
ed729d96dd Update RELEASENOTES.md (#5154)
fix the typo "concensus" -> "consensus"
2025-06-18 23:41:27 +09:00
Gregory Tsipenyuk
69d24aff80 Introduce MPT support (XLS-33d): (#5143)
Amendment:
- MPTokensV1

New Transactions:
- MPTokenIssuanceCreate
- MPTokenIssuanceDestroy
- MPTokenIssuanceSet
- MPTokenAuthorize

Modified Transactions:
- Payment
- Clawback

New Objects:
- MPTokenIssuance
- MPToken

API updates:
- ledger_entry
- account_objects
- ledger_data

Other:
- Add += and -= operators to ValueProxy

Read full spec: https://github.com/XRPLF/XRPL-Standards/tree/master/XLS-0033d-multi-purpose-tokens

---------
Co-authored-by: Shawn Xie <shawnxie920@gmail.com>
Co-authored-by: Howard Hinnant <howard.hinnant@gmail.com>
Co-authored-by: Ed Hennis <ed@ripple.com>
Co-authored-by: John Freeman <jfreeman08@gmail.com>
2025-06-18 23:38:51 +09:00
John Freeman
91499d2b3d Consolidate definitions of fields, objects, transactions, and features (#5122) 2025-06-18 16:26:50 +09:00
John Freeman
fbeb806ffd Reformat code with clang-format-18 2025-06-18 14:13:10 +09:00
John Freeman
0619030184 Update pre-commit hook 2025-06-18 13:30:15 +09:00
John Freeman
f7d7300f49 Update clang-format settings 2025-06-18 13:30:01 +09:00
John Freeman
516e44bcfd Update clang-format workflow 2025-06-18 13:29:29 +09:00
Chenna Keshava B S
5624e89058 Expand Error Message for rpcInternal (#4959)
Validator operators have been confused by the rpcInternal error, which can occur if the server is not running in another process.
2025-06-18 13:27:24 +09:00
Elliot Lee
4c9ec9732b docs: clean up API-CHANGELOG.md (#5064)
Move the newest information to the top, i.e., use reverse chronological order within each of the two sections ("API Versions" and "XRP Ledger server versions")
2025-06-18 13:27:14 +09:00
Denis Angell
c3ab35d588 feat(SQLite): allow configurable database pragma values (#5135)
Make page_size and journal_size_limit configurable values in rippled.cfg
2025-06-18 13:27:04 +09:00
Vlad
a474f116d6 refactor: re-order PRAGMA statements (#5140)
The page_size will soon be made configurable with #5135, making this
re-ordering necessary.

When opening SQLite connection, there are specific pragmas set with
commonPragmas.

In particular, PRAGMA journal_mode creates the journal file and locks the
page_size; as of this commit, this sets the page size to the default
value of 4096. Coincidentally, the hardcoded page_size was also 4096, so
no issue was noticed.
2025-06-18 13:26:55 +09:00
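
A minimal sketch of the ordering this change enforces, using the SQLite C API directly; the key point is that the size-affecting pragma has to run before the journal mode fixes the page size:

```cpp
#include <sqlite3.h>

// Open a database and apply pragmas in the order described above: page_size
// first, because switching the journal mode locks the page size in place.
bool
openWithPragmas(char const* path, sqlite3** db)
{
    if (sqlite3_open(path, db) != SQLITE_OK)
        return false;

    // Size-affecting pragma first...
    sqlite3_exec(*db, "PRAGMA page_size=4096;", nullptr, nullptr, nullptr);
    // ...then the journal mode, which creates the journal and fixes the size.
    sqlite3_exec(*db, "PRAGMA journal_mode=WAL;", nullptr, nullptr, nullptr);
    return true;
}
```
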
Chenna Keshava B S
f9a3d843df fix(book_changes): add "validated" field and reduce RPC latency (#5096)
Update book_changes RPC to reduce latency, add "validated" field, and accept shortcut strings (current, closed, validated) for ledger_index.

`"validated": true` indicates that the transaction has been included in a validated ledger so the result of the transaction is immutable.

Fix #5033

Fix #5034

Fix #5035

Fix #5036

---------

Co-authored-by: Bronek Kozicki <brok@incorrekt.com>
2025-06-18 13:26:45 +09:00
luozexuan
0f30e50543 chore: fix typos in comments (#5094)
Signed-off-by: luozexuan <fetchcode@139.com>
2025-06-18 13:26:37 +09:00
Ed Hennis
b279c7ecdd test: Retry RPC commands to try to fix MacOS CI jobs (#5120)
* Retry some failed RPC connections / commands in unit tests
* Remove orphaned `getAccounts` function

Co-authored-by: John Freeman <jfreeman08@gmail.com>
2025-06-18 13:26:21 +09:00
John Freeman
c07b47ed9a docs: Update options documentation (#5083)
Co-authored-by: Elliot Lee <github.public@intelliot.com>
2025-06-18 13:25:11 +09:00
John Freeman
4da341b418 refactor: Remove dead headers (#5081) 2025-06-18 13:24:50 +09:00
John Freeman
afdfdaaac2 refactor: Remove reporting mode (#5092) 2025-06-18 13:24:40 +09:00
Scott Schurr
4780daebea Address rare corruption of NFTokenPage linked list (#4945)
* Add fixNFTokenPageLinks amendment:

It was discovered that under rare circumstances the links between
NFTokenPages could be removed.  If this happens, then the
account_objects and account_nfts RPC commands under-report the
NFTokens owned by an account.

The fixNFTokenPageLinks amendment does the following to address
the problem:

- It fixes the underlying problem so no further broken links
  should be created.
- It adds Invariants so, if such damage were introduced in the
  future, an invariant would stop it.
- It adds a new FixLedgerState transaction that repairs
  directories that were damaged in this fashion.
- It adds unit tests for all of it.
2025-06-18 12:32:59 +09:00
Bronek Kozicki
811b971169 Factor out Transactor::trapTransaction (#5087) 2025-06-18 12:22:04 +09:00
John Freeman
6b43f4439a Remove shards (#5066) 2025-06-18 12:17:28 +09:00
Bronek Kozicki
452aa6ec17 Update gcovr EXCLUDE (#5084) 2025-06-17 23:58:07 +09:00
Bronek Kozicki
eccf3e1d83 Fix crash inside OverlayImpl loops over ids_ (#5071) 2025-06-17 23:57:58 +09:00
Ed Hennis
a490e669e4 docs: Document the process for merging pull requests (#5010) 2025-06-17 23:57:41 +09:00
Scott Schurr
3f4f3f0054 Remove unused constants from resource/Fees.h (#4856) 2025-06-17 23:57:02 +09:00
Mayukha Vadari
81465605f9 fix: change error for invalid feature param in feature RPC (#5063)
* Returns an "Invalid parameters" error if the `feature` parameter is provided and is not a string.
2025-06-17 23:56:54 +09:00
Ed Hennis
b494ba91a1 Ensure levelization sorting is ASCII-order across platforms (#5072) 2025-06-17 23:56:45 +09:00
Ed Hennis
91f48caad6 fix: Fix NuDB build error via Conan patch (#5061)
* Includes updated instructions in BUILD.md.
2025-06-17 23:56:34 +09:00
yinyiqian1
516e03ee49 Disallow filtering account_objects by unsupported types (#5056)
* `account_objects` returns an invalid field error if `type` is not supported.
  This includes objects an account can't own, or which are unsupported by `account_objects`
* Includes:
  * Amendments
  * Directory Node
  * Fee Settings
  * Ledger Hashes
  * Negative UNL
2025-06-17 23:56:15 +09:00
Scott Schurr
8cac140648 chore: Add comments to SignerEntries.h (#5059) 2025-06-17 23:49:42 +09:00
Scott Schurr
3f49da334b chore: Rename two files from Directory* to Dir*: (#5058)
The names of the files should reflect the name of the Dir class.

Co-authored-by: Zack Brunson <Zshooter@gmail.com>
Co-authored-by: Ed Hennis <ed@ripple.com>
2025-06-17 23:46:50 +09:00
Denis Angell
51f1279c69 Update BUILD.md after PR #5052 (#5067)
* Document the need to specify "xrpld" and "tests" to build and test rippled.
2025-06-17 22:55:08 +09:00
John Freeman
4e37f8d57b Add xrpld build option and Conan package test (#5052)
* Make xrpld target optional

* Add job to test Conan recipe

* [fold] address review comments

* [fold] Enable tests in workflows

* [fold] Rename with_xrpld option

* [fold] Fix grep expression
2025-06-17 22:54:49 +09:00
dashangcun
e53a34973d chore: remove repeat words (#5053)
Signed-off-by: dashangcun <jchaodaohang@foxmail.com>
Co-authored-by: dashangcun <jchaodaohang@foxmail.com>
Co-authored-by: Zack Brunson <Zshooter@gmail.com>
2025-06-17 22:42:57 +09:00
yinyiqian1
12d84f9747 fix CTID in tx command returns invalidParams on lowercase hex (#5049)
* fix CTID in tx command returns invalidParams on lowercase hex

* test mixed case and change auto to explicit type

* add header cctype because std::tolower is called

* remove unused local variable

* change test case comment from 'lowercase' to 'mixed case'

---------

Co-authored-by: Zack Brunson <Zshooter@gmail.com>
2025-06-17 22:42:43 +09:00
Ed Hennis
6862ca2035 Invariant: prevent a deleted account from leaving (most) artifacts on the ledger. (#4663)
* Add feature / amendment "InvariantsV1_1"

* Adds invariant AccountRootsDeletedClean:

* Checks that a deleted account doesn't leave any directly
  accessible artifacts behind.
* Always tests, but only changes the transaction result if
  featureInvariantsV1_1 is enabled.
* Unit tests.

* Resolves #4638

* [FOLD] Review feedback from @gregtatcam:

* Fix unused variable warning
* Improve Invariant test const correctness

* [FOLD] Review feedback from @mvadari:

* Centralize the account keylet function list, and some optimization

* [FOLD] Some structured binding doesn't work in clang

* [FOLD] Review feedback 2 from @mvadari:

* Clean up and clarify some comments.

* [FOLD] Change InvariantsV1_1 to unsupported

* Will allow multiple PRs to be merged over time using the same amendment.

* fixup! [FOLD] Change InvariantsV1_1 to unsupported

* [FOLD] Update and clarify some comments. No code changes.

* Move CMake directory

* Rearrange sources

* Rewrite includes

* Recompute loops

* Fix merge issue and formatting

---------

Co-authored-by: Pretty Printer <cpp@ripple.com>
2025-06-17 22:42:28 +09:00
yinyiqian1
101e3b7883 fix "account_nfts" with unassociated marker returning issue (#5045)
* fix "account_nfts" with unassociated marker returning issue

* create unit test for fixing nft page invalid marker not returning error

add more test

change test name

create unit test

* [FOLD] accumulated review suggestions

* move BEAST check out of lambda function

---------

Authored-by: Scott Schurr <scott@ripple.com>
2025-06-17 22:38:25 +09:00
Scott Schurr
a4d8b8c6f9 fixInnerObjTemplate2 amendment (#5047)
* fixInnerObjTemplate2 amendment:

Apply inner object templates to all remaining (non-AMM)
inner objects.

Adds a unit test for applying the template to sfMajorities.
Other remaining inner objects showed no problems having
templates applied.

* Move CMake directory

* Rearrange sources

* Rewrite includes

* Recompute loops

---------

Co-authored-by: Pretty Printer <cpp@ripple.com>
2025-06-17 22:33:07 +09:00
tequ
c9a4a3d04c fix for current codebase 2025-06-17 21:45:05 +09:00
Pretty Printer
16d5e48940 Recompute loops 2025-06-17 20:32:54 +09:00
Pretty Printer
0f9f9e60e9 Rewrite includes 2025-06-17 20:32:41 +09:00
Pretty Printer
a05ee6fc64 Rearrange sources 2025-06-17 10:42:41 +00:00
Pretty Printer
6a643e5fb5 Rearrange sources 2025-06-17 19:16:40 +09:00
Pretty Printer
98c03c9091 Move CMake directory 2025-06-17 18:22:13 +09:00
John Freeman
118b9bdcf9 Add bin/physical.sh (#4997) 2025-06-17 17:44:53 +09:00
John Freeman
168a160a25 Prepare to rearrange sources: (#4997)
- Remove CMake module "MultiConfig".
- Update clang-format configuration, CodeCov configuration,
  levelization script.
- Replace source lists in CMake with globs.
2025-06-17 17:44:40 +09:00
Bronek Kozicki
57c3a80f48 Change order of checks in amm_info: (#4924)
* Change order of checks in amm_info

* Change amm_info error message in API version 3

* Change amm_info error tests
2025-06-17 13:31:33 +09:00
Scott Schurr
d566793e60 Add the fixEnforceNFTokenTrustline amendment: (#4946)
Fix interactions between NFTokenOffers and trust lines.

Since the NFTokenAcceptOffer does not check the trust line that
the issuer receives as a transfer fee in the NFTokenAcceptOffer,
if the issuer deletes the trust line after NFTokenCreateOffer,
the trust line is created for the issuer by the
NFTokenAcceptOffer.  That's fixed.

Resolves #4925.
2025-06-17 13:31:15 +09:00
Chenna Keshava B S
3dfbd80d96 Replaces the usage of boost::string_view with std::string_view (#4509) 2025-06-17 13:27:08 +09:00
Elliot Lee
1498c401e6 docs: explain how to find a clang-format patch generated by CI (#4521) 2025-06-17 13:20:21 +09:00
tequ
246c11a1e4 XLS-52d: NFTokenMintOffer (#4845) 2025-06-17 13:20:05 +09:00
todaymoon
bb2c1e77f8 chore: remove repeat words (#5041) 2025-06-17 12:54:28 +09:00
Alex Kremer
b47b915994 Expose all amendments known by libxrpl (#5026) 2025-06-17 12:54:18 +09:00
Scott Schurr
ac74dc0287 fixReducedOffersV2: prevent offers from blocking order books: (#5032)
Fixes issue #4937.

The fixReducedOffersV1 amendment fixed certain forms of offer
modification that could lead to blocked order books.  Reduced
offers can block order books if the effective quality of the
reduced offer is worse than the quality of the original offer
(from the perspective of the taker). It turns out that, for
small values, the quality of the reduced offer can be
significantly affected by the rounding mode used during
scaling computations.

Issue #4937 identified an additional code path that modified
offers in a way that could lead to blocked order books.  This
commit changes the rounding in that newly located code path so
the quality of the modified offer is never worse than the
quality of the offer as it was originally placed.

It is possible that additional ways of producing blocking
offers will come to light.  Therefore there may be a future
need for a V3 amendment.
2025-06-17 12:54:03 +09:00
Chenna Keshava B S
44a5bce661 Additional unit tests for testing deletion of trust lines (#4886) 2025-06-17 12:48:31 +09:00
Olek
1eb4ad0d55 Fix conan typo: (#5044)
Add missing comma in 'exports_sources'
2025-06-17 12:48:06 +09:00
Bronek Kozicki
8fea47447c Add new command line option to make replaying transactions easier: (#5027)
* Add trap_tx_hash command line option

This new option can be used only if replay is also enabled. It takes a transaction hash from the ledger loaded for replay, and will cause a specific line to be hit in Transactor.cpp, right before the selected transaction is applied.
2025-06-17 12:47:45 +09:00
John Freeman
c59bd29b61 Fix compatibility with Conan 2.x: (#5001)
Closes #4926, #4990
2025-06-17 12:42:56 +09:00
J. Scott Branson
5c0cbc3a53 Update SQLite3 max_page_count to match current defaults (#5114)
When rippled initiates a connection to SQLite3, rippled sends a "PRAGMA"
statement defining the maximum number of pages allowed in the database.
Update the max_page_count so it is consistent with the default for newer
versions of SQLite3. Increasing max_page_count is critical for keeping
full history servers online.

Fix #5102
2025-06-17 12:32:25 +09:00
Valentin Balaschenko
5f7d2c22c1 Track latencies of certain code blocks, and log if they take too long 2025-06-17 12:32:25 +09:00
John Freeman
b8cecfaa05 Use error codes throughout fast Base58 implementation 2025-06-17 12:32:25 +09:00
Mayukha Vadari
b168945e66 Improve error handling in some RPC commands 2025-06-17 12:32:24 +09:00
Alex Kremer
62812267a9 Add xrpl.libpp as an exported lib in conan (#5022) 2025-06-17 12:32:24 +09:00
Gregory Tsipenyuk
930830efdf Fix Oracle's token pair deterministic order: (#5021)
Price Oracle data-series logic uses `unordered_map` to update the Oracle object.
This results in different servers disagreeing on the order of that hash table.
Consequently, the generated ledgers will have different hashes.
The fix uses `map` instead to guarantee the order of the token pairs
in the data-series.
2025-06-17 12:32:23 +09:00
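
A minimal sketch of the determinism argument above, assuming the data-series is keyed by a (base, quote) pair; the container choice is the whole fix:

```cpp
#include <cstdint>
#include <map>
#include <string>
#include <utility>

// The data-series is keyed by (base, quote) token pair. std::map iterates in
// a well-defined key order on every server, so serializing pairs in
// iteration order produces identical bytes (and ledger hashes) everywhere.
// std::unordered_map gives no such guarantee, which is the bug described.
using TokenPair = std::pair<std::string, std::string>;
using PriceSeries = std::map<TokenPair, std::uint64_t>;  // pair -> scaled price

void
serializeInOrder(PriceSeries const& series, std::string& out)
{
    for (auto const& [pair, price] : series)
        out += pair.first + "/" + pair.second + ":" + std::to_string(price) + ";";
}
```
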
Gregory Tsipenyuk
617be9b241 Fix last Liquidity Provider withdrawal:
Due to the rounding, LPTokenBalance of the last
Liquidity Provider (LP), might not match this LP's
trustline balance. This fix sets LPTokenBalance on
last LP withdrawal to this LP's LPToken trustline
balance.
2025-06-17 12:32:23 +09:00
Gregory Tsipenyuk
7a9f56adb6 Fix offer crossing via single path AMM with transfer fee:
Single path AMM offer has to factor in the transfer in rate
when calculating the upper bound quality and the quality function
because single path AMM's offer quality is not constant.
This fix factors in the transfer fee in
BookStep::adjustQualityWithFees().
2025-06-17 12:32:22 +09:00
Gregory Tsipenyuk
24b3ad94bb Fix adjustAmountsByLPTokens():
The fix is to return the actual adjusted lp tokens and amounts
by the function.
2025-06-17 12:32:22 +09:00
Gregory Tsipenyuk
611bb454e8 Add the fixAMMOfferRounding amendment: (#4983)
* Fix AMM offer rounding and low quality LOB offer blocking AMM:

A single-path AMM offer with account offer on DEX, is always generated
starting with the takerPays first, which is rounded up, and then
the takerGets, which is rounded down. This rounding ensures that the pool's
product invariant is maintained. However, when one of the offer's side
is XRP, this rounding can result in the AMM offer having a lower
quality, potentially causing offer generation to fail if the quality
is lower than the account's offer quality.

To address this issue, the proposed fix adjusts the offer generation process
to start with the XRP side first and always rounds it down. This results
in a smaller offer size, improving the offer's quality. Regardless if the offer
has XRP or not, the rounding is done so that the offer size is minimized.
This change still ensures the product invariant, as the other generated
side is the exact result of the swap-in or swap-out equations.

If a liquidity can be provided by both AMM and LOB offer on offer crossing
then AMM offer is generated so that it matches LOB offer quality. If LOB
offer quality is less than limit quality then generated AMM offer quality
is also less than limit quality and the offer doesn't cross. To address
this issue, if LOB quality is better than limit quality then use LOB
quality to generate AMM offer. Otherwise, don't use the quality to generate
AMM offer. In this case, limitOut() function in StrandFlow limits
the out amount to match strand's quality to limit quality and consume
maximum AMM liquidity.
2025-06-17 12:32:21 +09:00
Gregory Tsipenyuk
eaf963c800 Price Oracle: validate input parameters and extend test coverage: (#5013)
* Price Oracle: validate input parameters and extend test coverage:

Validate trim, time_threshold, document_id are valid
Int, UInt, or string convertible to UInt. Validate base_asset
and quote_asset are valid currency. Update error codes.
Extend Oracle and GetAggregatePrice unit-tests.
Denote unreachable coverage code.

* Set one-line LCOV_EXCL_LINE

* Move ledger_entry tests to LedgerRPC_test.cpp

* Add constants for "None"

* Fix LedgerRPC test

---------

Co-authored-by: Scott Determan <scott.determan@yahoo.com>
2025-06-17 12:32:21 +09:00
Michael Legleux
64f7981005 Add external directory to Conan recipe's exports (#5006) 2025-06-17 12:32:20 +09:00
John Freeman
94ed7c2f53 Add missing includes (#5011) 2025-06-17 12:32:20 +09:00
seelabs
17b0653da5 Remove flow assert: (#5009)
Rounding in the payment engine is causing an assert to sometimes fire
with "dust" amounts. This is causing issues when running debug builds of
rippled. This issue will be addressed, but the assert is no longer
serving its purpose.
2025-06-17 12:32:20 +09:00
seelabs
0f14487a48 fix amendment: AMM swap should honor invariants: (#5002)
The AMM has an invariant for swaps where:
new_balance_1*new_balance_2 >= old_balance_1*old_balance_2

Due to rounding, this invariant could sometimes be violated (although by
very small amounts).

This patch introduces an amendment `fixAMMRounding` that changes the
rounding to always favor the AMM. Doing this should maintain the
invariant.

Co-authored-by: Bronek Kozicki
Co-authored-by: thejohnfreeman
2025-06-17 12:32:19 +09:00
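
A minimal sketch of the swap invariant quoted above; the wide-integer check and the names are illustrative, not the actual AMM code:

```cpp
#include <cstdint>

// The invariant quoted above: after a swap the product of the pool balances
// must not drop below its previous value. Rounding payouts down (in the
// AMM's favour) keeps this true even when the exact result is not
// representable.
bool
swapKeepsInvariant(
    std::uint64_t oldIn,
    std::uint64_t oldOut,
    std::uint64_t newIn,
    std::uint64_t newOut)
{
    using u128 = unsigned __int128;  // wide math to avoid overflow (gcc/clang)
    return u128(newIn) * newOut >= u128(oldIn) * oldOut;
}
```
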
seelabs
36ca8e353e Add global access to the current ledger rules:
It can be difficult to make transaction breaking changes to low level
code because the low level code does not have access to a ledger and the
current activated amendments in that ledger (the "rules"). This patch
adds global access to the current ledger rules as a `std::optional`. If
the optional is not seated, then there is no active transaction.
2025-06-17 12:32:19 +09:00
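
A minimal sketch of the pattern described, with placeholder names (`currentRules`, `Rules::enabled`) rather than the actual rippled accessors; an unseated optional means no active transaction:

```cpp
#include <cstdint>
#include <optional>

// Placeholder rules object; a bitmask stands in for the amendment set.
struct Rules
{
    std::uint64_t enabledMask = 0;

    bool
    enabled(int amendment) const
    {
        return (enabledMask >> amendment) & 1;
    }
};

// Global, optionally-seated handle: unseated means no active transaction.
std::optional<Rules>&
currentRules()
{
    static std::optional<Rules> rules;
    return rules;
}

// Low-level code can now branch on an amendment without a ledger in hand.
bool
featureActive(int amendment)
{
    if (auto const& rules = currentRules())
        return rules->enabled(amendment);
    return false;
}
```
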
Snoppy
a6f292c10e chore: fix typos (#4958) 2025-06-17 12:32:18 +09:00
Ed Hennis
a48d41d3b8 test: Add RPC error checking support to unit tests (#4987) 2025-06-17 12:32:18 +09:00
John Freeman
d11fac3dfb Ignore more commits 2025-06-17 12:32:17 +09:00
John Freeman
6d6c3d3561 Address compiler warnings 2025-06-17 12:32:17 +09:00
John Freeman
de4962639a Add markers around source lists 2025-06-17 12:32:16 +09:00
John Freeman
c89b3c47eb Fix source lists 2025-06-17 12:32:16 +09:00
Pretty Printer
b98f5195dc Rewrite includes
$ find src/ripple/ src/test/ -type f -exec sed -i 's:include\s*["<]ripple/\(.*\)\.h\(pp\)\?[">]:include <ripple/\1.h>:' {} +
2025-06-17 12:32:16 +09:00
Pretty Printer
591b7100c3 Format formerly .hpp files 2025-06-17 12:32:15 +09:00
Pretty Printer
59af6a9435 Rename .hpp to .h 2025-06-17 12:32:15 +09:00
John Freeman
c002db28aa Simplify protobuf generation 2025-06-17 12:32:14 +09:00
Pretty Printer
1200ea2121 Consolidate external libraries 2025-06-17 12:32:14 +09:00
John Freeman
d9be15ca5b Remove unused files 2025-06-17 12:32:13 +09:00
Ed Hennis
fa89de4280 fix: Remove redundant STAmount conversion in test (#4996) 2025-06-17 12:32:13 +09:00
Scott Determan
0ba2d3215c fix: resolve database deadlock: (#4989)
The `rotateWithLock` function holds a lock while it calls a callback
function that's passed in by the caller. This is a problematic design
that needs to be used very carefully. In this case, at least one caller
passed in a callback that eventually relocks the mutex on the same
thread, causing UB (a deadlock was observed). The caller was from
SHAMapStoreImpl, and it called `clearCaches`. This `clearCaches` can
potentially call `fetchNodeObject`, which tried to relock the mutex.

This patch resolves the issue by changing the mutex type to a
`recursive_mutex`. Ideally, the code should be rewritten so it doesn't
hold the mutex during the callback and the mutex should be changed back
to a regular mutex.

Co-authored-by: Ed Hennis <ed@ripple.com>
2025-06-17 12:32:12 +09:00
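
A minimal sketch of the failure mode and the `recursive_mutex` workaround described above; `rotateWithLock` and `fetchNodeObject` are stand-ins, not the real SHAMapStore interfaces:

```cpp
#include <mutex>

// The callback invoked under the lock can call back into code that takes the
// same mutex on the same thread. With std::mutex that is undefined behaviour
// (observed as a deadlock); std::recursive_mutex lets the owning thread
// re-acquire, which is the stop-gap this patch applies.
struct Store
{
    std::recursive_mutex mutex_;
    int cachedObjects_ = 0;

    int
    fetchNodeObject()
    {
        std::lock_guard lock(mutex_);  // re-entrant acquire is safe here
        return cachedObjects_;
    }

    template <class Callback>
    void
    rotateWithLock(Callback&& callback)
    {
        std::lock_guard lock(mutex_);
        callback(*this);               // may end up calling fetchNodeObject()
    }
};
```
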
Michael Legleux
be56541d41 fix Conan component reference typo 2025-06-17 12:32:12 +09:00
Bronek Kozicki
682c8d0a05 Remove unused lambdas from MultiApiJson_test 2025-06-17 12:32:12 +09:00
Chenna Keshava B S
d99fc882da test: verify the rounding behavior of equal-asset AMM deposits (#4982)
* Specifically, test using tfLPToken flag
2025-06-17 12:32:11 +09:00
John Freeman
4c28672593 test: Add tests to raise coverage of AMM (#4971)
---------

Co-authored-by: Howard Hinnant <howard.hinnant@gmail.com>
Co-authored-by: Mark Travis <mtravis@ripple.com>
Co-authored-by: Bronek Kozicki <brok@incorrekt.com>
Co-authored-by: Mayukha Vadari <mvadari@gmail.com>
Co-authored-by: Chenna Keshava <ckeshavabs@gmail.com>
2025-06-17 12:32:11 +09:00
John Freeman
32857b6f0f test: Add tests to raise coverage of AMM (#4971)
---------

Co-authored-by: Howard Hinnant <howard.hinnant@gmail.com>
Co-authored-by: Mark Travis <mtravis@ripple.com>
Co-authored-by: Bronek Kozicki <brok@incorrekt.com>
Co-authored-by: Mayukha Vadari <mvadari@gmail.com>
Co-authored-by: Chenna Keshava <ckeshavabs@gmail.com>
2025-06-17 12:32:10 +09:00
Bronek Kozicki
f1b88faab1 test: Unit test for AMM offer overflow (#4986) 2025-06-17 12:32:10 +09:00
Mayukha Vadari
83196a3698 fix amendment to add PreviousTxnID/PreviousTxnLgrSequence (#4751)
This amendment, `fixPreviousTxnID`, adds `PreviousTxnID` and
`PreviousTxnLgrSequence` as fields to all ledger objects that did
not already have them included (`DirectoryNode`, `Amendments`,
`FeeSettings`, `NegativeUNL`, and `AMM`). This makes it much easier
to go through the history of these ledger objects.
2025-06-17 12:32:09 +09:00
Ed Hennis
144243235b chore: Default validator-keys-tool to master branch: (#4943)
* master is the default branch for that project. There's no point in
  using develop.
2025-06-17 12:32:09 +09:00
Scott Determan
d81f354927 fixXChainRewardRounding: round reward shares down: (#4933)
When calculating reward shares, the amount should always be rounded
down. If the `fixUniversalNumber` amendment is not active, this works
correctly. If it is active, then the amount is incorrectly rounded
up. This patch introduces an amendment so it will be rounded down.
2025-06-17 12:32:08 +09:00
Mark Travis
002a6c4e2d Don't reach consensus as quickly if no other proposals seen: (#4763)
This fixes a case where a peer can desync under a certain timing
circumstance--if it reaches a certain point in consensus before it receives
proposals. 

This was noticed under high transaction volumes. Namely, when we arrive at the
point of deciding whether consensus is reached after minimum establish phase
duration but before having received any proposals. This could be caused by
finishing the previous round slightly faster and/or having some delay in
receiving proposals. Existing behavior arrives at consensus immediately after
the minimum establish duration with no proposals. This causes us to desync
because we then close a non-validated ledger. The change in this PR causes us to
wait for a configured threshold before making the decision to arrive at
consensus with no proposals. This allows validators to catch up and for brief
delays in receiving proposals to be absorbed. There should be no drawback since,
with no proposals coming in, we needn't be in a huge rush to jump ahead.
2025-06-17 12:32:08 +09:00
Bronek Kozicki
76eb878363 Write improved forAllApiVersions used in NetworkOPs (#4833) 2025-06-17 12:32:08 +09:00
Bronek Kozicki
7bbc81fd50 Enforce no duplicate slots from incoming connections: (#4944)
We do not currently enforce that incoming peer connection does not have
remote_endpoint which is already used (either by incoming or outgoing
connection), hence already stored in slots_. If we happen to receive a
connection from such a duplicate remote_endpoint, it will eventually result in a
crash (when disconnecting) or weird behavior (when updating slot state), as a
result of an apparently matching remote_endpoint in slots_ being used by a
different connection.
2025-06-17 12:32:07 +09:00
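
A minimal sketch of the duplicate-endpoint check this change enforces, with the endpoint simplified to a string and the slot table to a set:

```cpp
#include <string>
#include <unordered_set>

// Track remote endpoints already bound to a slot and refuse a second
// (incoming) connection that reuses one.
class Slots
{
    std::unordered_set<std::string> inUse_;

public:
    // Returns false if the remote endpoint is already occupied by another
    // connection, in which case the new connection should be dropped.
    bool
    tryReserve(std::string const& remoteEndpoint)
    {
        return inUse_.insert(remoteEndpoint).second;
    }

    void
    release(std::string const& remoteEndpoint)
    {
        inUse_.erase(remoteEndpoint);
    }
};
```
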
Mayukha Vadari
4b0b5be588 fixEmptyDID: fix amendment to handle empty DID edge case: (#4950)
This amendment fixes an edge case where an empty DID object can be
created. It adds an additional check to ensure that DIDs are
non-empty when created, and returns a `tecEMPTY_DID` error if the DID
would be empty.
2025-06-17 12:32:07 +09:00
Ed Hennis
032998a293 test: Env unit test RPC errors return a unique result: (#4877)
* telENV_RPC_FAILED is a new code, reserved exclusively
  for unit tests when RPC fails. This will
  make those types of errors distinct and easier to test
  for when expected and/or diagnose when not.
* Output RPC command result when result is not expected.
2025-06-17 12:32:06 +09:00
Bronek Kozicki
3bf260a977 Upgrade to xxhash 0.8.2 as a Conan requirement, enable SIMD hashing (#4893)
We are currently using old version 0.6.2 of `xxhash`, as a verbatim copy and paste of its header file `xxhash.h`. Switch to the more recent version 0.8.2. Since this version is in Conan Center (and properly protects its ABI by keeping the state object incomplete), add it as a Conan requirement. Switch to the SIMD instructions (in the new `XXH3` family) supported by the new version.
2025-06-17 12:32:06 +09:00
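
A minimal sketch of the one-shot XXH3 API the 0.8.x upgrade makes available (assuming the Conan `xxhash` package puts `xxhash.h` on the include path):

```cpp
#include <cstdint>
#include <string_view>

#include <xxhash.h>

// One-shot XXH3 hash of a buffer; the XXH3 family picks SIMD code paths at
// runtime where available, which is the speedup referenced above.
std::uint64_t
hashBytes(std::string_view bytes)
{
    return XXH3_64bits(bytes.data(), bytes.size());
}
```
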
Michael Legleux
5fb16a38ad Install more public headers (#4940)
Fixes some mistakes in #4885
2025-06-17 12:32:05 +09:00
Scott Determan
c8b8d76744 fix: order book update variable swap: (#4890)
This is likely the result of a typo when the code was simplified.
2025-06-17 12:32:05 +09:00
John Freeman
96ef4a9421 Embed patched recipe for RocksDB 6.29.5 (#4947) 2025-06-17 12:32:04 +09:00
Gregory Tsipenyuk
93fe4c3dfd build: add STCurrency.h to xrpl_core to fix clio build (#4939) 2025-06-17 12:32:04 +09:00
Mayukha Vadari
fe5c971465 feat: add user version of feature RPC (#4781)
* uses same formatting as admin RPC
* hides potentially sensitive data
2025-06-17 12:32:04 +09:00
Scott Determan
e392b8316e Fast base58 codec: (#4327)
This algorithm is about an order of magnitude faster than the existing
algorithm (about 10x faster for encoding and about 15x faster for
decoding - including the double hash for the checksum). The algorithms
use gcc's int128 (fast MS version will have to wait, in the meantime MS
falls back to the slow code).
2025-06-17 12:32:03 +09:00
Chenna Keshava B S
3f971e44e8 Remove default ctors from SecretKey and PublicKey: (#4607)
* It is now an invariant that all constructed Public Keys are valid,
  non-empty and contain 33 bytes of data.
* Additionally, the memory footprint of the PublicKey class is reduced.
  The size_ data member is declared as static.
* Distinguish and identify the PublisherList retrieved from the local
  config file, versus the ones obtained from other validators.
* Fixes #2942
2025-06-17 12:32:03 +09:00
Gregory Tsipenyuk
7031e8cc87 fix compile error on gcc 13: (#4932)
The compilation fails due to an issue in the initializer list
of an optional argument, which holds a vector of pairs.
The code compiles correctly on earlier gcc versions, but fails on gcc 13.
2025-06-17 12:32:02 +09:00
Gregory Tsipenyuk
50f8ecc3e6 Price Oracle (XLS-47d): (#4789) (#4789)
Implement native support for Price Oracles.

 A Price Oracle is used to bring real-world data, such as market prices,
 onto the blockchain, enabling dApps to access and utilize information
 that resides outside the blockchain.

 Add Price Oracle functionality:
 - OracleSet: create or update the Oracle object
 - OracleDelete: delete the Oracle object

 To support this functionality add:
 - New RPC method, `get_aggregate_price`, to calculate aggregate price for a token pair of the specified oracles
 - `ltOracle` object

 The `ltOracle` object maintains:
 - Oracle Owner's account
 - Oracle's metadata
 - Up to ten token pairs with the scaled price
 - The last update time the token pairs were updated

 Add Oracle unit-tests
2025-06-17 12:32:02 +09:00
Mayukha Vadari
897ab2662c feat(rpc): add server_definitions method (#4703)
Add a new RPC / WS call for `server_definitions`, which returns an
SDK-compatible `definitions.json` (binary enum definitions) generated by
the server. This enables clients/libraries to dynamically work with new
fields and features, such as ones that may become available on side
chains. Clients query `server_definitions` on a node from the network
they want to work with, and immediately know how to speak that node's
binary "language", even if new features are added to it in the future
(as long as there are no new serialized types that the software doesn't
know how to serialize/deserialize).

Example:

```js
> {"command": "server_definitions"}
< {
    "result": {
        "FIELDS": [
            [
                "Generic",
                {
                    "isSerialized": false,
                    "isSigningField": false,
                    "isVLEncoded": false,
                    "nth": 0,
                    "type": "Unknown"
                }
            ],
            [
                "Invalid",
                {
                    "isSerialized": false,
                    "isSigningField": false,
                    "isVLEncoded": false,
                    "nth": -1,
                    "type": "Unknown"
                }
            ],
            [
                "ObjectEndMarker",
                {
                    "isSerialized": false,
                    "isSigningField": true,
                    "isVLEncoded": false,
                    "nth": 1,
                    "type": "STObject"
                }
            ],
        ...
```

Close #3657

---------

Co-authored-by: Richard Holland <richard.holland@starstone.co.nz>
2025-06-17 12:32:01 +09:00
Gregory Tsipenyuk
41f6adcf6c fix: improper handling of large synthetic AMM offers:
A large synthetic offer was not handled correctly in the payment engine.
This patch fixes that issue and introduces a new invariant check while
processing synthetic offers.
2025-06-17 12:18:08 +09:00
Ed Hennis
afe84c4fae test: guarantee proper lifetime for temporary Rules object: (#4917)
* Commit 01c37fe introduced a change to the STTx unit test where a local
  "defaultRules" object was created with a temporary inline "presets"
  value provided to the ctor. Rules::Impl stores a const ref to the
  presets provided to the ctor.  This particular call provided an inline
  temp variable, which goes out of scope as soon as the object is
  created. On Windows, attempting to use the presets (e.g. via the
  enabled() function) causes an access violation, which crashes the test
  run.
* An audit of the code indicates that all other instances of Rules use
  the Application's config.features list, which will have a sufficient
  lifetime.
2025-06-17 12:17:12 +09:00
Gregory Tsipenyuk
0e5450b987 fixInnerObjTemplate: set inner object template (#4906)
Add `STObject` constructor to explicitly set the inner object template.
This allows certain AMM transactions to apply in the same ledger:

There is no issue if the trading fee is greater than or equal to 0.01%.
If the trading fee is less than 0.01%, then:
- After AMM create, AMM transactions must wait for one ledger to close
  (3-5 seconds).
- After one ledger is validated, all AMM transactions succeed, as
  appropriate, except for AMMVote.
- The first AMMVote which votes for a 0 trading fee in a ledger will
  succeed. Subsequent AMMVote transactions which vote for a 0 trading
  fee will wait for the next ledger (3-5 seconds). This behavior repeats
  for each ledger.

This has no effect on the ultimate correctness of AMM. This amendment
will allow the transactions described above to succeed as expected, even
if the trading fee is 0 and the transactions are applied within one
ledger (block).
2025-06-17 12:17:12 +09:00
Chenna Keshava B S
ea81e5337d feat: allow port_grpc to be specified in [server] stanza (#4728)
Prior to this commit, `port_grpc` could not be added to the [server]
stanza. Instead of validating gRPC IP/Port/Protocol information in
ServerHandler, validate grpc port info in GRPCServer constructor. This
should not break backwards compatibility.

gRPC-related config info must be in a section (stanza) called
[port_grpc].

* Close #4015 - That was an alternate solution. It was decided that with
  relaxed validation, it is not necessary to rename port_grpc.
* Fix #4557
2025-06-17 12:17:11 +09:00
Michael Legleux
f979fc7d5a build: add headers needed in Conan package for libxrpl (#4885)
These headers are required in the xrpl Conan package in order for
xbridge witness server (xbwd) to build. This change to libxrpl may help
any dependents of libxrpl. This addition does not change any C++ code.
2025-06-17 12:17:11 +09:00
Shawn Xie
e20b2e83f8 fixNFTokenReserve: ensure NFT tx fails when reserve is not met (#4767)
Without this amendment, an NFTokenAcceptOffer transaction can succeed
even when the NFToken recipient does not have sufficient reserves for
the new NFTokenPage. This allowed accounts to accept NFT sell offers
without having a sufficient reserve. (However, there was no issue in
brokered mode or when a buy offer is involved.)

Instead, the transaction should fail with `tecINSUFFICIENT_RESERVE` as
appropriate. The `fixNFTokenReserve` amendment adds checks in the
NFTokenAcceptOffer transactor to check if the OwnerCount changed. If it
did, then it checks the new reserve requirement.

Fix #4679
2025-06-17 12:17:10 +09:00
tequ
4c3f41d176 bad merge: RPCCall_test, Transaction_test 2025-06-17 12:17:10 +09:00
Ed Hennis
f911735766 Fix cache bug introduced in 2.0.1
Partially cherry-picked from f419c18056
2025-06-17 12:17:09 +09:00
John Freeman
40fd4f05b4 fix(libxrpl): change library names in Conan recipe (#4831)
Use consistent platform-agnostic library names on all platforms.

Fix an issue that prevents dependents like validator-keys-tool from
linking to libxrpl on Windows.

It is bad practice to change the binary base name depending on the
platform. CMake already manipulates the base name into a final name that
fits the conventions of the platform. Linkers accept base names on the
command line and then look for conventional names on disk.
2025-06-17 12:17:09 +09:00
Bronek Kozicki
058bf2113d test: add unit test for redundant payment (#4860)
If the payee and payer are the same account, then the transaction fails
in preflight with temREDUNDANT.
2025-06-17 12:17:09 +09:00
Bronek Kozicki
8208bc66b2 test: improve code coverage reporting (#4849)
* Speed up the generation of coverage reports by using multiple cores.

* Add codecov step to coverage workflow.
2025-06-17 12:17:05 +09:00
Chenna Keshava B S
d187b1542b docs: update help message about unit test-suite pattern matching (#4846)
Update the "rippled --help" message for the "-u" parameter. This
documents the unit test name pattern matching rule implemented by #4634.

Fix #4800
2025-06-17 12:16:14 +09:00
Elliot Lee
61712e8498 docs: add Performance type to PR template (#4875) 2025-06-17 12:16:14 +09:00
Bronek Kozicki
f97b465d5e test: add DeliverMax to more JSONRPC tests (#4826)
Minor change in unit tests to improve testing scope.
2025-06-17 12:16:13 +09:00
John Freeman
501b6140ed fix: change default send_queue_limit to 500 (#4867)
Clients subscribed to `transactions` over WebSocket are being
disconnected because the traffic exceeds the default `send_queue_limit`
of 100.

This commit changes the default configuration, not the default in code.

Fix #4866
2025-06-17 12:16:13 +09:00
Ed Hennis
3d32ad9edb Improve lifetime management of ledger objects (SLEs) to prevent runaway memory usage: (#4822)
* Add logging for Application.cpp sweep()
* Improve lifetime management of ledger objects (`SLE`s)
* Only store SLE digest in CachedView; get SLEs from CachedSLEs
* Also force release of last ledger used for path finding if there are
  no path finding requests to process
* Count more ST objects (derive from `CountedObject`)
* Track CachedView stats in CountedObjects
* Rename the CachedView counters
* Fix the scope of the digest lookup lock

Before this patch, if you asked "is it caching?", it was always caching.
2025-06-17 12:16:13 +09:00
Ed Hennis
6dec5321f0 WebSocket should only call async_close once (#4848)
Prevent WebSocket connections from trying to close twice.

The issue only occurs in debug builds (assertions are disabled in
release builds, including published packages), and when the WebSocket
connections are unprivileged. The assert (and WRN log) occurs when a
client drives up the resource balance enough to be forcibly disconnected
while there are still messages pending to be sent.

Thanks to @lathanbritz for discovering this issue in #4822.
2025-06-17 12:16:12 +09:00
Hussein Badakhchani
1de8453d11 fix typo: 'of' instead of 'on' (#4821)
Co-authored-by: Hussein Badakhchani <hoos@alsoug.com>
2025-06-17 12:16:12 +09:00
Bronek Kozicki
f4c2f456b9 Workarounds for gcc-13 compatibility (#4817)
Workaround for compilation errors with gcc-13 and other compilers
relying on `libstdc++` version 13. This is temporary until actual fix is
available for us to use: https://github.com/boostorg/beast/pull/2682

Some boost.beast files (which we do use) rely on an old gcc-12 behaviour
where `#include <cstdint>` was not needed even though types from this
header were used. This was broken by a change in libstdc++ version 13:
https://gcc.gnu.org/gcc-13/porting_to.html#header-dep-changes

The necessary fix was implemented in boost.beast, however it is not yet
available. Until it is available, we can use this workaround to enable
compilation of `rippled` with gcc-13, clang-16, etc.
2025-06-17 12:16:11 +09:00
Bronek Kozicki
1639a9af3a APIv2: show DeliverMax in submit, submit_multisigned (#4827)
Show `DeliverMax` instead of `Amount` in output from `submit`,
`submit_multisigned`, `sign`, and `sign_for`.

Fix #4829
2025-06-17 12:16:11 +09:00
Bronek Kozicki
65ccb590e6 APIv2: consistently return ledger_index as integer (#4820)
For api_version 2, always return ledger_index as integer in JSON output.

api_version 1 retains prior behavior.
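
Illustrative fragments only (values are placeholders); the point is only that api_version 2 always uses an integer, whereas api_version 1 could return a string in some outputs.

```js
// Hedged sketch of the difference in responses carrying ledger_index:
< { ..., "ledger_index": "90123456", ... }   // api_version 1: sometimes a string
< { ..., "ledger_index": 90123456, ... }     // api_version 2: always an integer
```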
2025-06-17 12:16:10 +09:00
Bronek Kozicki
cfa9fa75e0 Fix 2.0 regression in tx method with binary output (#4812)
* Fix binary output from tx method

* Formatting fix

* Minor test improvement

* Minor test improvements
2025-06-17 12:16:10 +09:00
Bronek Kozicki
c0acaff911 Promote API version 2 to supported (#4803)
* Promote API version 2 to supported

* Switch command line to API version 1

* Fix LedgerRequestRPC test

* Remove obsolete tx_account method

This method is not implemented; the only parts removed are related to command-line parsing

* Fix RPCCall test

* Reduce diff size, small test improvements

* Minor fixes

* Support for the mold linker

* [fold] handle case where both mold and gold are installed

* [fold] Use first non-default linker

* Fix TransactionEntry_test

* Fix AccountTx_test

---------

Co-authored-by: seelabs <scott.determan@yahoo.com>
2025-06-17 12:16:10 +09:00
Scott Determan
c9af7f02cb Support for the mold linker (#4807) 2025-06-17 12:16:09 +09:00
Bronek Kozicki
6e39931b33 Unify JSON serialization format of transactions (#4775)
* Remove include <ranges>

* Formatting fix

* Output for subscriptions

* Output from sign, submit etc.

* Output from ledger

* Output from account_tx

* Output from transaction_entry

* Output from tx

* Store close_time_iso in API v2 output

* Add small APIv2 unit test for subscribe

* Add unit test for transaction_entry

* Add unit test for tx

* Remove inLedger from API version 2

* Set ledger_hash and ledger_index

* Move isValidated from RPCHelpers to LedgerMaster

* Store closeTime in LedgerFill

* Time formatting fix

* additional tests for Subscribe unit tests

* Improved comments

* Rename mInLedger to mLedgerIndex

* Minor fixes

* Set ledger_hash on closed ledger, even if not validated

* Update API-CHANGELOG.md

* Add ledger_hash, ledger_index to transaction_entry

* Fix validated and close_time_iso in account_tx

* Fix typos

* Improve getJson for Transaction and STTx

* Minor improvements

* Replace class enum JsonOptions with struct

We may consider turning this into a general-purpose template and using it elsewhere

* simplify the extraction of transactionID from Transaction object

* Remove obsolete comments

* Unconditionally set validated in account_tx output

* Minor improvements

* Minor fixes

---------

Co-authored-by: Chenna Keshava <ckeshavabs@gmail.com>
2025-06-17 12:16:09 +09:00
Scott Determan
38e5bc9a15 fix: check for valid public key in attestations (#4798) 2025-06-17 12:16:08 +09:00
pwang200
4265a50771 Fix unit test api_version to enable api_version 2 (#4785)
The command line API still uses `apiMaximumSupportedVersion`.
The unit test RPCs use `apiMinimumSupportedVersion` if unspecified.

Context:
- #4568
- #4552
2025-06-17 12:16:08 +09:00
Gregory Tsipenyuk
71094bdde7 fixFillOrKill: fix offer crossing with tfFillOrKill (#4694)
Introduce the `fixFillOrKill` amendment.

Fix an edge case occurring when an offer with `tfFillOrKill` set (but
without `tfSell` set) fails to cross an offer with a better rate. If
`tfFillOrKill` is set, then the owner must receive the full TakerPays.
Without this amendment, an offer fails if the entire `TakerGets` is not
spent. With this amendment, when `tfSell` is not set, the entire
`TakerGets` does not have to be spent.

For details about OfferCreate, see: https://xrpl.org/offercreate.html
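
For orientation, a minimal sketch of the kind of offer affected is shown below. The flag value is the standard tfFillOrKill bit; the account, issuer, and amounts are placeholders and are not taken from this commit.

```js
// Sketch only: an OfferCreate with tfFillOrKill (0x00040000 = 262144) set and
// tfSell unset; other required fields (Fee, Sequence, signing data) omitted.
// With fixFillOrKill, the offer succeeds if the full TakerPays is received,
// even when the entire TakerGets is not spent.
{
  "TransactionType": "OfferCreate",
  "Account": "rEXAMPLE...",
  "TakerPays": "1000000",   // 1 XRP in drops
  "TakerGets": { "currency": "USD", "issuer": "rISSUER...", "value": "1" },
  "Flags": 262144           // tfFillOrKill
}
```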

Fix #4684

---------

Co-authored-by: Scott Schurr <scott@ripple.com>
2025-06-17 12:16:07 +09:00
Bronek Kozicki
4846a7a721 fix: remove include <ranges> (#4788)
Remove dependency on `<ranges>` header, since it is not implemented by
all compilers which we want to support.

This code change only affects unit tests.

Resolve https://github.com/XRPLF/rippled/issues/4787
2025-06-17 12:16:07 +09:00
Bronek Kozicki
611282e562 APIv2: remove tx_history and ledger_header (#4759)
Remove `tx_history` and `ledger_header` methods from API version 2.

Update `RPC::Handler` to allow for methods (or method implementations)
to be API version specific. This partially resolves #4727. We can now
store multiple handlers with the same name, as long as they belong to
different (non-overlapping) API versions. This necessarily impacts the
handler lookup algorithm and its complexity; however, there is no
performance loss on x86_64 architecture, and only minimal performance
loss on arm64 (around 10ns). This design change gives us extra
flexibility in evolving the API in the future, including other parts of the API.

In API version 2, `tx_history` and `ledger_header` are no longer
recognised; if they are called, `rippled` will return error
`unknownCmd`.

Resolve #3638

Resolve #3539
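
A brief sketch of the version-dependent behaviour (responses abbreviated; only the error name comes from this change, the rest is placeholder):

```js
// api_version 2: the method is no longer recognised
> {"command": "tx_history", "start": 0, "api_version": 2}
< {"error": "unknownCmd", ...}
// api_version 1: still handled as before
> {"command": "tx_history", "start": 0, "api_version": 1}
```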
2025-06-17 12:16:06 +09:00
Mark Travis
ad99a218e6 docs: clarify definition of network health (#4729)
Update the documentation to describe network health with more nuance as
well as context about related factors.
2025-06-17 12:16:06 +09:00
tequ
cfa30146b5 fix temCode: bad merge 2025-06-17 12:16:06 +09:00
Bronek Kozicki
bb5e437aa4 APIv2(DeliverMax): add alias for Amount in Payment transactions (#4733)
Using the "Amount" field in Payment transactions can cause incorrect
interpretation. There continue to be problems from the use of this
field. "Amount" is rarely the correct field to use; instead,
"delivered_amount" (or "DeliveredAmount") should be used.

Rename the "Amount" field to "DeliverMax", a less misleading name. With
api_version: 2, remove the "Amount" field from Payment transactions.

- Input: "DeliverMax" in `tx_json` is an alias for "Amount"
  - sign
  - submit (in sign-and-submit mode)
  - submit_multisigned
  - sign_for
- Output: Add "DeliverMax" where transactions are provided by the API
  - ledger
  - tx
  - tx_history
  - account_tx
  - transaction_entry
  - subscribe (transactions stream)
- Output: Remove "Amount" from API version 2

Fix #3484

Fix #3902
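
A minimal sketch of the input-side alias, with placeholder addresses and amount; other required fields and signing data are omitted.

```js
// Sketch: on input to sign / submit / submit_multisigned / sign_for,
// "DeliverMax" in tx_json is accepted as an alias for "Amount".
// In api_version 2 output, Payment transactions show "DeliverMax" and omit "Amount".
{
  "TransactionType": "Payment",
  "Account": "rSENDER...",
  "Destination": "rDEST...",
  "DeliverMax": "1000000"   // alias for Amount (1 XRP in drops)
}
```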
2025-06-17 12:16:05 +09:00
Mayukha Vadari
006048fd31 DID: Decentralized identifiers (DIDs) (XLS-40): (#4636)
Implement native support for W3C DIDs.

Add a new ledger object: `DID`.

Add two new transactions:
1. `DIDSet`: create or update the `DID` object.
2. `DIDDelete`: delete the `DID` object.

This meets the requirements specified in the DID v1.0 specification
currently recommended by the W3C Credentials Community Group.

The DID format for the XRP Ledger conforms to W3C DID standards.
The objects can be created and owned by any XRPL account holder.
The transactions can be integrated by any service, wallet, or application.
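
For orientation, a DIDSet transaction might carry fields like the following; the field names (URI, Data, DIDDocument) come from the XLS-40 spec rather than this commit message, so treat them as assumptions, and the values are placeholder hex blobs.

```js
// Hedged sketch of a DIDSet per XLS-40; at least one of the data fields is
// expected, and other required fields (Fee, Sequence, signing data) are omitted.
{
  "TransactionType": "DIDSet",
  "Account": "rEXAMPLE...",
  "URI": "697066733A2F2F...",    // hex-encoded URI
  "DIDDocument": "646F63...",    // hex-encoded DID document
  "Data": "617474657374..."      // hex-encoded attestation data
}
```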
2025-06-17 12:16:05 +09:00
Scott Schurr
c478503463 refactor(peerfinder): use LogicError in PeerFinder::Logic (#4562)
It might be possible for the server code to indirect through certain
`end()` iterators. While a debug build would catch this problem with
`assert()`s, a release build would crash. If there are problems in this
area in the future, it is best to get a definitive indication of the
nature of the error regardless of whether it's a debug or release build.
To accomplish this, these `assert`s are converted into `LogicError`s
that will produce a reasonable error message when they fire.
2025-06-17 12:16:04 +09:00
Ed Hennis
1fbc16ec29 fix(PathRequest): remove incorrect assert (#4743)
The assert is saying that the only reason `pathFinder` would be null is
if the request was aborted (connection dropped, etc.). That's what
`continueCallback()` checks. But that is very clearly not true if you
look at `getPathFinder`, which calls `findPaths`, which can return false
for many reasons.

Fix #4744
2025-06-17 12:16:04 +09:00
Ed Hennis
ad82cd7529 docs(API-CHANGELOG): add XRPFees change (#4741)
* Add a new API Changelog section for release 1.10.
* Mark `jss::fee_ref` as deprecated.
* Fix a copy-paste error in one of the unit tests.
2025-06-17 12:16:03 +09:00
Florent
289d241c2c docs(rippled-example.cfg): add P2P link compression (#4753)
P2P link compression is a feature added in 1.6.0 by #3287.

https://xrpl.org/enable-link-compression.html

If the default changes in the future - for example, as currently
proposed by #4387 - the comment will be updated at that time.

Fix #4656
2025-06-17 12:16:03 +09:00
Denis Angell
b23a808d66 fixDisallowIncomingV1: allow issuers to authorize trust lines (#4721)
Context: The `DisallowIncoming` amendment provides an option to block
incoming trust lines from reaching your account. The
asfDisallowIncomingTrustline AccountSet Flag, when enabled, prevents any
incoming trust line from being created. However, it was too restrictive:
it would block an issuer from authorizing a trust line, even if the
trust line already exists. Consider:

1. Issuer sets asfRequireAuth on their account.
2. User sets asfDisallowIncomingTrustline on their account.
3. User submits tx to SetTrust to Issuer.

At this point, without `fixDisallowIncomingV1` active, the issuer would
not be able to authorize the trust line.

The `fixDisallowIncomingV1` amendment, once activated, allows an issuer
to authorize a trust line even after the user sets the
asfDisallowIncomingTrustline flag, as long as the trust line already
exists.
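
As a sketch of the authorization step, the issuer's TrustSet would look roughly like the fragment below; the flag value and field layout are standard TrustSet usage rather than anything specified in this commit, and the addresses are placeholders.

```js
// Sketch: issuer authorizes the user's existing trust line (tfSetfAuth = 0x00010000).
// With fixDisallowIncomingV1, this succeeds even though the user has
// asfDisallowIncomingTrustline set, because the trust line already exists.
{
  "TransactionType": "TrustSet",
  "Account": "rISSUER...",
  "LimitAmount": { "currency": "USD", "issuer": "rUSER...", "value": "0" },
  "Flags": 65536              // tfSetfAuth
}
```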
2025-06-17 12:16:03 +09:00
Scott Determan
033da04dd3 refactor: reduce boilerplate in applySteps: (#4710)
When a new transactor is added, there are several places in applySteps
that need to be modified. This patch refactors the code so only one
function needs to be modified.
2025-06-17 12:16:02 +09:00
Rome Reginelli
b974249149 refactor: reunify transaction common fields: (#4715)
Make transactions and pseudo-transactions share the same commonFields
again. This regularizes the code in a nice way.

While this technically allows pseudo-transactions to have a
TicketSequence field, pseudo-transactions are only ever constructed by
code paths that don't add such a field, so this is not a transaction
processing change. It may be possible to add a separate check to ensure
TicketSequence (and other fields that don't make sense on
pseudo-transactions) are never added to pseudo-transactions, but that
should not be necessary. (TicketSequence is not the only common field
that can not and does not appear in pseudo-transactions.) Note:
TicketSequence is already documented as a common field.

Related: #4637

Fix #4714
2025-06-17 12:16:02 +09:00
Chenna Keshava B S
b44542b24a docs(BUILD.md): require GCC 11 or higher (#4700)
Update minimum compiler requirement for building the codebase. The
feature "using enum" is required. This feature was introduced in C++20.

Updating the C++ compiler to version 11 or later fixes this error:

```
Building CXX object CMakeFiles/xrpl_core.dir/src/ripple/protocol/impl/STAmount.cpp.o
/build/ripple/binary/src/ripple/protocol/impl/STAmount.cpp: In lambda function:
/build/ripple/binary/src/ripple/protocol/impl/STAmount.cpp:1577:15: error: expected nested-name-specifier before 'enum'
 1577 |         using enum Number::rounding_mode;
      |               ^~~~
```

Fix #4693
2025-06-17 12:16:01 +09:00
Scott Determan
e8b707ce44 fix(XLS-38): disallow the same bridge on one chain: (#4720)
Modify the `XChainBridge` amendment.

Before this patch, two door accounts on the same chain could own
the same bridge spec (of course, one would have to be the issuer and one
would have to be the locker). While this is silly, it does not violate
any bridge invariants. However, on further review, if we allow this then
the `claim` transactions would need to change. Since it's hard to see a
use case for two doors to own the same bridge, this patch disallows
it. (The transaction will return tecDUPLICATE).
2025-06-17 12:16:01 +09:00
Scott Schurr
796ed21a5e fix: stabilize voting threshold for amendment majority mechanism (#4410)
Amendment "flapping" (an amendment repeatedly gaining and losing
majority) usually occurs when an amendment is on the verge of gaining
majority, and a validator not in favor of the amendment goes offline or
loses sync. This fix makes two changes:

1. The number of validators in the UNL determines the threshold required
   for an amendment to gain majority.
2. The AmendmentTable keeps a record of the most recent Amendment vote
   received from each trusted validator (and, with `trustChanged`, stays
   up-to-date when the set of trusted validators changes). If no
   validation arrives from a given validator, then the AmendmentTable
   assumes that the previously-received vote has not changed.

In other words, when missing an `STValidation` from a remote validator,
each server now uses the last vote seen. There is a 24 hour timeout for
recorded validator votes.

These changes do not require an amendment because they do not impact
transaction processing, but only the threshold at which each individual
validator decides to propose an EnableAmendment pseudo-transaction.

Fix #4350
2025-06-17 12:16:00 +09:00
Ed Hennis
267e9a6484 fix(build): uint is not defined on Windows platform (#4731)
Fix the Windows build by using `unsigned int` (instead of `uint`).

The error, introduced by #4618, looks something like:
  rpc\impl\RPCHelpers.h(299,5): error C2061: syntax error: identifier
  'uint' (compiling source file app\ledger\Ledger.cpp)
2025-06-17 12:16:00 +09:00
Nik Bougalis
1908323f74 Eliminate the built-in SNTP support (fixes #4207): (#4628) 2025-06-17 12:15:59 +09:00
John Freeman
ff4bf70a06 fix: accept all valid currency codes in API (#4566)
A few methods, including `book_offers`, take currency codes as
parameters. The XRPL doesn't care if the letters in those codes are
lowercase or uppercase, as long as they come from an alphabet defined
internally. rippled doesn't care either, when they are submitted in a
hex representation. When they are submitted in an ASCII string
representation, rippled, but not XRPL, is more restrictive, preventing
clients from interacting with some currencies already in the XRPL.

This change gets rippled out of the way and lets clients submit currency
codes in ASCII using the full alphabet.

Fixes #4112
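
As a hedged sketch (issuer and currency code are placeholders, and whether this particular code was previously rejected is my assumption), a request like the following is now accepted as long as the characters come from the internally defined alphabet:

```js
// Sketch: a book_offers request using a mixed-case ASCII currency code that
// rippled may previously have rejected even though the XRPL itself allows it.
> {
    "command": "book_offers",
    "taker_gets": { "currency": "XRP" },
    "taker_pays": { "currency": "uSd", "issuer": "rISSUER..." }
  }
```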
2025-06-17 12:15:59 +09:00
Bronek Kozicki
8a0dc118af chore: add .build to .gitignore (#4722)
Currently, the `BUILD.md` instructions suggest using `.build` as the
build directory, so this change helps to reduce confusion.

An alternative would be to instruct developers to add `/.build/` to
`.git/info/exclude` or to user-level `.gitignore` (although the latter
is very intrusive). However, it is being added here because it is a good
practice to have a sensible default that's consistent with the build
instructions.
2025-06-17 12:15:59 +09:00
ForwardSlashBack
26977482eb Fix typo in BUILD.md (#4718)
Co-authored-by: Chenna Keshava B S <21219765+ckeshava@users.noreply.github.com>
2025-06-17 12:15:58 +09:00
Peter Chen
32f74d6a8d APIv2(gateway_balances, channel_authorize): update errors (#4618)
gateway_balances
* When `account` does not exist in the ledger, return `actNotFound`
  * (Previously, a normal response was returned)
  * Fix #4290
* When required field(s) are missing, return `invalidParams`
  * (Previously, `invalidHotWallet` was incorrectly returned)
  * Fix #4548

channel_authorize
* When the specified `key_type` is invalid, return `badKeyType`
  * (Previously, `invalidParams` was returned)
  * Fix #4289

Since these are breaking changes, they apply only to API version 2.

Supersedes #4577
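
Abbreviated sketches of the api_version 2 behaviour; the error names come from this change, while the account, key type, and omitted fields are placeholders.

```js
// gateway_balances for an account that does not exist in the ledger
> {"command": "gateway_balances", "account": "rDOESNOTEXIST...", "api_version": 2}
< {"error": "actNotFound", ...}
// channel_authorize with an unrecognised key_type (other required fields omitted)
> {"command": "channel_authorize", "key_type": "not-a-key-type", "api_version": 2, ...}
< {"error": "badKeyType", ...}
```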
2025-06-17 12:15:58 +09:00
John Freeman
b75ad6d57e build: use Boost 1.82 and link Boost.Json (#4632)
Add Boost::json to the list of linked Boost libraries.

This seems to be required for macOS.
2025-06-17 12:15:57 +09:00
Chenna Keshava B S
82d037b1e2 docs(overlay): add URL of blog post and clarify wording (#4635) 2025-06-17 12:15:57 +09:00
Elliot Lee
7b2d2b5165 docs(RELEASENOTES): update 1.12.0 notes to match dev blog (#4691)
* Reorganize some changelog entries
* Add note about portable binaries
* Dev blog: https://xrpl.org/blog
2025-06-17 12:15:56 +09:00
John Freeman
d9f459b90e Update secp256k1 to 0.3.2 (#4653)
Copy the new code to `src/secp256k1` without changes:
`src/secp256k1` is identical to bitcoin-core/secp256k1@acf5c55 (v0.3.2).

We could consider changing to a Git submodule, though that would require
changes to the build instructions because we are not using submodules
anywhere else.
2025-06-17 12:15:56 +09:00
Chenna Keshava B S
fbaf5fd1fb docs: fix comment for LedgerHistory::fixIndex return value (#4574)
`LedgerHistory::fixIndex` returns `false` if a repair was performed.

Fix #4572
2025-06-17 12:15:55 +09:00
Ed Hennis
01f1994203 fix: remove unused variable causing clang 14 build errors (#4672)
Removed the unused variable `none` from `Writer.cpp` which was causing
build errors on clang version 14.
2025-06-17 12:15:55 +09:00
Elliot Lee
792115204b docs(BUILD): make it easier to find environment.md (#4507)
Make the instructions a bit easier to follow. Users on different
platforms can look for their platform name to find relevant information.
2025-06-17 12:15:55 +09:00
Michael Legleux
3333656987 Revert CMake changes (#4707)
This was likely put back when #4292 was rebased.
2025-06-17 12:15:54 +09:00
Scott Determan
b600c7d03f Change XChainBridge amendment to Supported::yes (#4709) 2025-06-17 12:15:54 +09:00
Scott Determan
fcdc64fa40 Fix Windows build by removing two unused declarations (#4708)
Remove the `verify` and `message` function declarations. The explicit
instantiation requests could not be completed because there were no
implementations for those two member functions. It is helpful that the
Microsoft (MSVC) compiler on Windows appears to be strict when it comes
to template instantiation.

This resolves the warning:

  XChainAttestations.h(450): warning C4661: 'bool
  ripple::XChainAttestationsBase<ripple::XChainClaimAttestation>::verify(void)
  const': no suitable definition provided for explicit template
  instantiation request
2025-06-17 12:15:53 +09:00
Ed Hennis
80ba15e31c Match unit tests on start of test name (#4634)
* For example, without this change, to run the TxQ tests, you must specify
  `--unittest=TxQ1,TxQ2` on the command line. With this change, you can use
  `--unittest=TxQ`, and both will be run.
* An exact match will prevent any further partial matching.
* This could have some side effects for different tests with a common
  name beginning. For example, NFToken, NFTokenBurn, NFTokenDir. This
  might be useful. If not, the shorter-named test(s) can be renamed. For
  example, NFToken to NFTokens.
* Split the NFToken, NFTokenBurn, and Offer test classes. Potentially speeds
  up parallel tests by a factor of 5.
2025-06-17 12:15:53 +09:00
Howard Hinnant
8a2643d06b Revert ThreadName due to problems on Windows (#4702)
* Revert "Remove CurrentThreadName.h from RippledCore.cmake (#4697)"

This reverts commit 3b5fcd587313f5ebc762bc21c6a4ec3e6c275e83.

* Revert "Introduce replacement for getting and setting thread name: (#4312)"

This reverts commit 36cb5f90e233f975eb3f80d819b2fbadab0a9387.
2025-06-17 12:15:52 +09:00
Scott Determan
5b671c2ad2 XChainBridge: Introduce sidechain support (XLS-38): (#4292)
A bridge connects two blockchains: a locking chain and an issuing
chain (also called a mainchain and a sidechain). Both are independent
ledgers, with their own validators and potentially their own custom
transactions. Importantly, there is a way to move assets from the
locking chain to the issuing chain and a way to return those assets from
the issuing chain back to the locking chain: the bridge. This key
operation is called a cross-chain transfer. A cross-chain transfer is
not a single transaction. It happens on two chains, requires multiple
transactions, and involves an additional server type called a "witness".

A bridge does not exchange assets between two ledgers. Instead, it locks
assets on one ledger (the "locking chain") and represents those assets
with wrapped assets on another chain (the "issuing chain"). A good model
to keep in mind is a box with an infinite supply of wrapped assets.
Putting an asset from the locking chain into the box will release a
wrapped asset onto the issuing chain. Putting a wrapped asset from the
issuing chain back into the box will release one of the existing locking
chain assets back onto the locking chain. There is no other way to get
assets into or out of the box. Note that there is no way for the box to
"run out of" wrapped assets - it has an infinite supply.

Co-authored-by: Gregory Popovitch <greg7mdp@gmail.com>
2025-06-17 12:15:52 +09:00
Peter Chen
eede8c4021 APIv2(account_tx, noripple_check): return error on invalid input (#4620)
For the `account_tx` and `noripple_check` methods, perform input
validation for optional parameters such as "binary", "forward",
"strict", "transactions". Previously, when these parameters had invalid
values (e.g. not a bool), no error would be returned. Now, it returns an
`invalidParams` error.

* This updates the behavior to match Clio
  (https://github.com/XRPLF/clio).
* Since this is potentially a breaking change, it only applies to
  requests specifying api_version: 2.
* Fix #4543.
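
For example (abbreviated sketch, placeholder account), a non-boolean value for one of these optional parameters now yields an error under api_version 2:

```js
// "binary" must be a boolean; with api_version 2 this now returns invalidParams
// (previously the invalid value produced no error).
> {"command": "account_tx", "account": "rEXAMPLE...", "binary": "yes", "api_version": 2}
< {"error": "invalidParams", ...}
```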
2025-06-17 12:15:52 +09:00
Howard Hinnant
5145adfe51 Remove CurrentThreadName.h from RippledCore.cmake (#4697)
(File was already removed from the source)
2025-06-17 12:15:51 +09:00
Mayukha Vadari
ef761b5a68 refactor: simplify TxFormats common fields logic (#4637)
Minor refactor to `TxFormats.cpp`:
- Rename `commonFields` to `pseudoCommonFields` (since it is the common fields
  that all pseudo-transactions need)
- Add a new static variable, `commonFields`, which represents all the common
  fields that non-pseudo transactions need (essentially everything that
  `pseudoCommonFields` contains, plus `sfTicketSequence`)

This makes it harder to accidentally leave out `sfTicketSequence` in a new
transaction.
2025-06-17 12:15:51 +09:00
Peter Chen
81418d03fc APIv2(ledger_entry): return invalidParams for bad parameters (#4630)
- Verify "check", used to retrieve a Check object, is a string.
- Verify "nft_page", used to retrieve an NFT Page, is a string.
- Verify "index", used to retrieve any type of ledger object by its
  unique ID, is a string.
- Verify "directory", used to retrieve a DirectoryNode, is a string or
  an object.

This change only impacts api_version 2 since it is a breaking change.

https://xrpl.org/ledger_entry.html

Fix #4550
2025-06-17 12:15:50 +09:00
Mark Pevec
6bb2e67743 docs(rippled-example.cfg): clarify ssl_cert vs ssl_chain (#4667)
Clarify usage of ssl_cert vs ssl_chain
2025-06-17 12:15:50 +09:00
Howard Hinnant
8d63153689 Introduce replacement for getting and setting thread name: (#4312)
* In namespace ripple, introduces get_name function that takes a
  std::thread::native_handle_type and returns a std::string.
* In namespace ripple, introduces get_name function that takes a
  std::thread or std::jthread and returns a std::string.
* In namespace ripple::this_thread, introduces get_name function
  that takes no parameters and returns the name of the current
  thread as a std::string.
* In namespace ripple::this_thread, introduces set_name function
  that takes a std::string_view and sets the name of the current
  thread.
* Intended to replace the beast utilities setCurrentThreadName
  and getCurrentThreadName.
2025-06-17 12:15:49 +09:00
tequ
852d5c4c7a fix bad merge (Remarks) 2025-06-16 00:19:12 +09:00
John Freeman
fbc976d0e2 Update dependencies (#4595)
Use the most recent versions in ConanCenter.

* Due to a bug in Clang 16, you may get a compile error:
  "call to 'async_teardown' is ambiguous"
  * A compiler flag workaround is documented in `BUILD.md`.
* At this time, building this with gcc 13 may require editing some files
  in `.conan/data`
  * A patch to support gcc13 may be added in a later PR.

---------

Co-authored-by: Scott Schurr <scott@ripple.com>
2025-06-15 23:18:30 +09:00
Gregory Tsipenyuk
1cba562b95 amm_info: fetch by amm account id; add AMM object entry (#4682)
- Update amm_info to fetch AMM by amm account id.
  - This is an additional way to retrieve an AMM object.
  - Alternatively, AMM can still be fetched by the asset pair as well.
- Add owner directory entry for AMM object.

Context:

- Add back the AMM object directory entry, which was deleted by #4626.
  - This fixes `account_objects` for `amm` type.
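
A sketch of the two lookup styles; the `amm_account` parameter name is my reading of this change and the asset fields follow the existing amm_info interface, with placeholder values throughout.

```js
// Fetch by the AMM's account ID (new way) ...
> {"command": "amm_info", "amm_account": "rAMMACCOUNT..."}
// ... or, as before, by the asset pair.
> {
    "command": "amm_info",
    "asset": { "currency": "XRP" },
    "asset2": { "currency": "USD", "issuer": "rISSUER..." }
  }
```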
2025-06-15 23:14:45 +09:00
Rome Reginelli
92a37a7c38 AMMBid: use tecINTERNAL for 'impossible' errors (#4674)
Modify two error cases in AMMBid transactor to return `tecINTERNAL` to
more clearly indicate that these errors should not be possible unless
operating in unforeseen circumstances. It likely indicates a bug.

The log level has been updated to `fatal()` since it indicates a
(potentially network-wide) unexpected condition when either of these
errors occurs.

Details:

The two specific transaction error cases changed are:

- `tecAMM_BALANCE` - In this case, this error (total LP Tokens
  outstanding is lower than the amount to be burned for the bid) is a
  subset of the case where the user doesn't have enough LP Tokens to pay
  for the bid. When this case is reached, the bidder's LP Tokens balance
  has already been checked first. The user's LP Tokens should always be
  a subset of total LP Tokens issued, so this should be impossible.
- `tecINSUFFICIENT_PAYMENT` - In this case, the amount to be refunded as
  a result of the bid is greater than the price paid for the auction
  slot. This should never occur unless something is wrong with the math
  for calculating the refund amount.

Both error cases in question are "defense in depth" measures meant to
protect against making things worse if the code has already reached a
state that is supposed to be impossible, likely due to a bug elsewhere.

Such "shouldn't ever occur" checks should use an error code that
categorically indicates a larger problem. This is similar to how
`tecINVARIANT_FAILED` is a warning sign that something went wrong and
likely could've been worse, but since there isn't an Invariant Check
applying here, `tecINTERNAL` is the appropriate error code.

This is "debatably" a transaction processing change since it could
hypothetically change how transactions are processed if there's a bug we
don't know about.
2025-06-15 23:14:44 +09:00
Ikko Eltociear Ashimine
2d3885486e refactor: fix typo in FeeUnits.h (#4644)
covert -> convert
2025-06-15 23:14:44 +09:00
Arihant Kothari
ec2db6ad19 test: add forAllApiVersions helper function (#4611)
Introduce a new variadic template helper function, `forAllApiVersions`,
that accepts callables to execute a set of functions over a range of
versions - from RPC::apiMinimumSupportedVersion to RPC::apiBetaVersion.
This avoids the duplication of code.

Context: #4552
2025-06-15 23:14:43 +09:00
Mayukha Vadari
9e4a513b24 add view updates for account SLEs (#4629)
Signed-off-by: Manoj Doshi <mdoshi@ripple.com>
2025-06-15 23:14:43 +09:00
John Freeman
537831ffaf Fix the package recipe for consumers of libxrpl (#4631)
- "Rename" the type `LedgerInfo` to `LedgerHeader` (but leave an alias
  for `LedgerInfo` to not yet disturb existing uses). Put it in its own
  public header, named after itself, so that it is more easily found.
- Move the type `Fees` and NFT serialization functions into public
  (installed) headers.
- Compile the XRPL and gRPC protocol buffers directly into `libxrpl` and
  install their headers. Fix the Conan recipe to correctly export these
  types.

Addresses change (2) in
https://github.com/XRPLF/XRPL-Standards/discussions/121.

For context: This work supports Clio's dependence on libxrpl. Clio is
just an example consumer. These changes should benefit all current and
future consumers.

---------

Co-authored-by: cyan317 <120398799+cindyyan317@users.noreply.github.com>
Signed-off-by: Manoj Doshi <mdoshi@ripple.com>
2025-06-15 23:14:43 +09:00
John Freeman
c86746dc48 Fix package definition for Conan (#4485)
Fix the libxrpl library target for consumers using Conan.

* Fix installation issues and update includes.
* Update requirements in the Conan package info.
  * libxrpl requires openssl::crypto.

(Conan is a software package manager for C++.)
2025-06-15 23:14:42 +09:00
Alphonse Noni Mousse
f45eb094e5 refactor: improve checking of path lengths (#4519)
Improve the checking of the path lengths during Payments. Previously,
the code that did the check of the payment path lengths was sometimes
executed, but without any effect. This changes it to only check when it
matters, and to not make unnecessary copies of the path vectors.

Signed-off-by: Manoj Doshi <mdoshi@ripple.com>
2025-06-15 23:09:45 +09:00
Alphonse N. Mousse
1ab04d7d56 refactor: use C++20 function std::popcount (#4389)
- Replace custom popcnt16 implementation with std::popcount from C++20
- Maintain compatibility with older compilers and MacOS by providing a
  conditional compilation fallback to __builtin_popcount and a lookup
  table method
- Move and inline related functions within SHAMapInnerNode for
  performance and readability

Signed-off-by: Manoj Doshi <mdoshi@ripple.com>
2025-06-15 23:09:44 +09:00
Gregory Tsipenyuk
e01f88eaa3 fix(AMM): prevent orphaned objects, inconsistent ledger state: (#4626)
When an AMM account is deleted, the owner directory entries must be
deleted in order to ensure consistent ledger state.

* When deleting AMM account:
  * Clean up AMM owner dir, linking AMM account and AMM object
  * Delete trust lines to AMM
* Disallow `CheckCreate` to AMM accounts
  * AMM cannot cash a check
* Constrain entries in AuthAccounts array to be accounts
  * AuthAccounts is an array of objects for the AMMBid transaction
* SetTrust (TrustSet): Allow on AMM only for LP tokens
  * If the destination is an AMM account and the trust line doesn't
    exist, then:
    * If the asset is not the AMM LP token, then fail the tx with
      `tecNO_PERMISSION`
    * If the AMM is in empty state, then fail the tx with `tecAMM_EMPTY`
      * This disallows trustlines to AMM in empty state
* Add AMMID to AMM root account
  * Remove lsfAMM flag and use sfAMMID instead
* Remove owner dir entry for ltAMM
* Add `AMMDelete` transaction type to handle amortized deletion
  * Limit number of trust lines to delete on final withdraw + AMMDelete
  * Put AMM in empty state when LPTokens is 0 upon final withdraw
  * Add `tfTwoAssetIfEmpty` deposit option in AMM empty state
  * Fail all AMM transactions in AMM empty state except special deposit
  * Add `tecINCOMPLETE` to indicate that not all AMM trust lines are
    deleted (i.e. partial deletion)
    * This is handled in Transactor similar to deleted offers
  * Fail AMMDelete with `tecINTERNAL` if AMM root account is nullptr
  * Don't validate for invalid asset pair in AMMDelete
* AMMWithdraw deletes AMM trust lines and AMM account/object only if the
  number of trust lines is less than max
  * Current `maxDeletableAMMTrustLines` = 512
  * Check no directory left after AMM trust lines are deleted
  * Enable partial trustline deletion in AMMWithdraw
* Add `tecAMM_NOT_EMPTY` to fail any transaction that expects an AMM in
  empty state
* Clawback considerations
  * Disallow clawback out of AMM account
  * Disallow AMM create if issuer can claw back

This patch applies to the AMM implementation in #4294.

Acknowledgements:
Richard Holland and Nik Bougalis for responsibly disclosing this issue.

Bug Bounties and Responsible Disclosures:
We welcome reviews of the project code and urge researchers to
responsibly disclose any issues they may find.

To report a bug, please send a detailed report to:

    bugs@xrpl.org

Signed-off-by: Manoj Doshi <mdoshi@ripple.com>
2025-06-15 23:09:42 +09:00
RichardAH
2b0ea30a1e feat: support Concise Transaction Identifier (CTID) (XLS-37) (#4418)
* add CTIM to tx rpc

---------

Co-authored-by: Rome Reginelli <mduo13@gmail.com>
Co-authored-by: Elliot Lee <github.public@intelliot.com>
Co-authored-by: Denis Angell <dangell@transia.co>
2025-06-15 23:09:04 +09:00
Shawn Xie
fdb9a01b72 Rename allowClawback flag to allowTrustLineClawback (#4617)
The reason for this change is explained in XRPLF/XRPL-Standards#119.

We want to be explicit that this flag is exclusively for trust lines. New token types (e.g. CFT) will not use this flag for clawback; instead, they will turn clawback on/off at the token level, which is more versatile.
2025-06-15 23:09:03 +09:00
Gregory Tsipenyuk
5ea0c2cbdc Introduce AMM support (XLS-30d): (#4294)
Add AMM functionality:
- InstanceCreate
- Deposit
- Withdraw
- Governance
- Auctioning
- payment engine integration

To support this functionality, add:
- New RPC method, `amm_info`, to fetch pool and LPT balances
- AMM Root Account
- trust line for each IOU AMM token
- trust line to track Liquidity Provider Tokens (LPT)
- `ltAMM` object

The `ltAMM` object tracks:
- fee votes
- auction slot bids
- AMM tokens pair
- total outstanding tokens balance
- `AMMID` to AMM `RootAccountID` mapping

Add new classes to facilitate AMM integration into the payment engine.
`BookStep` uses these classes to infer if AMM liquidity can be consumed.

The AMM formula implementation uses the new Number class added in #4192.
IOUAmount and STAmount use Number arithmetic.

Add AMM unit tests for all features.

AMM requires the following amendments:
- featureAMM
- fixUniversalNumber
- featureFlowCross

Notes:
- Current trading fee threshold is 1%
- AMM currency is generated by: 0x03 + 152 bits of sha256{cur1, cur2}
- Current max AMM Offers is 30

---------

Co-authored-by: Howard Hinnant <howard.hinnant@gmail.com>
2025-06-15 23:09:01 +09:00
Elliot Lee
6edded0e21 docs(CONTRIBUTING): push beta releases to release (#4589)
Sections that were rewrapped were wrapped to 72 characters, the same as
the recommendation for commit messages.
2025-06-15 23:07:39 +09:00
Arihant Kothari
bcbd592057 APIv2(ledger_entry): return "invalidParams" when fields missing (#4552)
Improve error handling for ledger_entry by returning an "invalidParams"
error when one or more request fields are specified incorrectly, or one
or more required fields are missing.

For example, if none of the following fields is provided, then the
API should return an invalidParams error:
* index, account_root, directory, offer, ripple_state, check, escrow,
  payment_channel, deposit_preauth, ticket

Prior to this commit, the API returned an "unknownOption" error instead.
Since the error was actually due to invalid parameters, rather than
unknown options, this error was misleading.

Since this is an API breaking change, the "invalidParams" error is only
returned for requests using api_version: 2 and above. To maintain
backward compatibility, the "unknownOption" error is still returned for
api_version: 1.

Related: #4573

Fix #4303
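
Abbreviated sketch of the version-dependent behaviour when no object selector is supplied (error names from this change; response shapes trimmed):

```js
// No selector (index, account_root, directory, ...) provided:
> {"command": "ledger_entry", "api_version": 2}
< {"error": "invalidParams", ...}     // new behaviour
> {"command": "ledger_entry", "api_version": 1}
< {"error": "unknownOption", ...}     // unchanged, for backward compatibility
```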
2025-06-15 23:07:39 +09:00
Chenna Keshava B S
9b73fcc4f2 refactor: change the return type of mulDiv to std::optional (#4243)
- Previously, mulDiv had `std::pair<bool, uint64_t>` as the output type.
  - This is an error-prone interface as it is easy to ignore when
    overflow occurs.
- Using a return type of `std::optional` should decrease the likelihood
  of ignoring overflow.
  - It also allows for the use of optional::value_or() as a way to
    explicitly recover from overflow.
- Include limits.h header file preprocessing directive in order to
  satisfy gcc's numeric_limits incomplete_type requirement.

Fix #3495

---------

Co-authored-by: John Freeman <jfreeman08@gmail.com>
2025-06-15 23:07:38 +09:00
Shawn Xie
332eca79c2 fix: add allowClawback flag for account_info (#4590)
* Update the `account_info` API so that the `allowClawback` flag is
  included in the response.
  * The proposed `Clawback` amendment added an `allowClawback` flag in
    the `AccountRoot` object.
  * In the API response, under `account_flags`, there is now an
    `allowClawback` field with a boolean (`true` or `false`) value.
  * For reference, the XLS-39 Clawback implementation can be found in
    #4553

Fix #4588
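
Abbreviated sketch of the response shape (placeholder account; most fields omitted, and only the `allowClawback` field comes from this change):

```js
> {"command": "account_info", "account": "rISSUER..."}
< {
    "result": {
      "account_data": { ... },
      "account_flags": {
        "allowClawback": true,    // newly surfaced flag
        ...
      }
    }
  }
```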
2025-06-15 23:07:38 +09:00
Peter Chen
1324531868 APIv2(account_info): handle invalid "signer_lists" value (#4585)
When requesting `account_info` with an invalid `signer_lists` value, the
API should return an "invalidParams" error.

`signer_lists` should have a value of type boolean. If it is not a
boolean, then it is invalid input. The response now indicates that.

* This is an API breaking change, so the change is only reflected for
  requests containing `"api_version": 2`
* Fix #4539
2025-06-15 23:07:37 +09:00
Chenna Keshava B S
0236214f95 fix: Update Handler::Condition enum values #3417 (#4239)
- Use powers of two to clearly indicate the bitmask
- Replace bitmask with explicit if-conditions to better indicate predicates

Change the enum values to be powers of two (fix #3417).

Implement the simplified condition evaluation, which removes the complex
bitwise AND (&) operator. This implements the second solution proposed in
Nik Bougalis's comment on #3417 ("Software does not distinguish between
different Conditions", version 1.5).

Tested by performing RPC calls with the commands server_info, server_state,
peers, and validation_info; these commands worked as expected.
2025-06-15 23:07:37 +09:00
Peter Chen
b3b6c2fe12 APIv2: add error messages for account_tx (#4571)
Certain inputs for the AccountTx method should return an error. In other
words, an invalid request from a user or client now results in an error
message.

Since this can change the response from the API, it is an API breaking
change. This commit maintains backward compatibility by keeping the
existing behavior for existing requests. When clients specify
"api_version": 2, they will be able to get the updated error messages.

Update unit tests to check the error based on the API version.

* Fix #4288
* Fix #4545
2025-06-15 23:07:36 +09:00
Ed Hennis
817cf79755 Fix build references to deleted ServerHandlerImp: (#4592)
* Commits 0b812cd (#4427) and 11e914f (#4516) conflict. The first added
  references to `ServerHandlerImp` in files outside of that class's
  organizational unit (which is technically incorrect). The second
  removed `ServerHandlerImp`, but was not up to date with develop. This
  results in the build failing.
* Fixes the build by changing references to `ServerHandlerImp` to
  the more correct `ServerHandler`.
2025-06-15 23:07:36 +09:00
Scott Schurr
344b6fd098 refactor: rename ServerHandlerImp to ServerHandler (#4516)
Rename `ServerHandlerImp` to `ServerHandler`. There was no other
ServerHandler definition despite the existence of a header suggesting
that there was.

This resolves a piece of historical confusion in the code, which was
identified during a code review.

The changes in the diff may look more extensive than they actually are.
The contents of `impl/ServerHandlerImp.h` were merged into
`ServerHandler.h`, making the latter file appear to have undergone
significant modifications. However, this is a non-breaking refactor that
only restructures code.
2025-06-15 23:07:36 +09:00
Chenna Keshava B S
bde80258d5 fix: remove deprecated fields in ledger method (#4244)
Remove deprecated fields from the ledger command:
* accepted
* hash (use ledger_hash instead)
* seqNum (use ledger_index instead)
* totalCoins (use total_coins instead)

Update SHAMapStore unit tests to use `jss:ledger_hash` instead of the
deprecated `hash` field.

Fix #3214
2025-06-15 23:07:35 +09:00
Denis Angell
004ab91023 refactor: replace hand-rolled lexicalCast (#4473)
Replace hand-rolled code with std::from_chars for better
maintainability.

The C++ std::from_chars function is intended to be as fast as possible,
so it is unlikely to be slower than the code it replaces. This change is
a net gain because it reduces the amount of hand-rolled code.
2025-06-15 23:07:35 +09:00
Shawn Xie
1a600b34b3 XLS-39 Clawback: (#4553)
Introduces:
* AccountRoot flag: lsfAllowClawback
* New Clawback transaction
* More info on clawback spec: https://github.com/XRPLF/XRPL-Standards/tree/master/XLS-39d-clawback
2025-06-15 23:07:32 +09:00
Howard Hinnant
72bf49baca refactor: remove TypedField's move constructor (#4567)
Apply a minor cleanup in `TypedField`:
* Remove a non-working and unused move constructor.
* Constrain the remaining constructor so that it is not generic enough to
  be used as a copy or move constructor.
2025-06-15 23:06:45 +09:00
drlongle
4796e07aaa Add RPC/WS ports to server_info (#4427)
Enhance the /crawl endpoint by publishing WebSocket/RPC ports in the
server_info response. The function processing requests to the /crawl
endpoint actually calls server_info internally, so this change enables a
server to advertise its WebSocket/RPC port(s) to peers via the /crawl
endpoint. `grpc` and `peer` ports are included as well.

The new `ports` array contains objects, each containing a `port` for the
listening port (number string), and an array `protocol` listing the
supported protocol(s).

This allows crawlers to build a richer topology without needing to
port-scan nodes. For non-admin users (including peers), the info about
*admin* ports is excluded.

Also increase test coverage for RPC ServerInfo.

Fix #2837.
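
An abbreviated sketch of the new array; the port numbers and protocol strings are placeholders, and the exact nesting in the response is not shown here.

```js
// Excerpt: each entry pairs a listening port, given as a number string,
// with the protocols served on it.
"ports": [
  { "port": "5005", "protocol": ["http"] },
  { "port": "6006", "protocol": ["ws", "wss"] },
  { "port": "51235", "protocol": ["peer"] }
]
```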
2025-06-15 23:06:45 +09:00
Scott Schurr
2086c47d1d fixReducedOffersV1: prevent offers from blocking order books: (#4512)
Curtail the occurrence of order books that are blocked by reduced offers
with the implementation of the fixReducedOffersV1 amendment.

This commit identifies three ways in which offers can be reduced:

1. A new offer can be partially crossed by existing offers, so the new
   offer is reduced when placed in the ledger.

2. An in-ledger offer can be partially crossed by a new offer in a
   transaction. So the in-ledger offer is reduced by the new offer.

3. An in-ledger offer may be under-funded. In this case the in-ledger
   offer is scaled down to match the available funds.

Reduced offers can block order books if the effective quality of the
reduced offer is worse than the quality of the original offer (from the
perspective of the taker). It turns out that, for small values, the
quality of the reduced offer can be significantly affected by the
rounding mode used during scaling computations.

This commit adjusts some rounding modes so that the quality of a reduced
offer is always at least as good (from the taker's perspective) as the
original offer.

The amendment is titled fixReducedOffersV1 because additional ways of
producing reduced offers may come to light. Therefore, there may be a
future need for a V2 amendment.
2025-06-15 23:06:42 +09:00
Ed Hennis
fb6734f589 Enable the Beta RPC API (v2) for all unit tests: (#4573)
* Enable api_version 2, which is currently in beta. It is expected to be
  marked stable by the next stable release.
* This does not change any defaults.
* The only existing tests changed were one that set the same flag, which
  was now redundant, and a couple that tested versioning explicitly.
2025-06-15 23:05:35 +09:00
Denis Angell
551fc9f9e1 Add Conan Building For Development (#432) 2025-05-14 14:00:20 +10:00
RichardAH
8173ef558c Sus pat (#507) 2025-05-01 17:23:56 +10:00
RichardAH
59b071f38c remove false positives from sus pat finder (#506) 2025-05-01 09:54:41 +10:00
Denis Angell
648062fd56 fix warnings (#505) 2025-04-30 11:51:58 +02:00
tequ
3fd3be1220 Suppress build warning introduced in Catalogue (#499) 2025-04-29 08:25:55 +10:00
tequ
79887d1181 Supress logs for Catalogue_test, Import_test (#495) 2025-04-24 17:46:09 +10:00
Denis Angell
18e0585249 fix: ledger_index (#498) 2025-04-24 16:45:01 +10:00
tequ
ec50bfd5ef Remove #ifndef DEBUG guards and exception handling wrappers (#496) 2025-04-24 16:38:14 +10:00
Denis Angell
b0e511ad92 patch remarks (#497) 2025-04-24 16:36:57 +10:00
tequ
9c332dd59f Add tests for SetRemarks (#491) 2025-04-18 09:34:44 +10:00
tequ
85004a2adc Update clang-format workflow (#490) 2025-04-17 16:16:59 +10:00
RichardAH
ca2509684c Remarks amendment (#301)
Co-authored-by: Denis Angell <dangell@transia.co>
2025-04-16 08:42:04 +10:00
tequ
1125d17798 fixRewardClaimFlags (#487) 2025-04-15 20:08:19 +10:00
tequ
f20977914b HookCanEmit (#392) 2025-04-15 13:32:35 +10:00
Niq Dudfield
a3419bfa58 feat(catalogue): add cli commands and fix file_size (#486)
* feat(catalogue): add cli commands and fix file_size

* feat(catalogue): add cli commands and fix file_size

* feat(catalogue): fix tests

* feat(catalogue): fix tests

* feat(catalogue): use formatBytesIEC

* feat: add file_size_estimated

* feat: add file_size_estimated

* feat: add file_size_estimated
2025-04-15 08:50:15 +10:00
tequ
110b6c1a7b Update sfcodes script (#479) 2025-04-10 09:44:31 +10:00
tequ
c169cb5091 Update CHooks build script (#465) 2025-04-09 20:22:34 +10:00
tequ
3ae474b29b Add xpop_slot test (#470) 2025-04-09 20:20:23 +10:00
tequ
cbfc790f9b feat: Run unittests in parallel with Github Actions (#483)
Implement parallel execution for unit tests using Github Actions to improve CI pipeline efficiency and reduce build times.
2025-04-04 19:32:47 +02:00
Niq Dudfield
5307bedc5c Fix missing includes in Catalogue.cpp for non-unity builds (#485) 2025-04-04 12:53:45 +10:00
Niq Dudfield
ebd2583e94 Fix using using Status with rpcError (#484) 2025-04-01 21:00:13 +10:00
RichardAH
515a77379b Catalogue (#443) 2025-04-01 16:47:48 +10:00
Niq Dudfield
c795abd6e2 Fix ServerDefinitions_test regression intro in #475 (#477) 2025-03-19 12:32:27 +10:00
Niq Dudfield
f66c0d6dc3 Prevent dangling reference in getHash() (#475)
Replace temporary uint256 with static variable when returning fallback hash
to avoid returning a const reference to a local temporary object.
2025-03-18 18:37:18 +10:00
Niq Dudfield
224684ab8f CI Release Builder (#455) 2025-03-11 13:19:28 +01:00
RichardAH
b3abe703f4 Touch Amendment (#294) 2025-03-06 08:25:42 +01:00
Niq Dudfield
0f13043db1 fix: remove negative rate test failing on MacOS (#452) 2025-03-03 13:12:13 +01:00
Denis Angell
ca908cb0f4 [fix] github runner (#451)
Co-authored-by: Niq Dudfield <ndudfield@gmail.com>
2025-03-03 09:55:51 +01:00
tequ
9a6d6e2930 Enhance shell script error handling and debugging on GHA (#447) 2025-02-24 10:33:21 +01:00
tequ
8fa3eb39ac Fix Error handling on build action (#412) 2025-02-24 18:16:21 +10:00
tequ
c293570248 Fixed not to use a large fixed range in the magic_enum. (#436) 2025-02-24 17:46:42 +10:00
Richard Holland
d1f1307b9f debug gh builds 2025-02-06 15:21:37 +11:00
Wietse Wind
938f462b1c Update build-in-docker.yml 2025-02-05 08:23:49 +01:00
Richard Holland
fcc36ebe09 Revert "debug account tx tests under release builder"
This reverts commit da8df63be3.

Revert "add strict filtering to account_tx api (#429)"

This reverts commit 317bd4bc6e.
2025-02-05 14:59:33 +11:00
Richard Holland
14128148f9 debug account tx tests under release builder 2025-02-04 17:02:17 +11:00
RichardAH
610bc2834f add strict filtering to account_tx api (#429) 2025-02-03 17:56:08 +10:00
RichardAH
d0ba15c964 fix20250131 (#428)
Co-authored-by: Denis Angell <dangell@transia.co>
2025-02-03 10:33:19 +10:00
Wietse Wind
c56d26046a Artifact v4 continue on error 2025-02-01 08:58:13 +01:00
Wietse Wind
9e10a77f23 Update artifact 2025-02-01 08:57:48 +01:00
Wietse Wind
4ca823c71c Update artifact 2025-02-01 08:57:25 +01:00
tequ
814d662bfd Fix HookResult(ExitType) when accept() is not called (#415) 2025-01-22 13:33:59 +10:00
tequ
f58487875a Update boost link for build-full.sh (#421) 2025-01-22 08:38:12 +10:00
tequ
8e446662b1 Add space to trace_float log (#424) 2025-01-22 08:34:33 +10:00
tequ
290f316760 add URITokenIssuer to account_flags for account_info (#404) 2024-12-16 16:10:01 +10:00
RichardAH
6431ee9af4 allow multiple datagram monitor endpoints (#408) 2024-12-14 08:44:40 +10:00
Richard Holland
74c1ee0712 fixReduceImport (#398)
Co-authored-by: Denis Angell <dangell@transia.co>
2024-12-11 13:29:37 +11:00
RichardAH
3dc72a26c4 Datagram monitor (#400)
Co-authored-by: Denis Angell <dangell@transia.co>
2024-12-11 13:29:30 +11:00
Denis Angell
320d6856e8 Fix: failing assert (#397) 2024-12-11 13:08:50 +11:00
Ekiserrepé
ffa64d2af1 Update README.md (#396)
Updated Xaman link.
2024-12-11 13:08:50 +11:00
Richard Holland
af60b3e117 UDP RPC (admin) support (#390) 2024-12-11 13:08:44 +11:00
RichardAH
b4f27f846c Limit xahau genesis to networks starting with 2133X (#395) 2024-11-23 21:19:09 +10:00
708 changed files with 55858 additions and 31061 deletions

View File

@@ -1,31 +0,0 @@
name: 'Configure ccache'
description: 'Sets up ccache with consistent configuration'
inputs:
max_size:
description: 'Maximum cache size'
required: false
default: '2G'
hash_dir:
description: 'Whether to include directory paths in hash'
required: false
default: 'true'
compiler_check:
description: 'How to check compiler for changes'
required: false
default: 'content'
runs:
using: 'composite'
steps:
- name: Configure ccache
shell: bash
run: |
mkdir -p ~/.ccache
export CONF_PATH="${CCACHE_CONFIGPATH:-${CCACHE_DIR:-$HOME/.ccache}/ccache.conf}"
mkdir -p $(dirname "$CONF_PATH")
echo "max_size = ${{ inputs.max_size }}" > "$CONF_PATH"
echo "hash_dir = ${{ inputs.hash_dir }}" >> "$CONF_PATH"
echo "compiler_check = ${{ inputs.compiler_check }}" >> "$CONF_PATH"
ccache -p # Print config for verification
ccache -z # Zero statistics before the build

View File

@@ -21,13 +21,17 @@ inputs:
required: false
default: ''
compiler-id:
description: 'Unique identifier for compiler/version combination used for cache keys'
description: 'Unique identifier: compiler-version-stdlib[-gccversion] (e.g. clang-14-libstdcxx-gcc11, gcc-13-libstdcxx)'
required: false
default: ''
cache_version:
description: 'Cache version for invalidation'
required: false
default: '1'
gha_cache_enabled:
description: 'Whether to use actions/cache (disable for self-hosted with volume mounts)'
required: false
default: 'true'
ccache_enabled:
description: 'Whether to use ccache'
required: false
@@ -36,6 +40,29 @@ inputs:
description: 'Main branch name for restore keys'
required: false
default: 'dev'
stdlib:
description: 'C++ standard library to use'
required: true
type: choice
options:
- libstdcxx
- libcxx
clang_gcc_toolchain:
description: 'GCC version to use for Clang toolchain (e.g. 11, 13)'
required: false
default: ''
ccache_max_size:
description: 'Maximum ccache size'
required: false
default: '2G'
ccache_hash_dir:
description: 'Whether to include directory paths in hash'
required: false
default: 'true'
ccache_compiler_check:
description: 'How to check compiler for changes'
required: false
default: 'content'
runs:
using: 'composite'
@@ -48,18 +75,37 @@ runs:
SAFE_BRANCH=$(echo "${{ github.ref_name }}" | tr -c 'a-zA-Z0-9_.-' '-')
echo "name=${SAFE_BRANCH}" >> $GITHUB_OUTPUT
- name: Restore ccache directory
- name: Configure ccache
if: inputs.ccache_enabled == 'true'
id: ccache-restore
uses: actions/cache/restore@v4
with:
path: ~/.ccache
key: ${{ runner.os }}-ccache-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ inputs.configuration }}-${{ steps.safe-branch.outputs.name }}
restore-keys: |
${{ runner.os }}-ccache-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ inputs.configuration }}-${{ inputs.main_branch }}
${{ runner.os }}-ccache-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ inputs.configuration }}-
${{ runner.os }}-ccache-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-
${{ runner.os }}-ccache-v${{ inputs.cache_version }}-
shell: bash
run: |
# Create cache directories
mkdir -p ~/.ccache-cache
# Keep config separate from cache_dir so configs aren't swapped when CCACHE_DIR changes between steps
mkdir -p ~/.config/ccache
export CCACHE_CONFIGPATH="$HOME/.config/ccache/ccache.conf"
echo "CCACHE_CONFIGPATH=$CCACHE_CONFIGPATH" >> $GITHUB_ENV
# Configure ccache settings AFTER cache restore (prevents stale cached config)
ccache --set-config=max_size=${{ inputs.ccache_max_size }}
ccache --set-config=hash_dir=${{ inputs.ccache_hash_dir }}
ccache --set-config=compiler_check=${{ inputs.ccache_compiler_check }}
ccache --set-config=cache_dir="$HOME/.ccache-cache"
echo "CCACHE_DIR=$HOME/.ccache-cache" >> $GITHUB_ENV
echo "📦 using ~/.ccache-cache as ccache cache directory"
# Print config for verification
echo "=== ccache configuration ==="
ccache -p
# Zero statistics before the build
ccache -z
- name: Configure project
shell: bash
@@ -75,36 +121,97 @@ runs:
if [ -n "${{ inputs.cxx }}" ]; then
export CXX="${{ inputs.cxx }}"
fi
# Configure ccache launcher args
CCACHE_ARGS=""
# Create wrapper toolchain that overlays ccache on top of Conan's toolchain
# This enables ccache for the main app build without affecting Conan dependency builds
if [ "${{ inputs.ccache_enabled }}" = "true" ]; then
CCACHE_ARGS="-DCMAKE_C_COMPILER_LAUNCHER=ccache -DCMAKE_CXX_COMPILER_LAUNCHER=ccache"
cat > wrapper_toolchain.cmake <<'EOF'
# Include Conan's generated toolchain first (sets compiler, flags, etc.)
# Note: CMAKE_CURRENT_LIST_DIR is the directory containing this wrapper (.build/)
include(${CMAKE_CURRENT_LIST_DIR}/build/generators/conan_toolchain.cmake)
# Overlay ccache configuration for main application build
# This does NOT affect Conan dependency builds (already completed)
set(CMAKE_C_COMPILER_LAUNCHER ccache CACHE STRING "C compiler launcher" FORCE)
set(CMAKE_CXX_COMPILER_LAUNCHER ccache CACHE STRING "C++ compiler launcher" FORCE)
EOF
TOOLCHAIN_FILE="wrapper_toolchain.cmake"
echo "✅ Created wrapper toolchain with ccache enabled"
else
TOOLCHAIN_FILE="build/generators/conan_toolchain.cmake"
echo " Using Conan toolchain directly (ccache disabled)"
fi
# Configure C++ standard library if specified
# libstdcxx used for clang-14/16 to work around missing lexicographical_compare_three_way in libc++
# libcxx can be used with clang-17+ which has full C++20 support
# Note: -stdlib flag is Clang-specific, GCC always uses libstdc++
CMAKE_CXX_FLAGS=""
if [[ "${{ inputs.cxx }}" == clang* ]]; then
# Only Clang needs the -stdlib flag
if [ "${{ inputs.stdlib }}" = "libstdcxx" ]; then
CMAKE_CXX_FLAGS="-stdlib=libstdc++"
elif [ "${{ inputs.stdlib }}" = "libcxx" ]; then
CMAKE_CXX_FLAGS="-stdlib=libc++"
fi
fi
# GCC always uses libstdc++ and doesn't need/support the -stdlib flag
# Configure GCC toolchain for Clang if specified
if [ -n "${{ inputs.clang_gcc_toolchain }}" ] && [[ "${{ inputs.cxx }}" == clang* ]]; then
# Extract Clang version from compiler executable name (e.g., clang++-14 -> 14)
clang_version=$(echo "${{ inputs.cxx }}" | grep -oE '[0-9]+$')
# Clang 16+ supports --gcc-install-dir (precise path specification)
# Clang <16 only has --gcc-toolchain (uses discovery heuristics)
if [ -n "$clang_version" ] && [ "$clang_version" -ge "16" ]; then
# Clang 16+ uses --gcc-install-dir (canonical, precise)
CMAKE_CXX_FLAGS="$CMAKE_CXX_FLAGS --gcc-install-dir=/usr/lib/gcc/x86_64-linux-gnu/${{ inputs.clang_gcc_toolchain }}"
else
# Clang 14-15 uses --gcc-toolchain (deprecated but necessary)
# Note: This still uses discovery, so we hide newer GCC versions in the workflow
CMAKE_CXX_FLAGS="$CMAKE_CXX_FLAGS --gcc-toolchain=/usr"
fi
fi
# Run CMake configure
# Note: conanfile.py hardcodes 'build/generators' as the output path.
# If we're in a 'build' folder, Conan detects this and uses just 'generators/'
# If we're in '.build' (non-standard), Conan adds the full 'build/generators/'
# So we get: .build/build/generators/ with our non-standard folder name
cmake .. \
-G "${{ inputs.generator }}" \
$CCACHE_ARGS \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${{ inputs.configuration }}
${CMAKE_CXX_FLAGS:+-DCMAKE_CXX_FLAGS="$CMAKE_CXX_FLAGS"} \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=${TOOLCHAIN_FILE} \
-DCMAKE_BUILD_TYPE=${{ inputs.configuration }} \
-Dtests=TRUE \
-Dxrpld=TRUE
- name: Show ccache config before build
if: inputs.ccache_enabled == 'true'
shell: bash
run: |
echo "=========================================="
echo "ccache configuration before build"
echo "=========================================="
ccache -p
echo ""
- name: Build project
shell: bash
run: |
cd ${{ inputs.build_dir }}
cmake --build . --config ${{ inputs.configuration }} --parallel $(nproc)
# Check for verbose build flag in commit message
VERBOSE_FLAG=""
if echo "${XAHAU_GA_COMMIT_MSG}" | grep -q '\[ci-ga-cmake-verbose\]'; then
echo "🔊 [ci-ga-cmake-verbose] detected - enabling verbose output"
VERBOSE_FLAG="-- -v"
fi
cmake --build . --config ${{ inputs.configuration }} --parallel $(nproc) ${VERBOSE_FLAG}
- name: Show ccache statistics
if: inputs.ccache_enabled == 'true'
shell: bash
run: ccache -s
- name: Save ccache directory
if: inputs.ccache_enabled == 'true'
uses: actions/cache/save@v4
with:
path: ~/.ccache
key: ${{ steps.ccache-restore.outputs.cache-primary-key }}

View File

@@ -0,0 +1,146 @@
name: 'Cache Restore'
description: 'Restores cache with optional clearing based on commit message tags'
inputs:
path:
description: 'A list of files, directories, and wildcard patterns to cache'
required: true
key:
description: 'An explicit key for restoring the cache'
required: true
restore-keys:
description: 'An ordered list of prefix-matched keys to use for restoring stale cache if no cache hit occurred for key'
required: false
default: ''
cache-type:
description: 'Type of cache (for logging purposes, e.g., "ccache-main", "Conan")'
required: false
default: 'cache'
fail-on-cache-miss:
description: 'Fail the workflow if cache entry is not found'
required: false
default: 'false'
lookup-only:
description: 'Check if a cache entry exists for the given input(s) without downloading it'
required: false
default: 'false'
additional-clear-keys:
description: 'Additional cache keys to clear (newline separated)'
required: false
default: ''
outputs:
cache-hit:
description: 'A boolean value to indicate an exact match was found for the primary key'
value: ${{ steps.restore-cache.outputs.cache-hit }}
cache-primary-key:
description: 'The key that was used to restore the cache'
value: ${{ steps.restore-cache.outputs.cache-primary-key }}
cache-matched-key:
description: 'The key that was used to restore the cache (exact or prefix match)'
value: ${{ steps.restore-cache.outputs.cache-matched-key }}
runs:
using: 'composite'
steps:
- name: Clear cache if requested via commit message
shell: bash
env:
GH_TOKEN: ${{ github.token }}
run: |
echo "=========================================="
echo "${{ inputs.cache-type }} cache clear tag detection"
echo "=========================================="
echo "Searching for: [ci-ga-clear-cache] or [ci-ga-clear-cache:*]"
echo ""
CACHE_KEY="${{ inputs.key }}"
# Extract search terms if present (e.g., "ccache" from "[ci-ga-clear-cache:ccache]")
SEARCH_TERMS=$(echo "${XAHAU_GA_COMMIT_MSG}" | grep -o '\[ci-ga-clear-cache:[^]]*\]' | sed 's/\[ci-ga-clear-cache://;s/\]//' || echo "")
SHOULD_CLEAR=false
if [ -n "${SEARCH_TERMS}" ]; then
# Search terms provided - check if THIS cache key matches ALL terms (AND logic)
echo "🔍 [ci-ga-clear-cache:${SEARCH_TERMS}] detected"
echo "Checking if cache key matches search terms..."
echo " Cache key: ${CACHE_KEY}"
echo " Search terms: ${SEARCH_TERMS}"
echo ""
MATCHES=true
for term in ${SEARCH_TERMS}; do
if ! echo "${CACHE_KEY}" | grep -q "${term}"; then
MATCHES=false
echo " ✗ Key does not contain '${term}'"
break
else
echo " ✓ Key contains '${term}'"
fi
done
if [ "${MATCHES}" = "true" ]; then
echo ""
echo "✅ Cache key matches all search terms - will clear cache"
SHOULD_CLEAR=true
else
echo ""
echo "⏭️ Cache key doesn't match search terms - skipping cache clear"
fi
elif echo "${XAHAU_GA_COMMIT_MSG}" | grep -q '\[ci-ga-clear-cache\]'; then
# No search terms - always clear this job's cache
echo "🗑️ [ci-ga-clear-cache] detected in commit message"
echo "Clearing ${{ inputs.cache-type }} cache for key: ${CACHE_KEY}"
SHOULD_CLEAR=true
fi
if [ "${SHOULD_CLEAR}" = "true" ]; then
echo ""
echo "Deleting ${{ inputs.cache-type }} caches via GitHub API..."
# Delete primary cache key
echo "Checking for cache: ${CACHE_KEY}"
if gh cache list --key "${CACHE_KEY}" --json key --jq '.[].key' | grep -q "${CACHE_KEY}"; then
echo " Deleting: ${CACHE_KEY}"
gh cache delete "${CACHE_KEY}" || true
echo " ✓ Deleted"
else
echo " Not found"
fi
# Delete additional keys if provided
if [ -n "${{ inputs.additional-clear-keys }}" ]; then
echo ""
echo "Checking additional keys..."
while IFS= read -r key; do
[ -z "${key}" ] && continue
echo "Checking for cache: ${key}"
if gh cache list --key "${key}" --json key --jq '.[].key' | grep -q "${key}"; then
echo " Deleting: ${key}"
gh cache delete "${key}" || true
echo " ✓ Deleted"
else
echo " Not found"
fi
done <<< "${{ inputs.additional-clear-keys }}"
fi
echo ""
echo "✅ ${{ inputs.cache-type }} cache cleared successfully"
echo "Build will proceed from scratch"
else
echo ""
echo " No ${{ inputs.cache-type }} cache clear requested"
fi
echo "=========================================="
- name: Restore cache
id: restore-cache
uses: actions/cache/restore@v4
with:
path: ${{ inputs.path }}
key: ${{ inputs.key }}
restore-keys: ${{ inputs.restore-keys }}
fail-on-cache-miss: ${{ inputs.fail-on-cache-miss }}
lookup-only: ${{ inputs.lookup-only }}
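To make the tag matching above concrete, here is a small, self-contained sketch (illustration only, not part of the action) of how a commit message such as `[ci-ga-clear-cache:ccache Linux]` is matched against a cache key. The commit message and cache key values below are invented examples.

```
#!/usr/bin/env bash
# Illustration only: mirrors the AND-matching logic of the cache-clear step.
# The commit message and cache key are made-up examples.
XAHAU_GA_COMMIT_MSG='fix build [ci-ga-clear-cache:ccache Linux]'
CACHE_KEY='Linux-ccache-v3-gcc-13-libstdcxx-Debug-dev'

# Extract the search terms from the tag, e.g. "ccache Linux"
SEARCH_TERMS=$(echo "${XAHAU_GA_COMMIT_MSG}" \
  | grep -o '\[ci-ga-clear-cache:[^]]*\]' \
  | sed 's/\[ci-ga-clear-cache://;s/\]//' || echo "")

MATCHES=true
for term in ${SEARCH_TERMS}; do
  # Every term must appear somewhere in the cache key (AND logic).
  echo "${CACHE_KEY}" | grep -q "${term}" || { MATCHES=false; break; }
done
echo "clear this cache? ${MATCHES}"   # prints: clear this cache? true
```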

View File

@@ -10,21 +10,46 @@ inputs:
required: false
default: '.build'
compiler-id:
description: 'Unique identifier for compiler/version combination used for cache keys'
description: 'Unique identifier: compiler-version-stdlib[-gccversion] (e.g. clang-14-libstdcxx-gcc11, gcc-13-libstdcxx)'
required: false
default: ''
cache_version:
description: 'Cache version for invalidation'
required: false
default: '1'
cache_enabled:
description: 'Whether to use caching'
required: false
default: 'true'
main_branch:
description: 'Main branch name for restore keys'
required: false
default: 'dev'
os:
description: 'Operating system (Linux, Macos)'
required: false
default: 'Linux'
arch:
description: 'Architecture (x86_64, armv8)'
required: false
default: 'x86_64'
compiler:
description: 'Compiler type (gcc, clang, apple-clang)'
required: true
compiler_version:
description: 'Compiler version (11, 13, 14, etc.)'
required: true
cc:
description: 'C compiler executable (gcc-13, clang-14, etc.), empty for macOS'
required: false
default: ''
cxx:
description: 'C++ compiler executable (g++-14, clang++-14, etc.), empty for macOS'
required: false
default: ''
stdlib:
description: 'C++ standard library for Conan configuration (note: also in compiler-id)'
required: true
type: choice
options:
- libstdcxx
- libcxx
outputs:
cache-hit:
@@ -34,36 +59,96 @@ outputs:
runs:
using: 'composite'
steps:
- name: Generate safe branch name
if: inputs.cache_enabled == 'true'
id: safe-branch
- name: Configure Conan cache paths
if: inputs.os == 'Linux'
shell: bash
run: |
SAFE_BRANCH=$(echo "${{ github.ref_name }}" | tr -c 'a-zA-Z0-9_.-' '-')
echo "name=${SAFE_BRANCH}" >> $GITHUB_OUTPUT
mkdir -p /.conan-cache/conan2 /.conan-cache/conan2_download /.conan-cache/conan2_sources
echo 'core.cache:storage_path=/.conan-cache/conan2' > ~/.conan2/global.conf
echo 'core.download:download_cache=/.conan-cache/conan2_download' >> ~/.conan2/global.conf
echo 'core.sources:download_cache=/.conan-cache/conan2_sources' >> ~/.conan2/global.conf
- name: Restore Conan cache
if: inputs.cache_enabled == 'true'
id: cache-restore-conan
uses: actions/cache/restore@v4
with:
path: |
~/.conan
~/.conan2
key: ${{ runner.os }}-conan-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ hashFiles('**/conanfile.txt', '**/conanfile.py') }}-${{ inputs.configuration }}
restore-keys: |
${{ runner.os }}-conan-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ hashFiles('**/conanfile.txt', '**/conanfile.py') }}-
${{ runner.os }}-conan-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-
${{ runner.os }}-conan-v${{ inputs.cache_version }}-
- name: Configure Conan cache paths
if: inputs.gha_cache_enabled == 'false'
shell: bash
# For self-hosted runners, register cache paths to be used as volumes
# This allows the cache to be shared between containers
run: |
mkdir -p /.conan-cache/conan2 /.conan-cache/conan2_download /.conan-cache/conan2_sources
echo 'core.cache:storage_path=/.conan-cache/conan2' > ~/.conan2/global.conf
echo 'core.download:download_cache=/.conan-cache/conan2_download' >> ~/.conan2/global.conf
echo 'core.sources:download_cache=/.conan-cache/conan2_sources' >> ~/.conan2/global.conf
- name: Configure Conan
shell: bash
run: |
# Create the default profile directory if it doesn't exist
mkdir -p ~/.conan2/profiles
# Determine the correct libcxx based on stdlib parameter
if [ "${{ inputs.stdlib }}" = "libcxx" ]; then
LIBCXX="libc++"
else
LIBCXX="libstdc++11"
fi
# Create profile with our specific settings
# This overwrites any cached profile to ensure fresh configuration
cat > ~/.conan2/profiles/default <<EOF
[settings]
arch=${{ inputs.arch }}
build_type=${{ inputs.configuration }}
compiler=${{ inputs.compiler }}
compiler.cppstd=20
compiler.libcxx=${LIBCXX}
compiler.version=${{ inputs.compiler_version }}
os=${{ inputs.os }}
EOF
# Add buildenv and conf sections for Linux (not needed for macOS)
if [ "${{ inputs.os }}" = "Linux" ] && [ -n "${{ inputs.cc }}" ]; then
cat >> ~/.conan2/profiles/default <<EOF
[buildenv]
CC=/usr/bin/${{ inputs.cc }}
CXX=/usr/bin/${{ inputs.cxx }}
[conf]
tools.build:compiler_executables={"c": "/usr/bin/${{ inputs.cc }}", "cpp": "/usr/bin/${{ inputs.cxx }}"}
EOF
fi
# Add macOS-specific conf if needed
if [ "${{ inputs.os }}" = "Macos" ]; then
cat >> ~/.conan2/profiles/default <<EOF
[conf]
# Workaround for gRPC with newer Apple Clang
tools.build:cxxflags=["-Wno-missing-template-arg-list-after-template-kw"]
EOF
fi
# Display profile for verification
conan profile show
- name: Export custom recipes
shell: bash
run: |
conan export external/snappy snappy/1.1.10@
conan export external/soci soci/4.0.3@
# Export snappy if not already exported
conan list snappy/1.1.10@xahaud/stable 2>/dev/null | (grep -q "not found" && exit 1 || exit 0) || \
conan export external/snappy --version 1.1.10 --user xahaud --channel stable
# Export soci if not already exported
conan list soci/4.0.3@xahaud/stable 2>/dev/null | (grep -q "not found" && exit 1 || exit 0) || \
conan export external/soci --version 4.0.3 --user xahaud --channel stable
# Export wasmedge if not already exported
conan list wasmedge/0.11.2@xahaud/stable 2>/dev/null | (grep -q "not found" && exit 1 || exit 0) || \
conan export external/wasmedge --version 0.11.2 --user xahaud --channel stable
- name: Install dependencies
shell: bash
env:
CONAN_REQUEST_TIMEOUT: 180 # Increase timeout to 3 minutes for slow mirrors
run: |
# Create build directory
mkdir -p ${{ inputs.build_dir }}
@@ -74,15 +159,4 @@ runs:
--output-folder . \
--build missing \
--settings build_type=${{ inputs.configuration }} \
-Dtests=TRUE \
-Dxrpld=TRUE \
..
- name: Save Conan cache
if: inputs.cache_enabled == 'true' && steps.cache-restore-conan.outputs.cache-hit != 'true'
uses: actions/cache/save@v4
with:
path: |
~/.conan
~/.conan2
key: ${{ steps.cache-restore-conan.outputs.cache-primary-key }}

View File

@@ -0,0 +1,74 @@
name: 'Get Commit Message'
description: 'Gets commit message for both push and pull_request events and sets XAHAU_GA_COMMIT_MSG env var'
inputs:
event-name:
description: 'The event name (push or pull_request)'
required: true
head-commit-message:
description: 'The head commit message (for push events)'
required: false
default: ''
pr-head-sha:
description: 'The PR head SHA (for pull_request events)'
required: false
default: ''
runs:
using: 'composite'
steps:
- name: Get commit message and set environment variable
shell: python
env:
GH_TOKEN: ${{ github.token }}
run: |
import json
import os
import secrets
import urllib.request
event_name = "${{ inputs.event-name }}"
pr_head_sha = "${{ inputs.pr-head-sha }}"
repository = "${{ github.repository }}"
print("==========================================")
print("Setting XAHAU_GA_COMMIT_MSG environment variable")
print("==========================================")
print(f"Event: {event_name}")
if event_name == 'push':
# For push events, use the input directly
message = """${{ inputs.head-commit-message }}"""
print("Source: workflow input (github.event.head_commit.message)")
elif event_name == 'pull_request' and pr_head_sha:
# For PR events, fetch via GitHub API
print(f"Source: GitHub API (fetching commit {pr_head_sha})")
try:
url = f"https://api.github.com/repos/{repository}/commits/{pr_head_sha}"
req = urllib.request.Request(url, headers={
"Accept": "application/vnd.github.v3+json",
"Authorization": f"Bearer {os.environ.get('GH_TOKEN', '')}"
})
with urllib.request.urlopen(req) as response:
data = json.load(response)
message = data["commit"]["message"]
except Exception as e:
print(f"Failed to fetch commit message: {e}")
message = ""
else:
message = ""
print(f"Warning: Unknown event type: {event_name}")
print(f"Commit message (first 100 chars): {message[:100]}")
# Write to GITHUB_ENV using heredoc with random delimiter (prevents injection attacks)
# See: https://securitylab.github.com/resources/github-actions-untrusted-input/
delimiter = f"EOF_{secrets.token_hex(16)}"
with open(os.environ['GITHUB_ENV'], 'a') as f:
f.write(f'XAHAU_GA_COMMIT_MSG<<{delimiter}\n')
f.write(message)
f.write(f'\n{delimiter}\n')
print(f"✓ XAHAU_GA_COMMIT_MSG set (available to all subsequent steps)")
print("==========================================")

View File

@@ -32,19 +32,9 @@ jobs:
clean: true
fetch-depth: 2 # Only get the last 2 commits, to avoid fetching all history
checkpatterns:
runs-on: [self-hosted, vanity]
needs: checkout
defaults:
run:
working-directory: ${{ needs.checkout.outputs.checkout_path }}
steps:
- name: Check for suspicious patterns
run: /bin/bash suspicious_patterns.sh
build:
runs-on: [self-hosted, vanity]
needs: [checkpatterns, checkout]
runs-on: [self-hosted, xahaud-build]
needs: [checkout]
defaults:
run:
working-directory: ${{ needs.checkout.outputs.checkout_path }}
@@ -84,7 +74,7 @@ jobs:
fi
tests:
runs-on: [self-hosted, vanity]
runs-on: [self-hosted, xahaud-build]
needs: [build, checkout]
defaults:
run:
@@ -94,7 +84,7 @@ jobs:
run: /bin/bash docker-unit-tests.sh
cleanup:
runs-on: [self-hosted, vanity]
runs-on: [self-hosted, xahaud-build]
needs: [tests, checkout]
if: always()
steps:

View File

@@ -1,20 +0,0 @@
name: checkpatterns
on: [push, pull_request]
jobs:
checkpatterns:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Check for suspicious patterns
run: |
if [ -f "suspicious_patterns.sh" ]; then
bash suspicious_patterns.sh
else
echo "Warning: suspicious_patterns.sh not found, skipping check"
# Still exit with success for compatibility with dependent jobs
exit 0
fi

View File

@@ -0,0 +1,24 @@
name: Guard Checker Build
on:
push:
pull_request:
jobs:
guard-checker-build:
strategy:
fail-fast: false
matrix:
include:
- run-on: ubuntu-latest
- run-on: macos-latest
runs-on: ${{ matrix.run-on }}
name: Guard Checker Build - ${{ matrix.run-on }}
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Build Guard Checker
run: |
cd include/xrpl/hook
make guard_checker

104
.github/workflows/instrumentation.yml vendored Normal file
View File

@@ -0,0 +1,104 @@
name: instrumentation
on:
pull_request:
push:
# If the branches list is ever changed, be sure to change it on all
# build/test jobs (nix, macos, windows, instrumentation)
branches:
# Always build the package branches
- develop
- release
- master
# Branches that opt-in to running
- 'ci/**'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
# NOTE we are not using dependencies built inside nix because nix is lagging
# with compiler versions. Instrumentation requires clang version 16 or later
instrumentation-build:
if: false # disable for now
env:
CLANG_RELEASE: 16
strategy:
fail-fast: false
runs-on: [self-hosted, heavy]
container: debian:bookworm
steps:
- name: install prerequisites
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update
apt-get install --yes --no-install-recommends \
clang-${CLANG_RELEASE} clang++-${CLANG_RELEASE} \
python3-pip python-is-python3 make cmake git wget
apt-get clean
update-alternatives --install \
/usr/bin/clang clang /usr/bin/clang-${CLANG_RELEASE} 100 \
--slave /usr/bin/clang++ clang++ /usr/bin/clang++-${CLANG_RELEASE}
update-alternatives --auto clang
pip install --no-cache --break-system-packages "conan<2"
- name: checkout
uses: actions/checkout@v4
- name: prepare environment
run: |
mkdir ${GITHUB_WORKSPACE}/.build
echo "SOURCE_DIR=$GITHUB_WORKSPACE" >> $GITHUB_ENV
echo "BUILD_DIR=$GITHUB_WORKSPACE/.build" >> $GITHUB_ENV
echo "CC=/usr/bin/clang" >> $GITHUB_ENV
echo "CXX=/usr/bin/clang++" >> $GITHUB_ENV
- name: configure Conan
run: |
conan profile new --detect default
conan profile update settings.compiler=clang default
conan profile update settings.compiler.version=${CLANG_RELEASE} default
conan profile update settings.compiler.libcxx=libstdc++11 default
conan profile update settings.compiler.cppstd=20 default
conan profile update options.rocksdb=False default
conan profile update \
'conf.tools.build:compiler_executables={"c": "/usr/bin/clang", "cpp": "/usr/bin/clang++"}' default
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_DISABLE_CONCEPTS"' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]' default
conan export external/snappy snappy/1.1.10@
conan export external/soci soci/4.0.3@
- name: build dependencies
run: |
cd ${BUILD_DIR}
conan install ${SOURCE_DIR} \
--output-folder ${BUILD_DIR} \
--install-folder ${BUILD_DIR} \
--build missing \
--settings build_type=Debug
- name: build with instrumentation
run: |
cd ${BUILD_DIR}
cmake -S ${SOURCE_DIR} -B ${BUILD_DIR} \
-Dvoidstar=ON \
-Dtests=ON \
-Dxrpld=ON \
-DCMAKE_BUILD_TYPE=Debug \
-DSECP256K1_BUILD_BENCHMARK=OFF \
-DSECP256K1_BUILD_TESTS=OFF \
-DSECP256K1_BUILD_EXHAUSTIVE_TESTS=OFF \
-DCMAKE_TOOLCHAIN_FILE=${BUILD_DIR}/build/generators/conan_toolchain.cmake
cmake --build . --parallel $(nproc)
- name: verify instrumentation enabled
run: |
cd ${BUILD_DIR}
./rippled --version | grep libvoidstar
- name: run unit tests
run: |
cd ${BUILD_DIR}
./rippled -u --unittest-jobs $(( $(nproc)/4 ))

View File

@@ -0,0 +1,50 @@
name: Verify Generated Hook Headers
on:
push:
pull_request:
jobs:
verify-generated-headers:
strategy:
fail-fast: false
matrix:
include:
- target: hook/error.h
generator: ./hook/generate_error.sh
- target: hook/extern.h
generator: ./hook/generate_extern.sh
- target: hook/sfcodes.h
generator: bash ./hook/generate_sfcodes.sh
- target: hook/tts.h
generator: ./hook/generate_tts.sh
runs-on: ubuntu-24.04
env:
CLANG_VERSION: 18
name: ${{ matrix.target }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Install clang-format
run: |
codename=$( lsb_release --codename --short )
sudo tee /etc/apt/sources.list.d/llvm.list >/dev/null <<EOF
deb http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
deb-src http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
EOF
wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add
sudo apt-get update
sudo apt-get install clang-format-${CLANG_VERSION}
sudo ln -sf /usr/bin/clang-format-${CLANG_VERSION} /usr/local/bin/clang-format
- name: Verify ${{ matrix.target }}
run: |
set -euo pipefail
chmod +x hook/generate_*.sh || true
tmp=$(mktemp)
trap 'rm -f "$tmp"' EXIT
${{ matrix.generator }} > "$tmp"
diff -u ${{ matrix.target }} "$tmp"

View File

@@ -4,7 +4,10 @@ on:
push:
branches: ["dev", "candidate", "release"]
pull_request:
branches: ["dev", "candidate", "release"]
branches: ["**"]
types: [opened, synchronize, reopened, labeled, unlabeled]
schedule:
- cron: '0 0 * * *'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -12,68 +15,57 @@ concurrency:
jobs:
test:
if: >
github.event_name != 'pull_request' ||
contains(fromJson('["dev","candidate","release"]'), github.base_ref) ||
contains(join(github.event.pull_request.labels.*.name, ','), 'ci-full-build')
strategy:
matrix:
generator:
- Ninja
configuration:
- Debug
runs-on: macos-15
runs-on: [self-hosted, macOS]
env:
build_dir: .build
# Bump this number to invalidate all caches globally.
CACHE_VERSION: 1
CACHE_VERSION: 3
MAIN_BRANCH_NAME: dev
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install Conan
- name: Add Homebrew to PATH
run: |
brew install conan@1
# Add Conan 1 to the PATH for this job
echo "$(brew --prefix conan@1)/bin" >> $GITHUB_PATH
echo "/opt/homebrew/bin" >> "$GITHUB_PATH"
echo "/opt/homebrew/sbin" >> "$GITHUB_PATH"
- name: Install Coreutils
run: |
brew install coreutils
echo "Num proc: $(nproc)"
- name: Install Ninja
if: matrix.generator == 'Ninja'
run: brew install ninja
- name: Install Python
run: |
if which python3 > /dev/null 2>&1; then
echo "Python 3 executable exists"
python3 --version
else
brew install python@3.12
fi
# Create 'python' symlink if it doesn't exist (for tools expecting 'python')
if ! which python > /dev/null 2>&1; then
sudo ln -sf $(which python3) /usr/local/bin/python
fi
- name: Install CMake
run: |
if which cmake > /dev/null 2>&1; then
echo "cmake executable exists"
cmake --version
else
brew install cmake
fi
- name: Install ccache
run: brew install ccache
- name: Configure ccache
uses: ./.github/actions/xahau-configure-ccache
# Instead of installing tools globally with brew, use mise so each runner
# gets its own isolated toolchain in its working directory.
- name: Setup toolchain (mise)
uses: jdx/mise-action@v3.6.1
with:
max_size: 2G
hash_dir: true
compiler_check: content
cache: false
install: true
mise_toml: |
[tools]
cmake = "3.25.3"
python = "3.12"
pipx = "latest"
conan = "2"
ninja = "latest"
ccache = "latest"
- name: Install tools via mise
run: |
mise install
mise reshim
echo "$HOME/.local/share/mise/shims" >> "$GITHUB_PATH"
- name: Check environment
run: |
@@ -87,11 +79,20 @@ jobs:
echo "---- Full Environment ----"
env
- name: Configure Conan
- name: Get commit message
id: get-commit-message
uses: ./.github/actions/xahau-ga-get-commit-message
with:
event-name: ${{ github.event_name }}
head-commit-message: ${{ github.event.head_commit.message }}
pr-head-sha: ${{ github.event.pull_request.head.sha }}
- name: Detect compiler version
id: detect-compiler
run: |
conan profile new default --detect || true # Ignore error if profile exists
conan profile update settings.compiler.cppstd=20 default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]' default
COMPILER_VERSION=$(clang --version | grep -oE 'version [0-9]+' | grep -oE '[0-9]+')
echo "compiler_version=${COMPILER_VERSION}" >> $GITHUB_OUTPUT
echo "Detected Apple Clang version: ${COMPILER_VERSION}"
- name: Install dependencies
uses: ./.github/actions/xahau-ga-dependencies
@@ -101,6 +102,11 @@ jobs:
compiler-id: clang
cache_version: ${{ env.CACHE_VERSION }}
main_branch: ${{ env.MAIN_BRANCH_NAME }}
os: Macos
arch: armv8
compiler: apple-clang
compiler_version: ${{ steps.detect-compiler.outputs.compiler_version }}
stdlib: libcxx
- name: Build
uses: ./.github/actions/xahau-ga-build
@@ -111,6 +117,8 @@ jobs:
compiler-id: clang
cache_version: ${{ env.CACHE_VERSION }}
main_branch: ${{ env.MAIN_BRANCH_NAME }}
stdlib: libcxx
ccache_max_size: '100G'
- name: Test
run: |

View File

@@ -4,31 +4,224 @@ on:
push:
branches: ["dev", "candidate", "release"]
pull_request:
branches: ["dev", "candidate", "release"]
branches: ["**"]
types: [opened, synchronize, reopened, labeled, unlabeled]
schedule:
- cron: '0 0 * * *'
workflow_dispatch:
inputs:
full_matrix:
description: "Force full matrix (6 configs)"
required: false
default: "false"
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
build-job:
runs-on: ubuntu-latest
matrix-setup:
if: >
github.event_name != 'pull_request' ||
contains(fromJson('["dev","candidate","release"]'), github.base_ref) ||
contains(join(github.event.pull_request.labels.*.name, ','), 'ci-full-build')
runs-on: [self-hosted, generic, 20.04]
container: python:3-slim
outputs:
matrix: ${{ steps.set-matrix.outputs.matrix }}
steps:
- name: escape double quotes
id: escape
shell: bash
env:
PR_TITLE: ${{ github.event.pull_request.title }}
run: |
ESCAPED_PR_TITLE="${PR_TITLE//\"/\\\"}"
echo "title=${ESCAPED_PR_TITLE}" >> "$GITHUB_OUTPUT"
- name: Generate build matrix
id: set-matrix
shell: python
env:
GH_TOKEN: ${{ github.token }}
run: |
import json
import os
import urllib.request
# Full matrix with all 6 compiler configurations
# Each configuration includes all parameters needed by the build job
full_matrix = [
{
"compiler_id": "gcc-11-libstdcxx",
"compiler": "gcc",
"cc": "gcc-11",
"cxx": "g++-11",
"compiler_version": 11,
"stdlib": "libstdcxx",
"configuration": "Debug"
},
{
"compiler_id": "gcc-13-libstdcxx",
"compiler": "gcc",
"cc": "gcc-13",
"cxx": "g++-13",
"compiler_version": 13,
"stdlib": "libstdcxx",
"configuration": "Debug"
},
{
"compiler_id": "clang-14-libstdcxx-gcc11",
"compiler": "clang",
"cc": "clang-14",
"cxx": "clang++-14",
"compiler_version": 14,
"stdlib": "libstdcxx",
"clang_gcc_toolchain": 11,
"configuration": "Debug"
},
{
"compiler_id": "clang-16-libstdcxx-gcc13",
"compiler": "clang",
"cc": "clang-16",
"cxx": "clang++-16",
"compiler_version": 16,
"stdlib": "libstdcxx",
"clang_gcc_toolchain": 13,
"configuration": "Debug"
},
{
"compiler_id": "clang-17-libcxx",
"compiler": "clang",
"cc": "clang-17",
"cxx": "clang++-17",
"compiler_version": 17,
"stdlib": "libcxx",
"configuration": "Debug"
},
{
# Clang 18 - testing if it's faster than Clang 17 with libc++
# Requires patching Conan v1 settings.yml to add version 18
"compiler_id": "clang-18-libcxx",
"compiler": "clang",
"cc": "clang-18",
"cxx": "clang++-18",
"compiler_version": 18,
"stdlib": "libcxx",
"configuration": "Debug"
}
]
# Minimal matrix for PRs and feature branches
minimal_matrix = [
full_matrix[1], # gcc-13 (middle-ground gcc)
full_matrix[2] # clang-14 (mature, stable clang)
]
# Determine which matrix to use based on the target branch
ref = "${{ github.ref }}"
base_ref = "${{ github.base_ref }}" # For PRs, this is the target branch
event_name = "${{ github.event_name }}"
pr_title = """${{ steps.escape.outputs.title }}"""
pr_labels = """${{ join(github.event.pull_request.labels.*.name, ',') }}"""
pr_head_sha = "${{ github.event.pull_request.head.sha }}"
# Get commit message - for PRs, fetch via API since head_commit.message is empty
if event_name == "pull_request" and pr_head_sha:
try:
url = f"https://api.github.com/repos/${{ github.repository }}/commits/{pr_head_sha}"
req = urllib.request.Request(url, headers={
"Accept": "application/vnd.github.v3+json",
"Authorization": f"Bearer {os.environ.get('GH_TOKEN', '')}"
})
with urllib.request.urlopen(req) as response:
data = json.load(response)
commit_message = data["commit"]["message"]
except Exception as e:
print(f"Failed to fetch commit message: {e}")
commit_message = ""
else:
commit_message = """${{ github.event.head_commit.message }}"""
# Debug logging
print(f"Event: {event_name}")
print(f"Ref: {ref}")
print(f"Base ref: {base_ref}")
print(f"PR head SHA: {pr_head_sha}")
print(f"PR title: {pr_title}")
print(f"PR labels: {pr_labels}")
print(f"Commit message: {commit_message}")
# Manual trigger input to force full matrix.
manual_full = "${{ github.event.inputs.full_matrix || 'false' }}" == "true"
# Label/manual overrides, while preserving existing title/commit behavior.
force_full = (
manual_full
or "[ci-nix-full-matrix]" in commit_message
or "[ci-nix-full-matrix]" in pr_title
or ("ci-full-build" in pr_labels and "ci-nix-full-matrix" in pr_labels)
)
force_min = (
"ci-full-build" in pr_labels
)
print(f"Force full matrix: {force_full}")
print(f"Force min matrix: {force_min}")
# Check if this is targeting a main branch
# For PRs: check base_ref (target branch)
# For pushes: check ref (current branch)
main_branches = ["refs/heads/dev", "refs/heads/release", "refs/heads/candidate"]
if force_full:
# Override: always use full matrix if forced by manual input or label.
use_full = True
elif force_min:
# Override: always use minimal matrix if ci-full-build label is present.
use_full = False
elif event_name == "pull_request":
# For PRs, base_ref is just the branch name (e.g., "dev", not "refs/heads/dev")
# Check if the PR targets release or candidate (more critical branches)
use_full = base_ref in ["release", "candidate"]
else:
# For pushes, ref is the full reference (e.g., "refs/heads/dev")
use_full = ref in main_branches
# Select the appropriate matrix
if use_full:
if force_full:
print(f"Using FULL matrix (6 configs) - forced by [ci-nix-full-matrix] tag")
else:
print(f"Using FULL matrix (6 configs) - targeting main branch")
matrix = full_matrix
else:
print(f"Using MINIMAL matrix (2 configs) - feature branch/PR")
matrix = minimal_matrix
# Output the matrix as JSON
output = json.dumps({"include": matrix})
with open(os.environ['GITHUB_OUTPUT'], 'a') as f:
f.write(f"matrix={output}\n")
build:
needs: matrix-setup
runs-on: [self-hosted, generic, 20.04]
container:
image: ubuntu:24.04
volumes:
- /home/runner/.conan-cache:/.conan-cache
- /home/runner/.ccache-cache:/github/home/.ccache-cache
defaults:
run:
shell: bash
outputs:
artifact_name: ${{ steps.set-artifact-name.outputs.artifact_name }}
strategy:
fail-fast: false
matrix:
compiler: [gcc]
configuration: [Debug]
include:
- compiler: gcc
cc: gcc-11
cxx: g++-11
compiler_id: gcc-11
matrix: ${{ fromJSON(needs.matrix-setup.outputs.matrix) }}
env:
build_dir: .build
# Bump this number to invalidate all caches globally.
CACHE_VERSION: 1
CACHE_VERSION: 3
MAIN_BRANCH_NAME: dev
steps:
- name: Checkout
@@ -36,36 +229,80 @@ jobs:
- name: Install build dependencies
run: |
sudo apt-get update
sudo apt-get install -y ninja-build ${{ matrix.cc }} ${{ matrix.cxx }} ccache
# Install specific Conan version needed
pip install --upgrade "conan<2"
apt-get update
apt-get install -y software-properties-common
add-apt-repository ppa:ubuntu-toolchain-r/test -y
apt-get update
apt-get install -y python3 python-is-python3 pipx
pipx ensurepath
apt-get install -y cmake ninja-build ${{ matrix.cc }} ${{ matrix.cxx }} ccache
apt-get install -y perl # for openssl build
apt-get install -y libsqlite3-dev # for xahaud build
- name: Configure ccache
uses: ./.github/actions/xahau-configure-ccache
with:
max_size: 2G
hash_dir: true
compiler_check: content
# Install the specific GCC version needed for Clang
if [ -n "${{ matrix.clang_gcc_toolchain }}" ]; then
echo "=== Installing GCC ${{ matrix.clang_gcc_toolchain }} for Clang ==="
apt-get install -y gcc-${{ matrix.clang_gcc_toolchain }} g++-${{ matrix.clang_gcc_toolchain }} libstdc++-${{ matrix.clang_gcc_toolchain }}-dev
- name: Configure Conan
run: |
conan profile new default --detect || true # Ignore error if profile exists
conan profile update settings.compiler.cppstd=20 default
conan profile update settings.compiler=${{ matrix.compiler }} default
conan profile update settings.compiler.libcxx=libstdc++11 default
conan profile update env.CC=/usr/bin/${{ matrix.cc }} default
conan profile update env.CXX=/usr/bin/${{ matrix.cxx }} default
conan profile update conf.tools.build:compiler_executables='{"c": "/usr/bin/${{ matrix.cc }}", "cpp": "/usr/bin/${{ matrix.cxx }}"}' default
# Set correct compiler version based on matrix.compiler
if [ "${{ matrix.compiler }}" = "gcc" ]; then
conan profile update settings.compiler.version=11 default
elif [ "${{ matrix.compiler }}" = "clang" ]; then
conan profile update settings.compiler.version=14 default
echo "=== GCC versions available after installation ==="
ls -la /usr/lib/gcc/x86_64-linux-gnu/ | grep -E "^d"
fi
# Display profile for verification
conan profile show default
# For Clang < 16 with --gcc-toolchain, hide newer GCC versions
# This is needed because --gcc-toolchain still picks the highest version
#
# THE GREAT GCC HIDING TRICK (for Clang < 16):
# Clang versions before 16 don't have --gcc-install-dir, only --gcc-toolchain
# which is deprecated and still uses discovery heuristics that ALWAYS pick
# the highest version number. So we play a sneaky game...
#
# We rename newer GCC versions to very low integers (1, 2, 3...) which makes
# Clang think they're ancient GCC versions. Since 11 > 3 > 2 > 1, Clang will
# pick GCC 11 over our renamed versions. It's dumb but it works!
#
# Example: GCC 12→1, GCC 13→2, GCC 14→3, so Clang picks 11 (highest number)
if [ -n "${{ matrix.clang_gcc_toolchain }}" ] && [ "${{ matrix.compiler_version }}" -lt "16" ]; then
echo "=== Hiding GCC versions newer than ${{ matrix.clang_gcc_toolchain }} for Clang < 16 ==="
target_version=${{ matrix.clang_gcc_toolchain }}
counter=1 # Start with 1 - these will be seen as "GCC version 1, 2, 3" etc
for dir in /usr/lib/gcc/x86_64-linux-gnu/*/; do
if [ -d "$dir" ]; then
version=$(basename "$dir")
# Check if version is numeric and greater than target
if [[ "$version" =~ ^[0-9]+$ ]] && [ "$version" -gt "$target_version" ]; then
echo "Hiding GCC $version -> renaming to $counter (will be seen as GCC version $counter)"
# Safety check: ensure target doesn't already exist
if [ ! -e "/usr/lib/gcc/x86_64-linux-gnu/$counter" ]; then
mv "$dir" "/usr/lib/gcc/x86_64-linux-gnu/$counter"
else
echo "ERROR: Cannot rename GCC $version - /usr/lib/gcc/x86_64-linux-gnu/$counter already exists"
exit 1
fi
counter=$((counter + 1))
fi
fi
done
fi
# Verify what Clang will use
if [ -n "${{ matrix.clang_gcc_toolchain }}" ]; then
echo "=== Verifying GCC toolchain selection ==="
echo "Available GCC versions:"
ls -la /usr/lib/gcc/x86_64-linux-gnu/ | grep -E "^d.*[0-9]+$" || true
echo ""
echo "Clang's detected GCC installation:"
${{ matrix.cxx }} -v -E -x c++ /dev/null -o /dev/null 2>&1 | grep "Found candidate GCC installation" || true
fi
# Install libc++ dev packages if using libc++ (not needed for libstdc++)
if [ "${{ matrix.stdlib }}" = "libcxx" ]; then
apt-get install -y libc++-${{ matrix.compiler_version }}-dev libc++abi-${{ matrix.compiler_version }}-dev
fi
# Install Conan 2
pipx install "conan>=2.0,<3"
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Check environment
run: |
@@ -79,6 +316,14 @@ jobs:
echo "---- Full Environment ----"
env
- name: Get commit message
id: get-commit-message
uses: ./.github/actions/xahau-ga-get-commit-message
with:
event-name: ${{ github.event_name }}
head-commit-message: ${{ github.event.head_commit.message }}
pr-head-sha: ${{ github.event.pull_request.head.sha }}
- name: Install dependencies
uses: ./.github/actions/xahau-ga-dependencies
with:
@@ -87,6 +332,12 @@ jobs:
compiler-id: ${{ matrix.compiler_id }}
cache_version: ${{ env.CACHE_VERSION }}
main_branch: ${{ env.MAIN_BRANCH_NAME }}
compiler: ${{ matrix.compiler }}
compiler_version: ${{ matrix.compiler_version }}
cc: ${{ matrix.cc }}
cxx: ${{ matrix.cxx }}
stdlib: ${{ matrix.stdlib }}
gha_cache_enabled: 'false' # Disable caching for self hosted runner
- name: Build
uses: ./.github/actions/xahau-ga-build
@@ -99,6 +350,9 @@ jobs:
compiler-id: ${{ matrix.compiler_id }}
cache_version: ${{ env.CACHE_VERSION }}
main_branch: ${{ env.MAIN_BRANCH_NAME }}
stdlib: ${{ matrix.stdlib }}
clang_gcc_toolchain: ${{ matrix.clang_gcc_toolchain || '' }}
ccache_max_size: '100G'
- name: Set artifact name
id: set-artifact-name
@@ -120,4 +374,4 @@ jobs:
else
echo "Error: rippled executable not found in ${{ env.build_dir }}"
exit 1
fi
fi

6
.gitignore vendored
View File

@@ -24,6 +24,11 @@ bin/project-cache.jam
build/docker
# Ignore release builder files
.env
release-build
cmake-*.tar.gz
# Ignore object files.
*.o
build
@@ -71,6 +76,7 @@ docs/html_doc
# Xcode
.DS_Store
*/build/*
!/docs/build/
*.pbxuser
!default.pbxuser
*.mode1v3

View File

@@ -8,6 +8,6 @@
"editor.semanticHighlighting.enabled": true,
"editor.tabSize": 4,
"editor.defaultFormatter": "xaver.clang-format",
"editor.formatOnSave": false
"editor.formatOnSave": true
}
}

View File

@@ -83,28 +83,52 @@ The [commandline](https://xrpl.org/docs/references/http-websocket-apis/api-conve
The `network_id` field was added in the `server_info` response in version 1.5.0 (2019), but it is not returned in [reporting mode](https://xrpl.org/rippled-server-modes.html#reporting-mode). However, use of reporting mode is now discouraged, in favor of using [Clio](https://github.com/XRPLF/clio) instead.
## XRP Ledger server version 2.2.0
## XRP Ledger server version 2.4.0
The following is a non-breaking addition to the API.
As of 2025-01-28, version 2.4.0 is in development. You can use a pre-release version by building from source or [using the `nightly` package](https://xrpl.org/docs/infrastructure/installation/install-rippled-on-ubuntu).
- The `feature` method now has a non-admin mode for users. (It was previously only available to admin connections.) The method returns an updated list of amendments, including their names and other information. ([#4781](https://github.com/XRPLF/rippled/pull/4781))
### Additions and bugfixes in 2.4.0
### Breaking change in 2.3
- `ledger_entry`: `state` is added as an alias for `ripple_state`.
- `ledger_entry`: Enables case-insensitive filtering by canonical name in addition to case-sensitive filtering by RPC name.
- `validators`: Added new field `validator_list_threshold` in response.
- `simulate`: A new RPC that executes a [dry run of a transaction submission](https://github.com/XRPLF/XRPL-Standards/tree/master/XLS-0069d-simulate#2-rpc-simulate) (see the example request after this list)
- Signing methods autofill fees better and properly handle transactions that don't have a base fee, and will also autofill the `NetworkID` field.
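As an illustration only (not part of the changelog), a `simulate` request against a locally running server might look like the sketch below. It assumes the JSON-RPC port 5005 used in the example configuration; the field names follow the XLS-0069d draft, and the addresses and amount are placeholders.

```
# Hypothetical example: dry-run a simple Payment via the simulate RPC.
# Assumes a local server listening on JSON-RPC port 5005; the accounts
# and amount are placeholders, not real values.
curl -s -X POST http://localhost:5005/ \
  -H 'Content-Type: application/json' \
  -d '{
    "method": "simulate",
    "params": [{
      "tx_json": {
        "TransactionType": "Payment",
        "Account": "rEXAMPLEsenderxxxxxxxxxxxxxxxxxxxx",
        "Destination": "rEXAMPLEdestinationxxxxxxxxxxxxxxx",
        "Amount": "1000000"
      }
    }]
  }'
```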
## XRP Ledger server version 2.3.0
[Version 2.3.0](https://github.com/XRPLF/rippled/releases/tag/2.3.0) was released on Nov 25, 2024.
### Breaking changes in 2.3.0
- `book_changes`: If the requested ledger version is not available on this node, a `ledgerNotFound` error is returned and the node does not attempt to acquire the ledger from the p2p network (as with other non-admin RPCs).
Admins can still attempt to retrieve old ledgers with the `ledger_request` RPC.
### Addition in 2.3
### Additions and bugfixes in 2.3.0
- `book_changes`: Returns a `validated` field in its response, which was missing in prior versions.
The following additions are non-breaking (because they are purely additive).
## XRP Ledger server version 2.2.0
[Version 2.2.0](https://github.com/XRPLF/rippled/releases/tag/2.2.0) was released on Jun 5, 2024. The following additions are non-breaking (because they are purely additive):
- The `feature` method now has a non-admin mode for users. (It was previously only available to admin connections.) The method returns an updated list of amendments, including their names and other information. ([#4781](https://github.com/XRPLF/rippled/pull/4781))
## XRP Ledger server version 2.0.0
[Version 2.0.0](https://github.com/XRPLF/rippled/releases/tag/2.0.0) was released on Jan 9, 2024. The following additions are non-breaking (because they are purely additive):
- `server_definitions`: A new RPC that generates a `definitions.json`-like output that can be used in XRPL libraries.
- In `Payment` transactions, `DeliverMax` has been added. This is a replacement for the `Amount` field, which should not be used. Typically, the `delivered_amount` (in transaction metadata) should be used. To ease the transition, `DeliverMax` is present regardless of API version, since adding a field is non-breaking.
- API version 2 has been moved from beta to supported, meaning that it is generally available (regardless of the `beta_rpc_api` setting).
## XRP Ledger server version 2.2.0
The following is a non-breaking addition to the API.
- The `feature` method now has a non-admin mode for users. (It was previously only available to admin connections.) The method returns an updated list of amendments, including their names and other information. ([#4781](https://github.com/XRPLF/rippled/pull/4781))
## XRP Ledger server version 1.12.0
[Version 1.12.0](https://github.com/XRPLF/rippled/releases/tag/1.12.0) was released on Sep 6, 2023. The following additions are non-breaking (because they are purely additive).

320
BUILD.md
View File

@@ -1,7 +1,7 @@
| :warning: **WARNING** :warning:
|---|
| These instructions assume you have a C++ development environment ready with Git, Python, Conan, CMake, and a C++ compiler. For help setting one up on Linux, macOS, or Windows, [see this guide](./docs/build/environment.md). |
> These instructions assume you have a C++ development environment ready
> with Git, Python, Conan, CMake, and a C++ compiler. For help setting one up
> on Linux, macOS, or Windows, see [our guide](./docs/build/environment.md).
>
> These instructions also assume a basic familiarity with Conan and CMake.
> If you are unfamiliar with Conan,
> you can read our [crash course](./docs/build/conan.md)
@@ -10,7 +10,7 @@
## Branches
For a stable release, choose the `master` branch or one of the [tagged
releases](https://github.com/ripple/rippled/releases).
releases](https://github.com/Xahau/xahaud/releases).
```
git checkout master
@@ -29,179 +29,125 @@ branch.
git checkout develop
```
## Minimum Requirements
See [System Requirements](https://xrpl.org/system-requirements.html).
Building rippled generally requires git, Python, Conan, CMake, and a C++ compiler. Some guidance on setting up such a [C++ development environment can be found here](./docs/build/environment.md).
- [Python 3.7](https://www.python.org/downloads/)
- [Conan 1.60](https://conan.io/downloads.html)[^1]
- [Conan 2.x](https://conan.io/downloads)
- [CMake 3.16](https://cmake.org/download/)
[^1]: It is possible to build with Conan 2.x,
but the instructions are significantly different,
which is why we are not recommending it yet.
Notably, the `conan profile update` command is removed in 2.x.
Profiles must be edited by hand.
`rippled` is written in the C++20 dialect and includes the `<concepts>` header.
`xahaud` is written in the C++20 dialect and includes the `<concepts>` header.
The [minimum compiler versions][2] required are:
| Compiler | Version |
|-------------|---------|
| GCC | 11 |
| GCC | 10 |
| Clang | 13 |
| Apple Clang | 13.1.6 |
| MSVC | 19.23 |
### Linux
We don't recommend Windows for `xahaud` production at this time. As of
November 2025, Ubuntu has the highest level of quality assurance, testing,
and support.
The Ubuntu operating system has received the highest level of
quality assurance, testing, and support.
Windows developers should use Visual Studio 2019. `xahaud` isn't
compatible with [Boost](https://www.boost.org/) 1.78 or 1.79, and Conan
can't build earlier Boost versions.
Here are [sample instructions for setting up a C++ development environment on Linux](./docs/build/environment.md#linux).
**Note:** 32-bit Windows development isn't supported.
### Mac
Many rippled engineers use macOS for development.
Here are [sample instructions for setting up a C++ development environment on macOS](./docs/build/environment.md#macos).
### Windows
Windows is not recommended for production use at this time.
- Additionally, 32-bit Windows development is not supported.
[Boost]: https://www.boost.org/
## Steps
### Set Up Conan
After you have a [C++ development environment](./docs/build/environment.md) ready with Git, Python, Conan, CMake, and a C++ compiler, you may need to set up your Conan profile.
These instructions assume a basic familiarity with Conan and CMake.
If you are unfamiliar with Conan, then please read [this crash course](./docs/build/conan.md) or the official [Getting Started][3] walkthrough.
You'll need at least one Conan profile:
1. (Optional) If you've never used Conan, use autodetect to set up a default profile.
```
conan profile new default --detect
conan profile detect --force
```
Update the compiler settings:
2. Update the compiler settings.
For Conan 2, you can edit the profile directly at `~/.conan2/profiles/default`,
or use the Conan CLI. Ensure C++20 is set:
```
conan profile update settings.compiler.cppstd=20 default
conan profile show
```
Configure Conan (1.x only) to use recipe revisions:
Look for `compiler.cppstd=20` in the output. If it's not set, edit the profile:
```
conan config set general.revisions_enabled=1
# Edit ~/.conan2/profiles/default and ensure these settings exist:
[settings]
compiler.cppstd=20
```
**Linux** developers will commonly have a default Conan [profile][] that compiles
with GCC and links with libstdc++.
If you are linking with libstdc++ (see profile setting `compiler.libcxx`),
then you will need to choose the `libstdc++11` ABI:
Linux developers will commonly have a default Conan [profile][] that compiles
with GCC and links with libstdc++.
If you are linking with libstdc++ (see profile setting `compiler.libcxx`),
then you will need to choose the `libstdc++11` ABI.
```
conan profile update settings.compiler.libcxx=libstdc++11 default
# In ~/.conan2/profiles/default, ensure:
[settings]
compiler.libcxx=libstdc++11
```
On Windows, you should use the x64 native build tools.
An easy way to do that is to run the shortcut "x64 Native Tools Command
Prompt" for the version of Visual Studio that you have installed.
Ensure inter-operability between `boost::string_view` and `std::string_view` types:
```
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_BEAST_USE_STD_STRING_VIEW"]' default
conan profile update 'env.CXXFLAGS="-DBOOST_BEAST_USE_STD_STRING_VIEW"' default
```
If you have other flags in the `conf.tools.build` or `env.CXXFLAGS` sections, make sure to retain the existing flags and append the new ones. You can check them with:
```
conan profile show default
```
**Windows** developers may need to use the x64 native build tools.
An easy way to do that is to run the shortcut "x64 Native Tools Command
Prompt" for the version of Visual Studio that you have installed.
Windows developers must also build `rippled` and its dependencies for the x64
architecture:
Windows developers must also build `xahaud` and its dependencies for the x64
architecture.
```
conan profile update settings.arch=x86_64 default
# In ~/.conan2/profiles/default, ensure:
[settings]
arch=x86_64
```
### Multiple compilers
When `/usr/bin/g++` exists on a platform, it is the default cpp compiler. This
default works for some users.
However, if this compiler cannot build rippled or its dependencies, then you can
install another compiler and set Conan and CMake to use it.
Update the `conf.tools.build:compiler_executables` setting in order to set the correct variables (`CMAKE_<LANG>_COMPILER`) in the
generated CMake toolchain file.
For example, on Ubuntu 20, you may have gcc at `/usr/bin/gcc` and g++ at `/usr/bin/g++`; if that is the case, you can select those compilers with:
```
conan profile update 'conf.tools.build:compiler_executables={"c": "/usr/bin/gcc", "cpp": "/usr/bin/g++"}' default
```
Replace `/usr/bin/gcc` and `/usr/bin/g++` with paths to the desired compilers.
It should choose the compiler for dependencies as well,
but not all of them have a Conan recipe that respects this setting (yet).
For the rest, you can set these environment variables.
Replace `<path>` with paths to the desired compilers:
- `conan profile update env.CC=<path> default`
- `conan profile update env.CXX=<path> default`
Export our [Conan recipe for Snappy](./external/snappy).
It does not explicitly link the C++ standard library,
which allows you to statically link it with GCC, if you want.
3. (Optional) If you have multiple compilers installed on your platform,
make sure that Conan and CMake select the one you want to use.
This setting will set the correct variables (`CMAKE_<LANG>_COMPILER`)
in the generated CMake toolchain file.
```
# Conan 1.x
conan export external/snappy snappy/1.1.10@
# Conan 2.x
conan export --version 1.1.10 external/snappy
# In ~/.conan2/profiles/default, add under [conf] section:
[conf]
tools.build:compiler_executables={"c": "<path>", "cpp": "<path>"}
```
Export our [Conan recipe for RocksDB](./external/rocksdb).
It does not override paths to dependencies when building with Visual Studio.
For setting environment variables for dependencies:
```
# Conan 1.x
conan export external/rocksdb rocksdb/6.29.5@
# Conan 2.x
conan export --version 6.29.5 external/rocksdb
# In ~/.conan2/profiles/default, add under [buildenv] section:
[buildenv]
CC=<path>
CXX=<path>
```
Export our [Conan recipe for SOCI](./external/soci).
It patches their CMake to correctly import its dependencies.
4. Export our [Conan recipe for Snappy](./external/snappy).
It doesn't explicitly link the C++ standard library,
which allows you to statically link it with GCC, if you want.
```
# Conan 1.x
conan export external/soci soci/4.0.3@
# Conan 2.x
conan export --version 4.0.3 external/soci
conan export external/snappy --version 1.1.10 --user xahaud --channel stable
```
Export our [Conan recipe for NuDB](./external/nudb).
It fixes some source files to add missing `#include`s.
5. Export our [Conan recipe for SOCI](./external/soci).
It patches their CMake to correctly import its dependencies.
```
# Conan 1.x
conan export external/nudb nudb/2.0.8@
# Conan 2.x
conan export --version 2.0.8 external/nudb
conan export external/soci --version 4.0.3 --user xahaud --channel stable
```
6. Export our [Conan recipe for WasmEdge](./external/wasmedge).
```
conan export external/wasmedge --version 0.11.2 --user xahaud --channel stable
```
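Putting the profile settings above together, a complete Conan 2 default profile for a Linux GCC build might look like the sketch below. The compiler version, executable paths, and build type are illustrative; adjust them for your toolchain.

```
# Illustrative only: write a complete default profile in one go.
mkdir -p ~/.conan2/profiles
cat > ~/.conan2/profiles/default <<'EOF'
[settings]
arch=x86_64
build_type=Release
compiler=gcc
compiler.cppstd=20
compiler.libcxx=libstdc++11
compiler.version=13
os=Linux

[conf]
tools.build:compiler_executables={"c": "/usr/bin/gcc-13", "cpp": "/usr/bin/g++-13"}
EOF
conan profile show
```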
### Build and Test
@@ -239,13 +185,13 @@ It fixes some source files to add missing `#include`s.
generated by the first. You can pass the build type on the command line with
`--settings build_type=$BUILD_TYPE` or in the profile itself,
under the section `[settings]` with the key `build_type`.
If you are using a Microsoft Visual C++ compiler,
then you will need to ensure consistency between the `build_type` setting
and the `compiler.runtime` setting.
When `build_type` is `Release`, `compiler.runtime` should be `MT`.
When `build_type` is `Debug`, `compiler.runtime` should be `MTd`.
```
@@ -257,28 +203,29 @@ It fixes some source files to add missing `#include`s.
`$OUTPUT_FOLDER/build/generators/conan_toolchain.cmake`.
Single-config generators:
```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release -Dxrpld=ON -Dtests=ON ..
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
```
Pass the CMake variable [`CMAKE_BUILD_TYPE`][build_type]
and make sure it matches the `build_type` setting you chose in the previous
step.
Multi-config generators:
Multi-config generators:
```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -Dxrpld=ON -Dtests=ON ..
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
```
**Note:** You can pass build options for `rippled` in this step.
**Note:** You can pass build options for `xahaud` in this step.
4. Build `rippled`.
4. Build `xahaud`.
For a single-configuration generator, it will build whatever configuration
you passed for `CMAKE_BUILD_TYPE`. For a multi-configuration generator,
you must pass the option `--config` to select the build configuration.
The output file is currently named 'rippled'.
Single-config generators:
@@ -287,13 +234,13 @@ It fixes some source files to add missing `#include`s.
```
Multi-config generators:
```
cmake --build . --config Release
cmake --build . --config Debug
```
5. Test rippled.
5. Test xahaud.
Single-config generators:
@@ -308,79 +255,19 @@ It fixes some source files to add missing `#include`s.
./Debug/rippled --unittest
```
The location of `rippled` in your build directory depends on your CMake
The location of `xahaud` in your build directory depends on your CMake
generator. Pass `--help` to see the rest of the command line options.
## Coverage report
The coverage report is intended for developers using the GCC
or Clang compilers (including Apple Clang). It is generated by the build target `coverage`,
which is only enabled when the `coverage` option is set, e.g. with
`--options coverage=True` in `conan` or the `-Dcoverage=ON` variable in `cmake`.
Prerequisites for the coverage report:
- [gcovr tool][gcovr] (can be installed e.g. with [pip][python-pip])
- `gcov` for GCC (installed with the compiler by default) or
- `llvm-cov` for Clang (installed with the compiler by default)
- `Debug` build type
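For example, `gcovr` can be installed with pip (a minimal sketch; use whichever Python environment you prefer):
```
pip install gcovr
```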
A coverage report is created when the following steps are completed, in order:
1. `rippled` binary built with instrumentation data, enabled by the `coverage`
option mentioned above
2. completed run of unit tests, which populates coverage capture data
3. completed run of the `gcovr` tool (which internally invokes either `gcov` or `llvm-cov`)
to assemble both instrumentation data and the coverage capture data into a coverage report
The above steps are automated into a single target `coverage`. The instrumented
`rippled` binary can also be used for regular development or testing work, at
the cost of extra disk space utilization and a small performance hit
(to store coverage capture). In case of a spurious failure of unit tests, it is
possible to re-run the `coverage` target without rebuilding the `rippled` binary
(since it is simply a dependency of the coverage report target). It is also possible
to select only specific tests for the purpose of the coverage report, by setting
the `coverage_test` variable in `cmake`.
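As a sketch, assuming a hypothetical suite name such as `ripple.app.Flow` (substitute a real unit test suite), the report can be limited to that suite like this:
```
# configure as usual (toolchain and other flags omitted here; see the full example below),
# limiting coverage capture to one test suite
cmake -DCMAKE_BUILD_TYPE=Debug -Dcoverage=ON -Dcoverage_test="ripple.app.Flow" ..
cmake --build . --target coverage
```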
The default coverage report format is `html-details`, but the user
can override it to any of the formats listed in `cmake/CodeCoverage.cmake`
by setting the `coverage_format` variable in `cmake`. It is also possible
to generate more than one format at a time by setting the `coverage_extra_args`
variable in `cmake`. The specific command line used to run the `gcovr` tool will be
displayed if the `CODE_COVERAGE_VERBOSE` variable is set.
By default, the code coverage tool runs parallel unit tests with `--unittest-jobs`
set to the number of available CPU cores. This may cause spurious test
errors on Apple. Developers can override the number of unit test jobs with
the `coverage_test_parallelism` variable in `cmake`.
Example use with some cmake variables set:
```
cd .build
conan install .. --output-folder . --build missing --settings build_type=Debug
cmake -DCMAKE_BUILD_TYPE=Debug -Dcoverage=ON -Dxrpld=ON -Dtests=ON -Dcoverage_test_parallelism=2 -Dcoverage_format=html-details -Dcoverage_extra_args="--json coverage.json" -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
cmake --build . --target coverage
```
After the `coverage` target is completed, the generated coverage report will be
stored inside the build directory, as either of:
- a file named `coverage.`_extension_, with a suitable extension for the report format, or
- a directory named `coverage`, containing `index.html` and other files, for the `html-details` or `html-nested` report formats.
## Options
| Option | Default Value | Description |
| --- | --- | --- |
| `assert` | OFF | Enable assertions. |
| `coverage` | OFF | Prepare the coverage report. |
| `san` | N/A | Enable a sanitizer with Clang. Choices are `thread` and `address`. |
| `tests` | OFF | Build tests. |
| `reporting` | OFF | Build the reporting mode feature. |
| `tests` | ON | Build tests. |
| `unity` | ON | Configure a unity build. |
| `xrpld` | OFF | Build the xrpld (`rippled`) application, and not just the libxrpl library. |
| `san` | N/A | Enable a sanitizer with Clang. Choices are `thread` and `address`. |
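As a sketch of how these options are passed (names taken from the table above; the values are only illustrative):
```
# via Conan, when installing dependencies
conan install .. --output-folder . --build missing --options tests=True --options assert=True
# or via CMake, when configuring the build
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release -Dtests=ON -Dassert=ON -Dunity=OFF ..
```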
[Unity builds][5] may be faster for the first build
(at the cost of much more memory) since they concatenate sources into fewer
@@ -396,35 +283,26 @@ and can be helpful for detecting `#include` omissions.
If you have trouble building dependencies after changing Conan settings,
try removing the Conan cache.
For Conan 2:
```
rm -rf ~/.conan/data
rm -rf ~/.conan2/p
```
Or clear the entire Conan 2 cache:
```
conan cache clean "*"
```
### no std::result_of
### macOS compilation with Apple Clang 17+
If your compiler version is recent enough to have removed `std::result_of` as
part of C++20, e.g. Apple Clang 15.0, then you might need to add a preprocessor
definition to your build.
If you're on macOS with Apple Clang 17 or newer, you need to add a compiler flag to work around a compilation error in gRPC dependencies.
Edit `~/.conan2/profiles/default` and add under the `[conf]` section:
```
conan profile update 'options.boost:extra_b2_flags="define=BOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'env.CFLAGS="-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'conf.tools.build:cflags+=["-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"]' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"]' default
```
### call to 'async_teardown' is ambiguous
If you are compiling with an early version of Clang 16, then you might hit
a [regression][6] when compiling C++20 that manifests as an [error in a Boost
header][7]. You can work around it by adding this preprocessor definition:
```
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_DISABLE_CONCEPTS"' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]' default
[conf]
tools.build:cxxflags=["-Wno-missing-template-arg-list-after-template-kw"]
```
@@ -579,10 +457,6 @@ but it is more convenient to put them in a [profile][profile].
[1]: https://github.com/conan-io/conan-center-index/issues/13168
[5]: https://en.wikipedia.org/wiki/Unity_build
[6]: https://github.com/boostorg/beast/issues/2648
[7]: https://github.com/boostorg/beast/issues/2661
[gcovr]: https://gcovr.com/en/stable/getting-started.html
[python-pip]: https://packaging.python.org/en/latest/guides/installing-using-pip-and-virtual-environments/
[build_type]: https://cmake.org/cmake/help/latest/variable/CMAKE_BUILD_TYPE.html
[runtime]: https://cmake.org/cmake/help/latest/variable/CMAKE_MSVC_RUNTIME_LIBRARY.html
[toolchain]: https://cmake.org/cmake/help/latest/manual/cmake-toolchains.7.html

View File

@@ -21,7 +21,7 @@ mkdir results
includes="$( pwd )/results/rawincludes.txt"
pushd ../..
echo Raw includes:
grep -r '#include.*/.*\.h' include src | \
grep -r '^[ ]*#include.*/.*\.h' include src | \
grep -v boost | tee ${includes}
popd
pushd results

View File

@@ -10,8 +10,8 @@ Loop: test.jtx test.toplevel
Loop: test.jtx test.unit_test
test.unit_test == test.jtx
Loop: xrpl.basics xrpl.json
xrpl.json == xrpl.basics
Loop: xrpl.hook xrpld.app
xrpld.app > xrpl.hook
Loop: xrpl.protocol xrpld.app
xrpld.app > xrpl.protocol

View File

@@ -1,5 +1,4 @@
libxrpl.basics > xrpl.basics
libxrpl.basics > xrpl.protocol
libxrpl.crypto > xrpl.basics
libxrpl.json > xrpl.basics
libxrpl.json > xrpl.json
@@ -19,6 +18,7 @@ test.app > xrpl.basics
test.app > xrpld.app
test.app > xrpld.core
test.app > xrpld.ledger
test.app > xrpld.nodestore
test.app > xrpld.overlay
test.app > xrpld.rpc
test.app > xrpl.hook
@@ -85,6 +85,7 @@ test.nodestore > xrpld.core
test.nodestore > xrpld.nodestore
test.nodestore > xrpld.unity
test.overlay > test.jtx
test.overlay > test.toplevel
test.overlay > test.unit_test
test.overlay > xrpl.basics
test.overlay > xrpld.app
@@ -102,6 +103,10 @@ test.protocol > test.toplevel
test.protocol > xrpl.basics
test.protocol > xrpl.json
test.protocol > xrpl.protocol
test.rdb > test.jtx
test.rdb > test.toplevel
test.rdb > xrpld.app
test.rdb > xrpld.core
test.resource > test.unit_test
test.resource > xrpl.basics
test.resource > xrpl.resource
@@ -135,6 +140,8 @@ test.toplevel > test.csf
test.toplevel > xrpl.json
test.unit_test > xrpl.basics
xrpl.hook > xrpl.basics
xrpl.hook > xrpl.protocol
xrpl.json > xrpl.basics
xrpl.protocol > xrpl.basics
xrpl.protocol > xrpl.json
xrpl.resource > xrpl.basics
@@ -148,7 +155,6 @@ xrpld.app > xrpl.basics
xrpld.app > xrpld.conditions
xrpld.app > xrpld.consensus
xrpld.app > xrpld.perflog
xrpld.app > xrpl.hook
xrpld.app > xrpl.json
xrpld.app > xrpl.resource
xrpld.conditions > xrpl.basics

View File

@@ -24,15 +24,33 @@ set(Boost_NO_BOOST_CMAKE ON)
# make GIT_COMMIT_HASH define available to all sources
find_package(Git)
if(Git_FOUND)
execute_process(COMMAND ${GIT_EXECUTABLE} --git-dir=${CMAKE_CURRENT_SOURCE_DIR}/.git describe --always --abbrev=40
execute_process(COMMAND ${GIT_EXECUTABLE} --git-dir=${CMAKE_CURRENT_SOURCE_DIR}/.git rev-parse HEAD
OUTPUT_STRIP_TRAILING_WHITESPACE OUTPUT_VARIABLE gch)
if(gch)
set(GIT_COMMIT_HASH "${gch}")
message(STATUS gch: ${GIT_COMMIT_HASH})
add_definitions(-DGIT_COMMIT_HASH="${GIT_COMMIT_HASH}")
endif()
execute_process(COMMAND ${GIT_EXECUTABLE} --git-dir=${CMAKE_CURRENT_SOURCE_DIR}/.git rev-parse --abbrev-ref HEAD
OUTPUT_STRIP_TRAILING_WHITESPACE OUTPUT_VARIABLE gb)
if(gb)
set(GIT_BRANCH "${gb}")
message(STATUS gb: ${GIT_BRANCH})
add_definitions(-DGIT_BRANCH="${GIT_BRANCH}")
endif()
endif() #git
# make SOURCE_ROOT_PATH define available for logging
set(SOURCE_ROOT_PATH "${CMAKE_CURRENT_SOURCE_DIR}/src/")
add_definitions(-DSOURCE_ROOT_PATH="${SOURCE_ROOT_PATH}")
# BEAST_ENHANCED_LOGGING - adds file:line numbers and formatting to logs
# Automatically enabled for Debug builds via generator expression
# Can be explicitly controlled with -DBEAST_ENHANCED_LOGGING=ON/OFF
option(BEAST_ENHANCED_LOGGING "Include file and line numbers in log messages (auto: Debug=ON, Release=OFF)" OFF)
message(STATUS "BEAST_ENHANCED_LOGGING option: ${BEAST_ENHANCED_LOGGING}")
if(thread_safety_analysis)
add_compile_options(-Wthread-safety -D_LIBCPP_ENABLE_THREAD_SAFETY_ANNOTATIONS -DRIPPLE_ENABLE_THREAD_SAFETY_ANNOTATIONS)
add_compile_options("-stdlib=libc++")
@@ -50,11 +68,6 @@ if(CMAKE_TOOLCHAIN_FILE)
endif()
endif()
if (NOT USE_CONAN)
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake")
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake/deps")
endif()
include (CheckCXXCompilerFlag)
include (FetchContent)
include (ExternalProject)
@@ -66,9 +79,7 @@ endif ()
include(RippledSanity)
include(RippledVersion)
include(RippledSettings)
if (NOT USE_CONAN)
include(RippledNIH)
endif()
# this check has to remain in the top-level cmake
# because of the early return statement
if (packages_only)
@@ -80,79 +91,56 @@ endif ()
include(RippledCompiler)
include(RippledInterface)
###
if (NOT USE_CONAN)
set(SECP256K1_INSTALL TRUE)
add_subdirectory(external/secp256k1)
add_library(secp256k1::secp256k1 ALIAS secp256k1)
add_subdirectory(external/ed25519-donna)
include(deps/Boost)
include(deps/OpenSSL)
# include(deps/Secp256k1)
# include(deps/Ed25519-donna)
include(deps/Lz4)
include(deps/Libarchive)
include(deps/Sqlite)
include(deps/Soci)
include(deps/Rocksdb)
include(deps/Nudb)
include(deps/date)
# include(deps/Protobuf)
# include(deps/gRPC)
include(deps/cassandra)
include(deps/Postgres)
include(deps/WasmEdge)
else()
include(conan/Boost)
find_package(OpenSSL 1.1.1 REQUIRED)
set_target_properties(OpenSSL::SSL PROPERTIES
INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2
)
set(SECP256K1_INSTALL TRUE)
add_subdirectory(external/secp256k1)
add_library(secp256k1::secp256k1 ALIAS secp256k1)
add_subdirectory(external/ed25519-donna)
find_package(gRPC REQUIRED)
find_package(lz4 REQUIRED)
# Target names with :: are not allowed in a generator expression.
# We need to pull the include directories and imported location properties
# from separate targets.
find_package(LibArchive REQUIRED)
find_package(SOCI REQUIRED)
find_package(SQLite3 REQUIRED)
find_package(wasmedge REQUIRED)
option(rocksdb "Enable RocksDB" ON)
if(rocksdb)
find_package(RocksDB REQUIRED)
set_target_properties(RocksDB::rocksdb PROPERTIES
INTERFACE_COMPILE_DEFINITIONS RIPPLE_ROCKSDB_AVAILABLE=1
)
target_link_libraries(ripple_libs INTERFACE RocksDB::rocksdb)
endif()
find_package(nudb REQUIRED)
find_package(date REQUIRED)
find_package(xxHash REQUIRED)
if(TARGET nudb::core)
set(nudb nudb::core)
elseif(TARGET NuDB::nudb)
set(nudb NuDB::nudb)
else()
message(FATAL_ERROR "unknown nudb target")
endif()
target_link_libraries(ripple_libs INTERFACE ${nudb})
target_link_libraries(ripple_libs INTERFACE
ed25519::ed25519
lz4::lz4
OpenSSL::Crypto
OpenSSL::SSL
# Ripple::grpc_pbufs
# Ripple::pbufs
secp256k1::secp256k1
soci::soci
SQLite::SQLite3
include(deps/Boost)
find_package(OpenSSL 1.1.1 REQUIRED)
set_target_properties(OpenSSL::SSL PROPERTIES
INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2
)
set(SECP256K1_INSTALL TRUE)
add_subdirectory(external/secp256k1)
add_library(secp256k1::secp256k1 ALIAS secp256k1)
add_subdirectory(external/ed25519-donna)
add_subdirectory(external/antithesis-sdk)
find_package(gRPC REQUIRED)
find_package(lz4 REQUIRED)
# Target names with :: are not allowed in a generator expression.
# We need to pull the include directories and imported location properties
# from separate targets.
find_package(LibArchive REQUIRED)
find_package(SOCI REQUIRED)
find_package(SQLite3 REQUIRED)
include(deps/WasmEdge)
option(rocksdb "Enable RocksDB" ON)
if(rocksdb)
find_package(RocksDB REQUIRED)
set_target_properties(RocksDB::rocksdb PROPERTIES
INTERFACE_COMPILE_DEFINITIONS RIPPLE_ROCKSDB_AVAILABLE=1
)
target_link_libraries(ripple_libs INTERFACE RocksDB::rocksdb)
endif()
find_package(nudb REQUIRED)
find_package(date REQUIRED)
find_package(xxHash REQUIRED)
if(TARGET nudb::core)
set(nudb nudb::core)
elseif(TARGET NuDB::nudb)
set(nudb NuDB::nudb)
else()
message(FATAL_ERROR "unknown nudb target")
endif()
target_link_libraries(ripple_libs INTERFACE ${nudb})
target_link_libraries(ripple_libs INTERFACE
ed25519::ed25519
lz4::lz4
OpenSSL::Crypto
OpenSSL::SSL
# Ripple::grpc_pbufs
# Ripple::pbufs
secp256k1::secp256k1
soci::soci
SQLite::SQLite3
)
if(coverage)
include(RippledCov)
@@ -160,10 +148,6 @@ endif()
set(PROJECT_EXPORT_SET RippleExports)
include(RippledCore)
if (NOT USE_CONAN)
include(deps/Protobuf)
include(deps/gRPC)
endif()
include(RippledInstall)
include(RippledDocs)
include(RippledValidatorKeys)

View File

@@ -95,6 +95,19 @@ Refer to
["How to Write a Git Commit Message"](https://cbea.ms/git-commit/)
for general rules on writing a good commit message.
tl;dr
> 1. Separate subject from body with a blank line.
> 2. Limit the subject line to 50 characters.
> * [...] shoot for 50 characters, but consider 72 the hard limit.
> 3. Capitalize the subject line.
> 4. Do not end the subject line with a period.
> 5. Use the imperative mood in the subject line.
> * A properly formed Git commit subject line should always be able
> to complete the following sentence: "If applied, this commit will
> _your subject line here_".
> 6. Wrap the body at 72 characters.
> 7. Use the body to explain what and why vs. how.
In addition to those guidelines, please add one of the following
prefixes to the subject line if appropriate.
* `fix:` - The primary purpose is to fix an existing bug.
@@ -341,6 +354,68 @@ pip3 install pre-commit
pre-commit install
```
## Contracts and instrumentation
We are using [Antithesis](https://antithesis.com/) for continuous fuzzing,
and keep a copy of [Antithesis C++ SDK](https://github.com/antithesishq/antithesis-sdk-cpp/)
in `external/antithesis-sdk`. One of the aims of fuzzing is to identify bugs
by finding external conditions which cause contract violations inside `rippled`.
The contracts are expressed as `XRPL_ASSERT` or `UNREACHABLE` (defined in
`include/xrpl/beast/utility/instrumentation.h`), which are effectively (outside
of Antithesis) wrappers for `assert(...)` with an added name. The purpose of the
name is to give each contract a stable identity that does not rely on line numbers.
When `rippled` is built with the Antithesis instrumentation enabled
(using the `voidstar` CMake option) and run on the Antithesis platform, the
contracts become
[test properties](https://antithesis.com/docs/using_antithesis/properties.html);
otherwise they are just like a regular `assert`.
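As a sketch, the instrumentation is enabled at configure time via that CMake option (assuming the Conan/CMake workflow from the build instructions; other flags omitted):
```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Debug -Dvoidstar=ON ..
```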
To learn more about Antithesis, see
[How Antithesis Works](https://antithesis.com/docs/introduction/how_antithesis_works.html)
and [C++ SDK](https://antithesis.com/docs/using_antithesis/sdk/cpp/overview.html#)
We continue to use the old style `assert` or `assert(false)` in certain
locations, where the reporting of contract violations on the Antithesis
platform is either not possible or not useful.
For this reason:
* The locations where `assert` or `assert(false)` contracts should continue to be used:
* `constexpr` functions
* unit tests, i.e. files under `src/test`
* unit test-related modules (files under `beast/test` and `beast/unit_test`)
* Outside of the listed locations, do not use `assert`; use `XRPL_ASSERT` instead,
giving it a unique name with a short description of the contract.
* Outside of the listed locations, do not use `assert(false)`; use
`UNREACHABLE` instead, giving it a unique name with a description of the
condition being violated.
* The contract name should start with the full name (including scope) of the
function, optionally a named lambda, followed by a colon ` : ` and a brief
(typically at most five words) description. `UNREACHABLE` contracts
can use slightly longer descriptions. If there are multiple overloads of the
function, use common sense to balance brevity and unambiguity of the
function name. NOTE: the purpose of the name is to provide a stable means of
uniquely identifying every contract; for this reason, try to avoid elements
which can change in obvious refactors or when reinforcing the condition.
* Contract description typically (except for `UNREACHABLE`) should describe the
_expected_ condition, as in "I assert that _expected_ is true".
* Contract description for `UNREACHABLE` should describe the _unexpected_
situation which caused the line to have been reached.
* Example good name for an
`UNREACHABLE` macro `"Json::operator==(Value, Value) : invalid type"`; example
good name for an `XRPL_ASSERT` macro `"Json::Value::asCString : valid type"`.
* Example **bad** name
`"RFC1751::insert(char* s, int x, int start, int length) : length is greater than or equal zero"`
(missing namespace, unnecessary full function signature, description too verbose).
Good name: `"ripple::RFC1751::insert : minimum length"`.
* In a **few** well-justified cases a non-standard name can be used, in which case a
comment should be placed to explain the rationale (example in `contract.cpp`)
* Do **not** rename a contract without a good reason (e.g. the name no longer
reflects the location or the condition being checked)
* Do not use `std::unreachable`
* Do not put contracts where they can be violated by an external condition
(e.g. timing, data payload before mandatory validation etc.) as this creates
bogus bug reports (and causes crashes of Debug builds)
## Unit Tests
To execute all unit tests:
@@ -413,9 +488,10 @@ existing maintainer without a vote.
## Current Maintainers
* [Richard Holland](https://github.com/RichardAH) (XRPL Labs + XRP Ledger Foundation)
* [Denis Angell](https://github.com/dangell7) (XRPL Labs + XRP Ledger Foundation)
* [Wietse Wind](https://github.com/WietseWind) (XRPL Labs + XRP Ledger Foundation)
* [Richard Holland](https://github.com/RichardAH) (XRPL Labs + INFTF)
* [Denis Angell](https://github.com/dangell7) (XRPL Labs + INFTF)
* [Wietse Wind](https://github.com/WietseWind) (XRPL Labs + INFTF)
* [tequ](https://github.com/tequdev) (Independent + INFTF)
[1]: https://docs.github.com/en/get-started/quickstart/contributing-to-projects

View File

@@ -2,7 +2,7 @@
**Note:** Throughout this README, references to "we" or "our" pertain to the community and contributors involved in the Xahau network. It does not imply a legal entity or a specific collection of individuals.
[Xahau](https://xahau.network/) is a decentralized cryptographic ledger that builds upon the robust foundation of the XRP Ledger. It inherits the XRP Ledger's Byzantine Fault Tolerant consensus algorithm and enhances it with additional features and functionalities. Developers and users familiar with the XRP Ledger will find that most documentation and tutorials available on [xrpl.org](https://xrpl.org) are relevant and applicable to Xahau, including those related to running validators and managing validator keys. For Xahau specific documentation you can visit our [documentation](https://docs.xahau.network/)
[Xahau](https://xahau.network/) is a decentralized cryptographic ledger that builds upon the robust foundation of the XRP Ledger. It inherits the XRP Ledger's Byzantine Fault Tolerant consensus algorithm and enhances it with additional features and functionalities. Developers and users familiar with the XRP Ledger will find that most documentation and tutorials available on [xrpl.org](https://xrpl.org) are relevant and applicable to Xahau, including those related to running validators and managing validator keys. For Xahau specific documentation you can visit our [documentation](https://xahau.network/)
## XAH
XAH is the public, counterparty-free asset native to Xahau and functions primarily as network gas. Transactions submitted to the Xahau network must supply an appropriate amount of XAH, to be burnt by the network as a fee, in order to be successfully included in a validated ledger. In addition, XAH also acts as a bridge currency within the Xahau DEX. XAH is traded on the open-market and is available for anyone to access. Xahau was created in 2023 with a supply of 600 million units of XAH.
@@ -12,7 +12,7 @@ The server software that powers Xahau is called `xahaud` and is available in thi
### Build from Source
* [Read the build instructions in our documentation](https://docs.xahau.network/infrastructure/building-xahau)
* [Read the build instructions in our documentation](https://xahau.network/infrastructure/building-xahau)
* If you encounter any issues, please [open an issue](https://github.com/xahau/xahaud/issues)
## Highlights of Xahau
@@ -58,9 +58,9 @@ git-subtree. See those directories' README files for more details.
- **Documentation**: Documentation for XRPL, Xahau and Hooks.
- [Xrpl Documentation](https://xrpl.org)
- [Xahau Documentation](https://docs.xahau.network/)
- [Xahau Documentation](https://xahau.network/)
- [Hooks Technical Documentation](https://xrpl-hooks.readme.io/)
- **Explorers**: Explore the Xahau ledger using various explorers:
- **Explorers**: Explore the Xahau Network using various explorers:
- [xahauexplorer.com](https://xahauexplorer.com)
- [xahscan.com](https://xahscan.com)
- [xahau.xrpl.org](https://xahau.xrpl.org)

View File

@@ -7,6 +7,135 @@ This document contains the release notes for `rippled`, the reference server imp
Have new ideas? Need help with setting up your node? [Please open an issue here](https://github.com/xrplf/rippled/issues/new/choose).
# Version 2.3.1
Version 2.3.1 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available.
This is a hotfix release that includes the following updates:
- Fix an erroneous high fee penalty that peers could incur for sending older transactions.
- Update to the fees charged for imposing a load on the server.
- Prevent the relaying of internal pseudo-transactions.
- Before: Pseudo-transactions received from a peer will fail the signature check, even if they were requested (using TMGetObjectByHash) because they have no signature. This causes the peer to be charged for an invalid signature.
- After: Pseudo-transactions are put into the global cache (TransactionMaster) only. If the transaction is not part of a TMTransactions batch, the peer is charged an unwanted data fee. These fees will not be a problem in the normal course of operations but should dissuade peers from behaving badly by sending a bunch of junk.
- Improved logging now specifies the reason for the fee charged to the peer.
[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)
<!-- BREAK -->
## Action Required
If you run an XRP Ledger validator, upgrade to version 2.3.1 as soon as possible to ensure stable and uninterrupted network behavior.
## Changelog
### Amendments and New Features
- None
### Bug Fixes and Performance Improvements
- Change the charged fee for sending older transactions from feeInvalidSignature to feeUnwantedData. [#5243](https://github.com/XRPLF/rippled/pull/5243)
### Docs and Build System
- None
### GitHub
The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.
We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.
## Credits
The following people contributed directly to this release:
Ed Hennis <ed@ripple.com>
JoelKatz <DavidJoelSchwartz@GMail.com>
Sophia Xie <106177003+sophiax851@users.noreply.github.com>
Valentin Balaschenko <13349202+vlntb@users.noreply.github.com>
Bug Bounties and Responsible Disclosures:
We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.
To report a bug, please send a detailed report to: <bugs@xrpl.org>
# Version 2.3.0
Version 2.3.0 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release includes 8 new amendments, including Multi-Purpose Tokens, Credentials, Clawback support for AMMs, and the ability to make offers as part of minting NFTs. Additionally, this release includes important fixes for stability, so server operators are encouraged to upgrade as soon as possible.
## Action Required
If you run an XRP Ledger server, upgrade to version 2.3.0 as soon as possible to ensure service continuity.
Additionally, new amendments are now open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network.
## Full Changelog
### Amendments
The following amendments are open for voting with this release:
- **XLS-70 Credentials** - Users can issue Credentials on the ledger and use Credentials to pre-approve incoming payments when using Deposit Authorization instead of individually approving payers. ([#5103](https://github.com/XRPLF/rippled/pull/5103))
- related fix: #5189 (https://github.com/XRPLF/rippled/pull/5189)
- **XLS-33 Multi-Purpose Tokens** - A new type of fungible token optimized for institutional DeFi including stablecoins. ([#5143](https://github.com/XRPLF/rippled/pull/5143))
- **XLS-37 AMM Clawback** - Allows clawback-enabled tokens to be used in AMMs, with appropriate guardrails. ([#5142](https://github.com/XRPLF/rippled/pull/5142))
- **XLS-52 NFTokenMintOffer** - Allows creating an NFT sell offer as part of minting a new NFT. ([#4845](https://github.com/XRPLF/rippled/pull/4845))
- **fixAMMv1_2** - Fixes two bugs in Automated Market Maker (AMM) transaction processing. ([#5176](https://github.com/XRPLF/rippled/pull/5176))
- **fixNFTokenPageLinks** - Fixes a bug that can cause NFT directories to have missing links, and introduces a transaction to repair corrupted ledger state. ([#4945](https://github.com/XRPLF/rippled/pull/4945))
- **fixEnforceNFTokenTrustline** - Fixes two bugs in the interaction between NFT offers and trust lines. ([#4946](https://github.com/XRPLF/rippled/pull/4946))
- **fixInnerObjTemplate2** - Standardizes the way inner objects are enforced across all transaction and ledger data. ([#5047](https://github.com/XRPLF/rippled/pull/5047))
The following amendment is partially implemented but not open for voting:
- **InvariantsV1_1** - Adds new invariants to ensure transactions process as intended, starting with an invariant to ensure that ledger entries owned by an account are deleted when the account is deleted. ([#4663](https://github.com/XRPLF/rippled/pull/4663))
### New Features
- Allow configuration of SQLite database page size. ([#5135](https://github.com/XRPLF/rippled/pull/5135), [#5140](https://github.com/XRPLF/rippled/pull/5140))
- In the `libxrpl` C++ library, provide a list of known amendments. ([#5026](https://github.com/XRPLF/rippled/pull/5026))
### Deprecations
- History Shards are removed. ([#5066](https://github.com/XRPLF/rippled/pull/5066))
- Reporting mode is removed. ([#5092](https://github.com/XRPLF/rippled/pull/5092))
For users wanting to store more ledger history, it is recommended to run a Clio server instead.
### Bug fixes
- Fix a crash in debug builds when amm_info request contains an invalid AMM account ID. ([#5188](https://github.com/XRPLF/rippled/pull/5188))
- Fix a crash caused by a race condition in peer-to-peer code. ([#5071](https://github.com/XRPLF/rippled/pull/5071))
- Fix a crash in certain situations
- Fix several bugs in the book_changes API method. ([#5096](https://github.com/XRPLF/rippled/pull/5096))
- Fix bug triggered by providing an invalid marker to the account_nfts API method. ([#5045](https://github.com/XRPLF/rippled/pull/5045))
- Accept lower-case hexadecimal in compact transaction identifier (CTID) parameters in API methods. ([#5049](https://github.com/XRPLF/rippled/pull/5049))
- Disallow filtering by types that an account can't own in the account_objects API method. ([#5056](https://github.com/XRPLF/rippled/pull/5056))
- Fix error code returned by the feature API method when providing an invalid parameter. ([#5063](https://github.com/XRPLF/rippled/pull/5063))
- (API v3) Fix error code returned by amm_info when providing invalid parameters. ([#4924](https://github.com/XRPLF/rippled/pull/4924))
### Other Improvements
- Adds a new default hub, hubs.xrpkuwait.com, to the config file and bootstrapping code. ([#5169](https://github.com/XRPLF/rippled/pull/5169))
- Improve error message when commandline interface fails with `rpcInternal` because there was no response from the server. ([#4959](https://github.com/XRPLF/rippled/pull/4959))
- Add tools for debugging specific transactions via replay. ([#5027](https://github.com/XRPLF/rippled/pull/5027), [#5087](https://github.com/XRPLF/rippled/pull/5087))
- Major reorganization of source code files. ([#4997](https://github.com/XRPLF/rippled/pull/4997))
- Add new unit tests. ([#4886](https://github.com/XRPLF/rippled/pull/4886))
- Various improvements to build tools and contributor documentation. ([#5001](https://github.com/XRPLF/rippled/pull/5001), [#5028](https://github.com/XRPLF/rippled/pull/5028), [#5052](https://github.com/XRPLF/rippled/pull/5052), [#5091](https://github.com/XRPLF/rippled/pull/5091), [#5084](https://github.com/XRPLF/rippled/pull/5084), [#5120](https://github.com/XRPLF/rippled/pull/5120), [#5010](https://github.com/XRPLF/rippled/pull/5010), [#5055](https://github.com/XRPLF/rippled/pull/5055), [#5067](https://github.com/XRPLF/rippled/pull/5067), [#5061](https://github.com/XRPLF/rippled/pull/5061), [#5072](https://github.com/XRPLF/rippled/pull/5072), [#5044](https://github.com/XRPLF/rippled/pull/5044))
- Various code cleanup and refactoring. ([#4509](https://github.com/XRPLF/rippled/pull/4509), [#4521](https://github.com/XRPLF/rippled/pull/4521), [#4856](https://github.com/XRPLF/rippled/pull/4856), [#5190](https://github.com/XRPLF/rippled/pull/5190), [#5081](https://github.com/XRPLF/rippled/pull/5081), [#5053](https://github.com/XRPLF/rippled/pull/5053), [#5058](https://github.com/XRPLF/rippled/pull/5058), [#5122](https://github.com/XRPLF/rippled/pull/5122), [#5059](https://github.com/XRPLF/rippled/pull/5059), [#5041](https://github.com/XRPLF/rippled/pull/5041))
Bug Bounties and Responsible Disclosures:
We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.
To report a bug, please send a detailed report to: <bugs@xrpl.org>
# Version 2.2.3
Version 2.2.3 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release fixes a problem that can cause full-history servers to run out of space in their SQLite databases, depending on configuration. There are no new amendments in this release.

View File

@@ -62,11 +62,11 @@ For these complaints or reports, please [contact our support team](mailto:bugs@x
### The following type of security problems are excluded
1. **In scope**. Only bugs in software under the scope of the program qualify. Currently, that means `xahaud` and `xahau-lib`.
2. **Relevant**. A security issue, posing a danger to user funds, privacy or the operation of the Xahau Ledger.
2. **Relevant**. A security issue, posing a danger to user funds, privacy or the operation of the Xahau Network.
3. **Original and previously unknown**. Bugs that are already known and discussed in public do not qualify. Previously reported bugs, even if publicly unknown, are not eligible.
4. **Specific**. We welcome general security advice or recommendations, but we cannot pay bounties for that.
5. **Fixable**. There has to be something we can do to permanently fix the problem. Note that bugs in other peoples software may still qualify in some cases. For example, if you find a bug in a library that we use which can compromise the security of software that is in scope and we can get it fixed, you may qualify for a bounty.
6. **Unused**. If you use the exploit to attack the Xahau Ledger, you do not qualify for a bounty. If you report a vulnerability used in an ongoing or past attack and there is specific, concrete evidence that suggests you are the attacker we reserve the right not to pay a bounty.
6. **Unused**. If you use the exploit to attack the Xahau Network, you do not qualify for a bounty. If you report a vulnerability used in an ongoing or past attack and there is specific, concrete evidence that suggests you are the attacker we reserve the right not to pay a bounty.
Please note: Reports that are lacking any proof (such as screenshots or other data), detailed information or details on how to reproduce any unexpected result will be investigated but will not be eligible for any reward.

View File

@@ -5,8 +5,6 @@
# debugging.
set -ex
set -e
echo "START INSIDE CONTAINER - CORE"
echo "-- BUILD CORES: $3"
@@ -14,6 +12,13 @@ echo "-- GITHUB_REPOSITORY: $1"
echo "-- GITHUB_SHA: $2"
echo "-- GITHUB_RUN_NUMBER: $4"
# Use mounted filesystem for temp files to avoid container space limits
export TMPDIR=/io/tmp
export TEMP=/io/tmp
export TMP=/io/tmp
mkdir -p /io/tmp
echo "=== Using temp directory: /io/tmp ==="
umask 0000;
cd /io/ &&
@@ -27,7 +32,8 @@ if [[ "$?" -ne "0" ]]; then
exit 127
fi
perl -i -pe "s/^(\\s*)-DBUILD_SHARED_LIBS=OFF/\\1-DBUILD_SHARED_LIBS=OFF\\n\\1-DROCKSDB_BUILD_SHARED=OFF/g" cmake/deps/Rocksdb.cmake &&
BUILD_TYPE=Release
mv cmake/deps/WasmEdge.cmake cmake/deps/WasmEdge.old &&
echo "find_package(LLVM REQUIRED CONFIG)
message(STATUS \"Found LLVM \${LLVM_PACKAGE_VERSION}\")
@@ -38,18 +44,47 @@ target_link_libraries (ripple_libs INTERFACE wasmedge)
add_library (wasmedge::wasmedge ALIAS wasmedge)
message(\"WasmEdge DONE\")
" > cmake/deps/WasmEdge.cmake &&
git checkout src/ripple/protocol/impl/BuildInfo.cpp &&
sed -i s/\"0.0.0\"/\"$(date +%Y).$(date +%-m).$(date +%-d)-$(git rev-parse --abbrev-ref HEAD)+$4\"/g src/ripple/protocol/impl/BuildInfo.cpp &&
export LDFLAGS="-static-libstdc++"
export CMAKE_EXE_LINKER_FLAGS="-static-libstdc++"
export CMAKE_STATIC_LINKER_FLAGS="-static-libstdc++"
git config --global --add safe.directory /io &&
git checkout src/libxrpl/protocol/BuildInfo.cpp &&
sed -i s/\"0.0.0\"/\"$(date +%Y).$(date +%-m).$(date +%-d)-$(git rev-parse --abbrev-ref HEAD)$(if [ -n "$4" ]; then echo "+$4"; fi)\"/g src/libxrpl/protocol/BuildInfo.cpp &&
conan export external/snappy --version 1.1.10 --user xahaud --channel stable &&
conan export external/soci --version 4.0.3 --user xahaud --channel stable &&
conan export external/wasmedge --version 0.11.2 --user xahaud --channel stable &&
cd release-build &&
cmake .. -DCMAKE_BUILD_TYPE=Release -DBoost_NO_BOOST_CMAKE=ON -DLLVM_DIR=/usr/lib64/llvm13/lib/cmake/llvm/ -DLLVM_LIBRARY_DIR=/usr/lib64/llvm13/lib/ -DWasmEdge_LIB=/usr/local/lib64/libwasmedge.a &&
make -j$3 VERBOSE=1 &&
# Install dependencies - tool_requires in conanfile.py handles glibc 2.28 compatibility
# for build tools (protoc, grpc plugins, b2) in HBB environment
# The tool_requires('b2/5.3.2') in conanfile.py should force b2 to build from source
# with the correct toolchain, avoiding the GLIBCXX_3.4.29 issue
echo "=== Installing dependencies ===" &&
conan install .. --output-folder . --build missing --settings build_type=$BUILD_TYPE \
-o with_wasmedge=False -o tool_requires_b2=True &&
cmake .. -G Ninja \
-DCMAKE_BUILD_TYPE=$BUILD_TYPE \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_EXE_LINKER_FLAGS="-static-libstdc++" \
-DLLVM_DIR=$LLVM_DIR \
-DWasmEdge_LIB=$WasmEdge_LIB \
-Dxrpld=TRUE \
-Dtests=TRUE &&
ccache -z &&
ninja -j $3 && echo "=== Re-running final link with verbose output ===" && rm -f rippled && ninja -v rippled &&
ccache -s &&
strip -s rippled &&
mv rippled xahaud &&
echo "=== Full ldd output ===" &&
ldd xahaud &&
echo "=== Running libcheck ===" &&
libcheck xahaud &&
echo "Build host: `hostname`" > release.info &&
echo "Build date: `date`" >> release.info &&
echo "Build md5: `md5sum xahaud`" >> release.info &&
echo "Git remotes:" >> release.info &&
git remote -v >> release.info
git remote -v >> release.info &&
echo "Git status:" >> release.info &&
git status -v >> release.info &&
echo "Git log [last 20]:" >> release.info &&
@@ -68,9 +103,9 @@ fi
cd ..;
mv src/ripple/net/impl/RegisterSSLCerts.cpp.old src/ripple/net/impl/RegisterSSLCerts.cpp;
mv cmake/deps/Rocksdb.cmake.old cmake/deps/Rocksdb.cmake;
mv src/xrpld/net/detail/RegisterSSLCerts.cpp.old src/xrpld/net/detail/RegisterSSLCerts.cpp;
mv cmake/deps/WasmEdge.old cmake/deps/WasmEdge.cmake;
rm src/certs/certbundle.h;
git checkout src/libxrpl/protocol/BuildInfo.cpp;
echo "END INSIDE CONTAINER - CORE"

View File

@@ -3,8 +3,6 @@
# processes launched or upon any unbound variable.
# We use set -x to print commands before running them to help with
# debugging.
set -ex
set -e
echo "START INSIDE CONTAINER - FULL"
@@ -16,21 +14,14 @@ echo "-- GITHUB_RUN_NUMBER: $4"
umask 0000;
echo "Fixing CentOS 7 EOL"
sed -i 's/mirrorlist/#mirrorlist/g' /etc/yum.repos.d/CentOS-*
sed -i 's|#baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g' /etc/yum.repos.d/CentOS-*
yum clean all
yum-config-manager --disable centos-sclo-sclo
####
cd /io;
mkdir -p src/certs;
curl --silent -k https://raw.githubusercontent.com/RichardAH/rippled-release-builder/main/ca-bundle/certbundle.h -o src/certs/certbundle.h;
if [ "`grep certbundle.h src/ripple/net/impl/RegisterSSLCerts.cpp | wc -l`" -eq "0" ]
if [ "`grep certbundle.h src/xrpld/net/detail/RegisterSSLCerts.cpp | wc -l`" -eq "0" ]
then
cp src/ripple/net/impl/RegisterSSLCerts.cpp src/ripple/net/impl/RegisterSSLCerts.cpp.old
cp src/xrpld/net/detail/RegisterSSLCerts.cpp src/xrpld/net/detail/RegisterSSLCerts.cpp.old
perl -i -pe "s/^{/{
#ifdef EMBEDDED_CA_BUNDLE
BIO *cbio = BIO_new_mem_buf(ca_bundle.data(), ca_bundle.size());
@@ -70,95 +61,18 @@ then
BIO_free(cbio);
}
}
#endif/g" src/ripple/net/impl/RegisterSSLCerts.cpp &&
sed -i "s/#include <ripple\/net\/RegisterSSLCerts.h>/\0\n#include <certs\/certbundle.h>/g" src/ripple/net/impl/RegisterSSLCerts.cpp
#endif/g" src/xrpld/net/detail/RegisterSSLCerts.cpp &&
sed -i "s/#include <xrpld\/net\/RegisterSSLCerts.h>/\0\n#include <certs\/certbundle.h>/g" src/xrpld/net/detail/RegisterSSLCerts.cpp
fi
mkdir -p .nih_c;
mkdir -p .nih_toolchain;
cd .nih_toolchain &&
yum install -y wget lz4 lz4-devel git llvm13-static.x86_64 llvm13-devel.x86_64 devtoolset-10-binutils zlib-static ncurses-static -y \
devtoolset-7-gcc-c++ \
devtoolset-9-gcc-c++ \
devtoolset-10-gcc-c++ \
snappy snappy-devel \
zlib zlib-devel \
lz4-devel \
libasan &&
export PATH=`echo $PATH | sed -E "s/devtoolset-9/devtoolset-7/g"` &&
echo "-- Install ZStd 1.1.3 --" &&
yum install epel-release -y &&
ZSTD_VERSION="1.1.3" &&
( wget -nc -q -O zstd-${ZSTD_VERSION}.tar.gz https://github.com/facebook/zstd/archive/v${ZSTD_VERSION}.tar.gz; echo "" ) &&
tar xzvf zstd-${ZSTD_VERSION}.tar.gz &&
cd zstd-${ZSTD_VERSION} &&
make -j$3 install &&
cd .. &&
echo "-- Install Cmake 3.23.1 --" &&
pwd &&
( wget -nc -q https://github.com/Kitware/CMake/releases/download/v3.23.1/cmake-3.23.1-linux-x86_64.tar.gz; echo "" ) &&
tar -xzf cmake-3.23.1-linux-x86_64.tar.gz -C /hbb/ &&
echo "-- Install Boost 1.86.0 --" &&
pwd &&
( wget -nc -q https://archives.boost.io/release/1.86.0/source/boost_1_86_0.tar.gz; echo "" ) &&
tar -xzf boost_1_86_0.tar.gz &&
cd boost_1_86_0 && ./bootstrap.sh && ./b2 link=static -j$3 && ./b2 install &&
cd ../ &&
echo "-- Install Protobuf 3.20.0 --" &&
pwd &&
( wget -nc -q https://github.com/protocolbuffers/protobuf/releases/download/v3.20.0/protobuf-all-3.20.0.tar.gz; echo "" ) &&
tar -xzf protobuf-all-3.20.0.tar.gz &&
cd protobuf-3.20.0/ &&
./autogen.sh && ./configure --prefix=/usr --disable-shared link=static && make -j$3 && make install &&
cd .. &&
echo "-- Build LLD --" &&
pwd &&
ln /usr/bin/llvm-config-13 /usr/bin/llvm-config &&
mv /opt/rh/devtoolset-9/root/usr/bin/ar /opt/rh/devtoolset-9/root/usr/bin/ar-9 &&
ln /opt/rh/devtoolset-10/root/usr/bin/ar /opt/rh/devtoolset-9/root/usr/bin/ar &&
( wget -nc -q https://github.com/llvm/llvm-project/releases/download/llvmorg-13.0.1/lld-13.0.1.src.tar.xz; echo "" ) &&
( wget -nc -q https://github.com/llvm/llvm-project/releases/download/llvmorg-13.0.1/libunwind-13.0.1.src.tar.xz; echo "" ) &&
tar -xf lld-13.0.1.src.tar.xz &&
tar -xf libunwind-13.0.1.src.tar.xz &&
cp -r libunwind-13.0.1.src/include libunwind-13.0.1.src/src lld-13.0.1.src/ &&
cd lld-13.0.1.src &&
rm -rf build CMakeCache.txt &&
mkdir -p build &&
cd build &&
cmake .. -DLLVM_LIBRARY_DIR=/usr/lib64/llvm13/lib/ -DCMAKE_INSTALL_PREFIX=/usr/lib64/llvm13/ -DCMAKE_BUILD_TYPE=Release &&
make -j$3 install &&
ln -s /usr/lib64/llvm13/lib/include/lld /usr/include/lld &&
cp /usr/lib64/llvm13/lib/liblld*.a /usr/local/lib/ &&
cd ../../ &&
echo "-- Build WasmEdge --" &&
( wget -nc -q https://github.com/WasmEdge/WasmEdge/archive/refs/tags/0.11.2.zip; unzip -o 0.11.2.zip; ) &&
cd WasmEdge-0.11.2 &&
( mkdir -p build; echo "" ) &&
cd build &&
export BOOST_ROOT="/usr/local/src/boost_1_86_0" &&
export Boost_LIBRARY_DIRS="/usr/local/lib" &&
export BOOST_INCLUDEDIR="/usr/local/src/boost_1_86_0" &&
export PATH=`echo $PATH | sed -E "s/devtoolset-7/devtoolset-9/g"` &&
cmake .. \
-DCMAKE_BUILD_TYPE=Release \
-DWASMEDGE_BUILD_SHARED_LIB=OFF \
-DWASMEDGE_BUILD_STATIC_LIB=ON \
-DWASMEDGE_BUILD_AOT_RUNTIME=ON \
-DWASMEDGE_FORCE_DISABLE_LTO=ON \
-DCMAKE_POSITION_INDEPENDENT_CODE=ON \
-DWASMEDGE_LINK_LLVM_STATIC=ON \
-DWASMEDGE_BUILD_PLUGINS=OFF \
-DWASMEDGE_LINK_TOOLS_STATIC=ON \
-DBoost_NO_BOOST_CMAKE=ON -DLLVM_DIR=/usr/lib64/llvm13/lib/cmake/llvm/ -DLLVM_LIBRARY_DIR=/usr/lib64/llvm13/lib/ &&
make -j$3 install &&
export PATH=`echo $PATH | sed -E "s/devtoolset-9/devtoolset-10/g"` &&
cp -r include/api/wasmedge /usr/include/ &&
cd /io/ &&
# Environment setup moved to Dockerfile in release-builder.sh
source /opt/rh/gcc-toolset-11/enable
export PATH=/usr/local/bin:$PATH
export CC='/usr/lib64/ccache/gcc' &&
export CXX='/usr/lib64/ccache/g++' &&
echo "-- Build Rippled --" &&
pwd &&
cp cmake/deps/Rocksdb.cmake cmake/deps/Rocksdb.cmake.old &&
echo "MOVING TO [ build-core.sh ]"
cd /io;
echo "MOVING TO [ build-core.sh ]";
printenv > .env.temp;
cat .env.temp | grep '=' | sed s/\\\(^[^=]\\+=\\\)/\\1\\\"/g|sed s/\$/\\\"/g > .env;

View File

@@ -1,7 +1,7 @@
#
# Default validators.txt
#
# This file is located in the same folder as your rippled.cfg file
# This file is located in the same folder as your xahaud.cfg file
# and defines which validators your server trusts not to collude.
#
# This file is UTF-8 with DOS, UNIX, or Mac style line endings.
@@ -17,18 +17,17 @@
# See validator_list_sites and validator_list_keys below.
#
# Examples:
# n9KorY8QtTdRx7TVDpwnG9NvyxsDwHUKUEeDLY3AkiGncVaSXZi5
# n9MqiExBcoG19UXwoLjBJnhsxEhAZMuWwJDRdkyDz1EkEkwzQTNt
# n9L3GdotB8a3AqtsvS7NXt4BUTQSAYyJUr9xtFj2qXJjfbZsawKY
# n9M7G6eLwQtUjfCthWUmTN8L4oEZn1sNr46yvKrpsq58K1C6LAxz
#
# [validator_list_sites]
#
# List of URIs serving lists of recommended validators.
#
# Examples:
# https://vl.ripple.com
# https://vl.xrplf.org
# https://vl.xahau.org
# http://127.0.0.1:8000
# file:///etc/opt/ripple/vl.txt
# file:///etc/opt/xahaud/vl.txt
#
# [validator_list_keys]
#
@@ -39,50 +38,66 @@
# Validator list keys should be hex-encoded.
#
# Examples:
# ED2677ABFFD1B33AC6FBC3062B71F1E8397C1505E1C42C64D11AD1B28FF73F4734
# ED307A760EE34F2D0CAA103377B1969117C38B8AA0AA1E2A24DAC1F32FC97087ED
# EDA46E9C39B1389894E690E58914DC1029602870370A0993E5B87C4A24EAF4A8E8
#
# [import_vl_keys]
#
# This section is used to import the public keys of trusted validator list publishers.
# The keys are used to authenticate and accept new lists of trusted validators.
# In this example, the key for the publisher "vl.xrplf.org" is imported.
# Each key is represented as a hexadecimal string.
#
# Examples:
# ED2677ABFFD1B33AC6FBC3062B71F1E8397C1505E1C42C64D11AD1B28FF73F4734
# ED45D1840EE724BE327ABE9146503D5848EFD5F38B6D5FEDE71E80ACCE5E6E738B
# ED42AEC58B701EEBB77356FFFEC26F83C1F0407263530F068C7C73D392C7E06FD1
# ED2677ABFFD1B33AC6FBC3062B71F1E8397C1505E1C42C64D11AD1B28FF73F4734
# The default validator list publishers that the rippled instance
# The default validator list publishers that the xahaud instance
# trusts.
#
# WARNING: Changing these values can cause your rippled instance to see a
# validated ledger that contradicts other rippled instances'
# WARNING: Changing these values can cause your xahaud instance to see a
# validated ledger that contradicts other xahaud instances'
# validated ledgers (aka a ledger fork) if your validator list(s)
# do not sufficiently overlap with the list(s) used by others.
# See: https://arxiv.org/pdf/1802.07242.pdf
[validator_list_sites]
https://vl.ripple.com
https://vl.xrplf.org
https://vl.xahau.org
[validator_list_keys]
#vl.ripple.com
ED2677ABFFD1B33AC6FBC3062B71F1E8397C1505E1C42C64D11AD1B28FF73F4734
# vl.xrplf.org
ED45D1840EE724BE327ABE9146503D5848EFD5F38B6D5FEDE71E80ACCE5E6E738B
# vl.xahau.org
EDA46E9C39B1389894E690E58914DC1029602870370A0993E5B87C4A24EAF4A8E8
[import_vl_keys]
# vl.xrplf.org
ED45D1840EE724BE327ABE9146503D5848EFD5F38B6D5FEDE71E80ACCE5E6E738B
ED42AEC58B701EEBB77356FFFEC26F83C1F0407263530F068C7C73D392C7E06FD1
ED2677ABFFD1B33AC6FBC3062B71F1E8397C1505E1C42C64D11AD1B28FF73F4734
# To use the test network (see https://xrpl.org/connect-your-rippled-to-the-xrp-test-net.html),
# To use the test network (see https://xahau.network/docs/infrastructure/installing-xahaud),
# use the following configuration instead:
#
# [validator_list_sites]
# https://vl.altnet.rippletest.net
#
# [validator_list_keys]
# ED264807102805220DA0F312E71FC2C69E1552C9C5790F6C25E3729DEB573D5860
# [validators]
# nHBoJCE3wPgkTcrNPMHyTJFQ2t77EyCAqcBRspFCpL6JhwCm94VZ
# nHUVv4g47bFMySAZFUKVaXUYEmfiUExSoY4FzwXULNwJRzju4XnQ
# nHBvr8avSFTz4TFxZvvi4rEJZZtyqE3J6KAAcVWVtifsE7edPM7q
# nHUH3Z8TRU57zetHbEPr1ynyrJhxQCwrJvNjr4j1SMjYADyW1WWe
#
# [import_vl_keys]
# ED264807102805220DA0F312E71FC2C69E1552C9C5790F6C25E3729DEB573D5860
# [validator_list_threshold]
#
# Minimum number of validator lists on which a validator must be listed in
# order to be used.
#
# This can be set explicitly to any positive integer number not greater than
# the size of [validator_list_keys]. If it is not set, or set to 0, the
# value will be calculated at startup from the size of [validator_list_keys],
# where the calculation is:
#
# threshold = size(validator_list_keys) < 3
# ? 1
# : floor(size(validator_list_keys) / 2) + 1
[validator_list_threshold]
0

View File

@@ -9,11 +9,11 @@
#
# 2. Peer Protocol
#
# 3. Ripple Protocol
# 3. XRPL Protocol
#
# 4. HTTPS Client
#
# 5. <vacated>
# 5. Reporting Mode
#
# 6. Database
#
@@ -29,18 +29,17 @@
#
# Purpose
#
# This file documents and provides examples of all rippled server process
# configuration options. When the rippled server instance is launched, it
# This file documents and provides examples of all xahaud server process
# configuration options. When the xahaud server instance is launched, it
# looks for a file with the following name:
#
# rippled.cfg
# xahaud.cfg
#
# For more information on where the rippled server instance searches for the
# file, visit:
# To run xahaud with a custom configuration file, use the "--conf {file}" flag.
# By default, xahaud will look in the local working directory or the home directory.
#
# https://xrpl.org/commandline-usage.html#generic-options
#
# This file should be named rippled.cfg. This file is UTF-8 with DOS, UNIX,
# This file should be named xahaud.cfg. This file is UTF-8 with DOS, UNIX,
# or Mac style end of lines. Blank lines and lines beginning with '#' are
# ignored. Undefined sections are reserved. No escapes are currently defined.
#
@@ -89,8 +88,8 @@
#
#
#
# rippled offers various server protocols to clients making inbound
# connections. The listening ports rippled uses are "universal" ports
# xahaud offers various server protocols to clients making inbound
# connections. The listening ports xahaud uses are "universal" ports
# which may be configured to handshake in one or more of the available
# supported protocols. These universal ports simplify administration:
# A single open port can be used for multiple protocols.
@@ -103,7 +102,7 @@
#
# A list of port names and key/value pairs. A port name must start with a
# letter and contain only letters and numbers. The name is not case-sensitive.
# For each name in this list, rippled will look for a configuration file
# For each name in this list, xahaud will look for a configuration file
# section with the same name and use it to create a listening port. The
# name is informational only; the choice of name does not affect the function
# of the listening port.
@@ -134,7 +133,7 @@
# ip = 127.0.0.1
# protocol = http
#
# When rippled is used as a command line client (for example, issuing a
# When xahaud is used as a command line client (for example, issuing a
# server stop command), the first port advertising the http or https
# protocol will be used to make the connection.
#
@@ -175,7 +174,7 @@
# same time. It is possible have both Websockets and Secure Websockets
# together in one port.
#
# NOTE If no ports support the peer protocol, rippled cannot
# NOTE If no ports support the peer protocol, xahaud cannot
# receive incoming peer connections or become a superpeer.
#
# limit = <number>
@@ -194,7 +193,7 @@
# required. IP address restrictions, if any, will be checked in addition
# to the credentials specified here.
#
# When acting in the client role, rippled will supply these credentials
# When acting in the client role, xahaud will supply these credentials
# using HTTP's Basic Authentication headers when making outbound HTTP/S
# requests.
#
@@ -237,7 +236,7 @@
# WS, or WSS protocol interfaces. If administrative commands are
# disabled for a port, these credentials have no effect.
#
# When acting in the client role, rippled will supply these credentials
# When acting in the client role, xahaud will supply these credentials
# in the submitted JSON for any administrative command requests when
# invoking JSON-RPC commands on remote servers.
#
@@ -258,7 +257,7 @@
# resource controls will default to those for non-administrative users.
#
# The secure_gateway IP addresses are intended to represent
# proxies. Since rippled trusts these hosts, they must be
# proxies. Since xahaud trusts these hosts, they must be
# responsible for properly authenticating the remote user.
#
# If some IP addresses are included for both "admin" and
@@ -272,7 +271,7 @@
# Use the specified files when configuring SSL on the port.
#
# NOTE If no files are specified and secure protocols are selected,
# rippled will generate an internal self-signed certificate.
# xahaud will generate an internal self-signed certificate.
#
# The files have these meanings:
#
@@ -283,26 +282,24 @@
# ssl_cert
#
# Specifies the path to the SSL certificate file in PEM format.
# This is not needed if the chain includes it. Use ssl_chain if
# your certificate includes one or more intermediates.
# This is not needed if the chain includes it.
#
# ssl_chain
#
# If you need a certificate chain, specify the path to the
# certificate chain here. The chain may include the end certificate.
# This must be used if the certificate includes intermediates.
#
# ssl_ciphers = <cipherlist>
#
# Control the ciphers which the server will support over SSL on the port,
# specified using the OpenSSL "cipher list format".
#
# NOTE If unspecified, rippled will automatically configure a modern
# NOTE If unspecified, xahaud will automatically configure a modern
# cipher suite. This default suite should be widely supported.
#
# You should not modify this string unless you have a specific
# reason and cryptographic expertise. Incorrect modification may
# keep rippled from connecting to other instances of rippled or
# keep xahaud from connecting to other instances of xahaud or
# prevent RPC and WebSocket clients from connecting.
#
# send_queue_limit = [1..65535]
@@ -353,7 +350,7 @@
#
# Examples:
# { "command" : "server_info" }
# { "command" : "log_level", "partition" : "ripplecalc", "severity" : "trace" }
# { "command" : "log_level", "partition" : "xahaudcalc", "severity" : "trace" }
#
#
#
@@ -382,31 +379,15 @@
#-----------------
#
# These settings control security and access attributes of the Peer to Peer
# server section of the rippled process. Peer Protocol implements the
# Ripple Payment protocol. It is over peer connections that transactions
# and validations are passed from machine to machine, to determine the
# contents of validated ledgers.
#
#
#
# [compression]
#
# true or false
#
# true - enables compression
# false - disables compression [default].
#
# The rippled server can save bandwidth by compressing its peer-to-peer communications,
# at a cost of greater CPU usage. If you enable link compression,
# the server automatically compresses communications with peer servers
# that also have link compression enabled.
# https://xrpl.org/enable-link-compression.html
# server section of the xahaud process. It is over peer connections that
# transactions and validations are passed from machine to machine, to
# determine the contents of validated ledgers.
#
#
#
# [ips]
#
# List of hostnames or ips where the Ripple protocol is served. A default
# List of hostnames or ips where the XRPL protocol is served. A default
# starter list is included in the code and used if no other hostnames are
# available.
#
@@ -415,24 +396,23 @@
# does not generally matter.
#
# The default list of entries is:
# - r.ripple.com 51235
# - zaphod.alloy.ee 51235
# - sahyadri.isrdc.in 51235
# - hubs.xahau.as16089.net 21337
# - bacab.alloy.ee 21337
#
# Examples:
#
# [ips]
# 192.168.0.1
# 192.168.0.1 2459
# r.ripple.com 51235
# 192.168.0.1 21337
# bacab.alloy.ee 21337
#
#
# [ips_fixed]
#
# List of IP addresses or hostnames to which rippled should always attempt to
# List of IP addresses or hostnames to which xahaud should always attempt to
# maintain peer connections. This is useful for manually forming private
# networks, for example to configure a validation server that connects to the
# Ripple network through a public-facing server, or for building a set
# Xahau Network through a public-facing server, or for building a set
# of cluster peers.
#
# One address or domain name per line is allowed. A port must be specified
@@ -478,6 +458,19 @@
#
#
#
# [sntp_servers]
#
# IP address or domain of NTP servers to use for time synchronization.
#
# These NTP servers are suitable for xahaud servers located in the United
# States:
# time.windows.com
# time.apple.com
# time.nist.gov
# pool.ntp.org
#
#
#
# [max_transactions]
#
# Configure the maximum number of transactions to have in the job queue
@@ -570,7 +563,7 @@
#
# minimum_txn_in_ledger_standalone = <number>
#
# Like minimum_txn_in_ledger when rippled is running in standalone
# Like minimum_txn_in_ledger when xahaud is running in standalone
# mode. Default: 1000.
#
# target_txn_in_ledger = <number>
@@ -707,7 +700,7 @@
#
# [validator_token]
#
# This is an alternative to [validation_seed] that allows rippled to perform
# This is an alternative to [validation_seed] that allows xahaud to perform
# validation without having to store the validator keys on the network
# connected server. The field should contain a single token in the form of a
# base64-encoded blob.
@@ -742,19 +735,18 @@
#
# Specify the file by its name or path.
# Unless an absolute path is specified, it will be considered relative to
# the folder in which the rippled.cfg file is located.
# the folder in which the xahaud.cfg file is located.
#
# Examples:
# /home/ripple/validators.txt
# C:/home/ripple/validators.txt
# /home/xahaud/validators.txt
# C:/home/xahaud/validators.txt
#
# Example content:
# [validators]
# n949f75evCHwgyP4fPVgaHqNHxUVN15PsJEZ3B3HnXPcPjcZAoy7
# n9MD5h24qrQqiyBC8aeqqCWvpiBiYQ3jxSr91uiDvmrkyHRdYLUj
# n9L81uNCaPgtUJfaHh89gmdvXKAmSt5Gdsw2g1iPWaPkAHW5Nm4C
# n9KiYM9CgngLvtRCQHZwgC2gjpdaZcCcbt3VboxiNFcKuwFVujzS
# n9LdgEtkmGB9E2h3K4Vp7iGUaKuq23Zr32ehxiU8FWY7xoxbWTSA
# n9L3GdotB8a3AqtsvS7NXt4BUTQSAYyJUr9xtFj2qXJjfbZsawKY
# n9LQDHLWyFuAn5BXJuW2ow5J9uGqpmSjRYS2cFRpxf6uJbxwDzvM
# n9MCWyKVUkiatXVJTKUrAESB5kBFP8R3hm43jGHtg8WBnjv3iDfb
# n9KWXCLRhjpajuZtULTXsy6R5xbisA6ozGxM4zdEJFq6uHiFZDvW
#
#
#
@@ -837,7 +829,7 @@
#
# 0: Disable the ledger replay feature [default]
# 1: Enable the ledger replay feature. With this feature enabled, when
# acquiring a ledger from the network, a rippled node only downloads
# acquiring a ledger from the network, a xahaud node only downloads
# the ledger header and the transactions instead of the whole ledger.
# And the ledger is built by applying the transactions to the parent
# ledger.
@@ -848,10 +840,9 @@
#
#----------------
#
# The rippled server instance uses HTTPS GET requests in a variety of
# The xahaud server instance uses HTTPS GET requests in a variety of
# circumstances, including but not limited to contacting trusted domains to
# fetch information such as mapping an email address to a Ripple Payment
# Network address.
# fetch information such as mapping an email address to a user's r address.
#
# [ssl_verify]
#
@@ -884,14 +875,133 @@
#
#-------------------------------------------------------------------------------
#
# 5. Reporting Mode
#
#------------
#
# xahaud has an optional operating mode called Reporting Mode. In Reporting
# Mode, xahaud does not connect to the peer to peer network. Instead, xahaud
# will continuously extract data from one or more xahaud servers that are
# connected to the peer to peer network (referred to as an ETL source).
# Reporting mode servers will forward RPC requests that require access to the
# peer to peer network (submit, fee, etc) to an ETL source.
#
# [reporting] Settings for Reporting Mode. If and only if this section is
# present, xahaud will start in reporting mode. This section
# contains a list of ETL source names, and key-value pairs. The
# ETL source names each correspond to a configuration file
# section; the names must match exactly. The key-value pairs are
# optional.
#
#
# [<name>]
#
# A series of key/value pairs that specify an ETL source.
#
# source_ip = <IP-address>
#
# Required. IP address of the ETL source. Can also be a DNS record.
#
# source_ws_port = <number>
#
# Required. Port on which ETL source is accepting unencrypted websocket
# connections.
#
# source_grpc_port = <number>
#
# Required for ETL. Port on which ETL source is accepting gRPC requests.
# If this option is omitted, this ETL source cannot actually be used for
# ETL; the Reporting Mode server can still forward RPCs to this ETL
# source, but cannot extract data from this ETL source.
#
#
# Key-value pairs (all optional):
#
# read_only Valid values: 0, 1. Default is 0. If set to 1, the server
# will start in strict read-only mode, and will not perform
# ETL. The server will still handle RPC requests, and will
# still forward RPC requests that require access to the p2p
# network.
#
# start_sequence
# Sequence of first ledger to extract if the database is empty.
# ETL extracts ledgers in order. If this setting is absent and
# the database is empty, ETL will start with the next ledger
# validated by the network. If this setting is present and the
# database is not empty, an exception is thrown.
#
# num_markers Degree of parallelism used during the initial ledger
# download. Only used if the database is empty. Valid values
# are 1-256. A higher degree of parallelism results in a
# faster download, but puts more load on the ETL source.
# Default is 2.
#
# Example:
#
# [reporting]
# etl_source1
# etl_source2
# read_only=0
# start_sequence=32570
# num_markers=8
#
# [etl_source1]
# source_ip=1.2.3.4
# source_ws_port=6005
# source_grpc_port=50051
#
# [etl_source2]
# source_ip=5.6.7.8
# source_ws_port=6005
# source_grpc_port=50051
#
# Minimal Example:
#
# [reporting]
# etl_source1
#
# [etl_source1]
# source_ip=1.2.3.4
# source_ws_port=6005
# source_grpc_port=50051
#
#
# Notes:
#
# Reporting Mode requires Postgres (instead of SQLite). The Postgres
# connection info is specified under the [ledger_tx_tables] config section;
# see the Database section for further documentation.
#
# Each ETL source specified must have gRPC enabled (by adding a [port_grpc]
# section to the config). It is recommended to add a secure_gateway entry to
# the gRPC section, in order to bypass the server's rate limiting.
# This section needs to be added to the config of the ETL source, not
# the config of the reporting node. In the example below, the
# reporting server is running at 127.0.0.1. Multiple IPs can be
# specified in secure_gateway via a comma separated list.
#
# [port_grpc]
# ip = 0.0.0.0
# port = 50051
# secure_gateway = 127.0.0.1
#
#
#-------------------------------------------------------------------------------
#
# 6. Database
#
#------------
#
# rippled creates 4 SQLite database to hold bookkeeping information
# xahaud creates 4 SQLite databases to hold bookkeeping information
# about transactions, local credentials, and various other things.
# It also creates the NodeDB, which holds all the objects that
# make up the current and historical ledgers.
# make up the current and historical ledgers. In Reporting Mode, xahaud
# uses a Postgres database instead of SQLite.
#
# The simplest way to work with Postgres is to install it locally.
# When it is running, execute the initdb.sh script in the current
# directory as: sudo -u postgres ./initdb.sh
# This will create the xahaud user and an empty database of the same name.
#
# The size of the NodeDB grows in proportion to the amount of new data and the
# amount of historical data (a configurable setting) so the performance of the
@@ -899,7 +1009,7 @@
# the performance of the server.
#
# Partial pathnames will be considered relative to the location of
# the rippled.cfg file.
# the xahaud.cfg file.
#
# [node_db] Settings for the Node Database (required)
#
@@ -910,18 +1020,18 @@
#
# Example:
# type=nudb
# path=db/nudb
# path=/opt/xahaud/db/nudb
#
# The "type" field must be present and controls the choice of backend:
#
# type = NuDB
#
# NuDB is a high-performance database written by Ripple Labs and optimized
# for rippled and solid-state drives.
# for xahaud and solid-state drives.
#
# NuDB maintains its high speed regardless of the amount of history
# stored. Online delete may be selected, but is not required. NuDB is
# available on all platforms that rippled runs on.
# available on all platforms that xahaud runs on.
#
# type = RocksDB
#
@@ -933,6 +1043,14 @@
# keeping full history is not advised, and using online delete is
# recommended.
#
# type = Cassandra
#
# Apache Cassandra is an open-source, distributed key-value store - see
# https://cassandra.apache.org/ for more details.
#
# Cassandra is an alternative backend to be used only with Reporting Mode.
# See the Reporting Mode section for more details about Reporting Mode.
#
# type = RWDB
#
# RWDB is a high-performance memory store written by XRPL-Labs and optimized
@@ -940,14 +1058,31 @@
# RWDB is recommended for Validator and Peer nodes that are not required to
# store history.
#
# RWDB maintains its high speed regardless of the amount of history
# stored. Online delete should NOT be used instead RWDB will use the
# ledger_history config value to determine how many ledgers to keep in memory.
#
# Required keys for NuDB, RWDB and RocksDB:
# Required keys for NuDB and RocksDB:
#
# path Location to store the database
#
# Required keys for RWDB:
#
# online_delete Required. RWDB stores data in memory and will
# grow unbounded without online_delete. See the
# online_delete section below.
#
# Required keys for Cassandra:
#
# contact_points IP of a node in the Cassandra cluster
#
# port CQL Native Transport Port
#
# secure_connect_bundle
# Absolute path to a secure connect bundle. When using
# a secure connect bundle, contact_points and port are
# not required.
#
# keyspace Name of Cassandra keyspace to use
#
# table_name Name of table in above keyspace to use
#
# Optional keys
#
# cache_size Size of cache for database records. Default is 16384.
@@ -964,7 +1099,7 @@
# default value for the unspecified parameter.
#
# Note: the cache will not be created if online_delete
# is specified.
# is specified, or if shards are used.
#
# fast_load Boolean. If set, load the last persisted ledger
# from disk upon process start before syncing to
@@ -972,18 +1107,60 @@
# if sufficient IOPS capacity is available.
# Default 0.
#
# Optional keys for NuDB or RocksDB:
#
# earliest_seq The default is 32570 to match the XRP ledger
# network's earliest allowed sequence. Alternate
# networks may set this value. Minimum value of 1.
# online_delete for RWDB, NuDB and RocksDB:
#
# online_delete Minimum value of 256. Enable automatic purging
# of older ledger information. Maintain at least this
# number of ledger records online. Must be greater
# than or equal to ledger_history. If using RWDB
# this value is ignored.
# than or equal to ledger_history.
#
# REQUIRED for RWDB to prevent out-of-memory errors.
# Optional for NuDB and RocksDB.
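#
# As an illustrative sketch only (values chosen for example purposes),
# an in-memory RWDB store with automatic pruning might be configured as:
#
# [node_db]
# type=RWDB
# online_delete=256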
#
# Optional keys for NuDB and RocksDB:
#
# earliest_seq The default is 32570 to match the XRP Ledger
# network's earliest allowed sequence. Alternate
# networks may set this value. Minimum value of 1.
# If a [shard_db] section is defined, and this
# value is present either [node_db] or [shard_db],
# it must be defined with the same value in both
# sections.
#
# Optional keys for NuDB only:
#
# nudb_block_size EXPERIMENTAL: Block size in bytes for NuDB storage.
# Must be a power of 2 between 4096 and 32768. Default is 4096.
#
# This parameter controls the fundamental storage unit
# size for NuDB's internal data structures. The choice
# of block size can significantly impact performance
# depending on your storage hardware and filesystem:
#
# - 4096 bytes: Optimal for most standard SSDs and
# traditional filesystems (ext4, NTFS, HFS+).
# Provides good balance of performance and storage
# efficiency. Recommended for most deployments.
#
# - 8192-16384 bytes: May improve performance on
# high-end NVMe SSDs and copy-on-write filesystems
# like ZFS or Btrfs that benefit from larger block
# alignment. Can reduce metadata overhead for large
# databases.
#
# - 32768 bytes (32K): Maximum supported block size
# for high-performance scenarios with very fast
# storage. May increase memory usage and reduce
# efficiency for smaller databases.
#
# Note: This setting cannot be changed after database
# creation without rebuilding the entire database.
# Choose carefully based on your hardware and expected
# database size.
#
# Example: nudb_block_size=4096
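#
# In context, a NuDB stanza using the default block size might look like
# the following (the path is illustrative; see the hardware notes above
# before choosing a larger size):
#
# [node_db]
# type=NuDB
# path=/opt/xahaud/db/nudb
# nudb_block_size=4096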
#
# These keys modify the behavior of online_delete, and thus are only
# relevant if online_delete is defined and non-zero:
#
@@ -1017,13 +1194,32 @@
#
# recovery_wait_seconds
# The online delete process checks periodically
# that rippled is still in sync with the network,
# that xahaud is still in sync with the network,
# and that the validated ledger is less than
# 'age_threshold_seconds' old. If not, then continue
# sleeping for this number of seconds and
# checking until healthy.
# Default is 5.
#
# Optional keys for Cassandra:
#
# username Username to use if Cassandra cluster requires
# authentication
#
# password Password to use if Cassandra cluster requires
# authentication
#
# max_requests_outstanding
# Limits the maximum number of concurrent database
# writes. Default is 10 million. For slower clusters,
# large numbers of concurrent writes can overload the
# cluster. Setting this option can help eliminate
# write timeouts and other write errors due to the
# cluster being overloaded.
# io_threads
# Set the number of IO threads used by the
# Cassandra driver. Defaults to 4.
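#
# Putting the Cassandra keys together, a sketch of a Reporting Mode
# [node_db] stanza might read as follows. The host, credentials,
# keyspace and table names are placeholders, and 9042 is simply the
# usual CQL native transport port:
#
# [node_db]
# type=Cassandra
# contact_points=192.0.2.20
# port=9042
# keyspace=xahaud
# table_name=objects
# username=cassandra_user
# password=cassandra_pass
# max_requests_outstanding=1000000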
#
# Notes:
# The 'node_db' entry configures the primary, persistent storage.
#
@@ -1037,8 +1233,34 @@
# The server creates and maintains 4 to 5 bookkeeping SQLite databases in
# the 'database_path' location. If you omit this configuration setting,
# the server creates a directory called "db" located in the same place as
# your rippled.cfg file.
# Partial pathnames are relative to the location of the rippled executable.
# your xahaud.cfg file.
# Partial pathnames are relative to the location of the xahaud executable.
#
# [shard_db] Settings for the Shard Database (optional)
#
# Format (without spaces):
# One or more lines of case-insensitive key / value pairs:
# <key> '=' <value>
# ...
#
# Example:
# path=db/shards/nudb
#
# Required keys:
# path Location to store the database
#
# Optional keys:
# max_historical_shards
# The maximum number of historical shards
# to store.
#
# [historical_shard_paths] Additional storage paths for the Shard Database (optional)
#
# Format (without spaces):
# One or more lines, each expressing a full path for storing historical shards:
# /mnt/disk1
# /mnt/disk2
# ...
#
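# A combined sketch (the paths and shard count are placeholders):
#
# [shard_db]
# path=/opt/xahaud/db/shards/nudb
# max_historical_shards=50
#
# [historical_shard_paths]
# /mnt/disk1
# /mnt/disk2
#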
# [sqlite] Tuning settings for the SQLite databases (optional)
#
@@ -1088,7 +1310,7 @@
# The default is "wal", which uses a write-ahead
# log to implement database transactions.
# Alternately, "memory" saves disk I/O, but if
# rippled crashes during a transaction, the
# xahaud crashes during a transaction, the
# database is likely to be corrupted.
# See https://www.sqlite.org/pragma.html#pragma_journal_mode
# for more details about the available options.
@@ -1098,7 +1320,7 @@
# synchronous Valid values: off, normal, full, extra
# The default is "normal", which works well with
# the "wal" journal mode. Alternatively, "off"
# allows rippled to continue as soon as data is
# allows xahaud to continue as soon as data is
# passed to the OS, which can significantly
# increase speed, but risks data corruption if
# the host computer crashes before writing that
@@ -1112,25 +1334,47 @@
# The default is "file", which will use files
# for temporary database tables and indices.
# Alternatively, "memory" may save I/O, but
# rippled does not currently use many, if any,
# xahaud does not currently use many, if any,
# of these temporary objects.
# See https://www.sqlite.org/pragma.html#pragma_temp_store
# for more details about the available options.
# This setting may not be combined with the
# "safety_level" setting.
#
# page_size Valid values: integer (MUST be power of 2 between 512 and 65536)
# The default is 4096 bytes. This setting determines
# the size of a page in the transaction.db file.
# See https://www.sqlite.org/pragma.html#pragma_page_size
# for more details about the available options.
# [ledger_tx_tables] (optional)
#
# journal_size_limit Valid values: integer
# The default is 1582080. This setting limits
# the size of the journal for the transaction.db file. When the limit is
# reached, older entries will be deleted.
# See https://www.sqlite.org/pragma.html#pragma_journal_size_limit
# for more details about the available options.
# conninfo Info for connecting to Postgres. Format is
# postgres://[username]:[password]@[ip]/[database].
# The database and user must already exist. If this
# section is missing and xahaud is running in
# Reporting Mode, xahaud will connect as the
# user running xahaud to a database with the
# same name. On Linux and Mac OS X, the connection
# will take place using the server's UNIX domain
# socket. On Windows, through the localhost IP
# address. Default is empty.
#
# use_tx_tables Valid values: 1, 0
# The default is 1 (true). Determines whether to use
# the SQLite transaction database. If set to 0,
# xahaud will not write to the transaction database,
# and will reject tx, account_tx and tx_history RPCs.
# In Reporting Mode, this setting is ignored.
#
# max_connections Valid values: any positive integer up to 64 bit
# storage length. This configures the maximum
# number of concurrent connections to postgres.
# Default is the maximum possible value to
# fit in a 64 bit integer.
#
# timeout Number of seconds after which idle postgres
# connections are disconnected. If set to 0,
# connections never time out. Default is 600.
#
#
# remember_ip Valid values: 1, 0
# Default is 1 (true). Whether to cache host and
# port connection settings.
#
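# For illustration, a Reporting Mode server might use a stanza like the
# following (the connection string and limits are placeholders):
#
# [ledger_tx_tables]
# conninfo = postgres://xahaud:secret@127.0.0.1/xahaud
# use_tx_tables = 1
# max_connections = 16
# timeout = 600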
#
#-------------------------------------------------------------------------------
@@ -1141,7 +1385,7 @@
#
# These settings are designed to help server administrators diagnose
# problems, and obtain detailed information about the activities being
# performed by the rippled process.
# performed by the xahaud process.
#
#
#
@@ -1158,7 +1402,7 @@
#
# Configuration parameters for the Beast.Insight stats collection module.
#
# Insight is a module that collects information from the areas of rippled
# Insight is a module that collects information from the areas of xahaud
# that have instrumentation. The configuration parameters control where the
# collection metrics are sent. The parameters are expressed as key = value
# pairs with no white space. The main parameter is the choice of server:
@@ -1167,7 +1411,7 @@
#
# Choice of server to send metrics to. Currently the only choice is
# "statsd" which sends UDP packets to a StatsD daemon, which must be
# running while rippled is running. More information on StatsD is
# running while xahaud is running. More information on StatsD is
# available here:
# https://github.com/b/statsd_spec
#
@@ -1177,7 +1421,7 @@
# in the format, n.n.n.n:port.
#
# "prefix" A string prepended to each collected metric. This is used
# to distinguish between different running instances of rippled.
# to distinguish between different running instances of xahaud.
#
# If this section is missing, or the server type is unspecified or unknown,
# statistics are not collected or reported.
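#
# For example (the StatsD address and the prefix are placeholders):
#
# [insight]
# server=statsd
# address=127.0.0.1:8125
# prefix=xahaud01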
@@ -1204,7 +1448,7 @@
#
# Example:
# [perf]
# perf_log=/var/log/rippled/perf.log
# perf_log=/var/log/xahaud/perf.log
# log_interval=2
#
#-------------------------------------------------------------------------------
@@ -1213,8 +1457,8 @@
#
#----------
#
# The vote settings configure settings for the entire Ripple network.
# While a single instance of rippled cannot unilaterally enforce network-wide
# The vote settings configure settings for the entire Xahau Network.
# While a single instance of xahaud cannot unilaterally enforce network-wide
# settings, these choices become part of the instance's vote during the
# consensus process for each voting ledger.
#
@@ -1226,9 +1470,9 @@
#
# The cost of the reference transaction fee, specified in drops.
# The reference transaction is the simplest form of transaction.
# It represents an XRP payment between two parties.
# It represents an XAH payment between two parties.
#
# If this parameter is unspecified, rippled will use an internal
# If this parameter is unspecified, xahaud will use an internal
# default. Don't change this without understanding the consequences.
#
# Example:
@@ -1237,26 +1481,26 @@
# account_reserve = <drops>
#
# The account reserve requirement is specified in drops. The portion of an
# account's XRP balance that is at or below the reserve may only be
# account's XAH balance that is at or below the reserve may only be
# spent on transaction fees, and not transferred out of the account.
#
# If this parameter is unspecified, rippled will use an internal
# If this parameter is unspecified, xahaud will use an internal
# default. Don't change this without understanding the consequences.
#
# Example:
# account_reserve = 10000000 # 10 XRP
# account_reserve = 10000000 # 10 XAH
#
# owner_reserve = <drops>
#
# The owner reserve is the amount of XRP reserved in the account for
# The owner reserve is the amount of XAH reserved in the account for
# each ledger item owned by the account. Ledger items an account may
# own include trust lines, open orders, and tickets.
#
# If this parameter is unspecified, rippled will use an internal
# If this parameter is unspecified, xahaud will use an internal
# default. Don't change this without understanding the consequences.
#
# Example:
# owner_reserve = 2000000 # 2 XRP
# owner_reserve = 2000000 # 2 XAH
#
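# Taken together, and assuming the stanza keeps its upstream name of
# [voting], the example values above would be written as:
#
# [voting]
# account_reserve = 10000000 # 10 XAH
# owner_reserve = 2000000 # 2 XAH
#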
#-------------------------------------------------------------------------------
#
@@ -1294,7 +1538,7 @@
# tool instead.
#
# This flag has no effect on the "sign" and "sign_for" command line options
# that rippled makes available.
# that xahaud makes available.
#
# The default value of this field is "false"
#
@@ -1373,7 +1617,7 @@
#--------------------
#
# Administrators can use these values as a starting point for configuring
# their instance of rippled, but each value should be checked to make sure
# their instance of xahaud, but each value should be checked to make sure
# it meets the business requirements for the organization.
#
# Server
@@ -1383,7 +1627,7 @@
# "peer"
#
# Peer protocol open to everyone. This is required to accept
# incoming rippled connections. This does not affect automatic
# incoming xahaud connections. This does not affect automatic
# or manual outgoing Peer protocol connections.
#
# "rpc"
@@ -1396,12 +1640,6 @@
# Admin level API commands over Secure Websockets, when originating
# from the same machine (via the loopback adapter at 127.0.0.1).
#
# "grpc"
#
# ETL commands for Clio. We recommend setting secure_gateway
# in this section to a comma-separated list of the addresses
# of your Clio servers, in order to bypass rippled's rate limiting.
#
# This port is commented out but can be enabled by removing
# the '#' from each corresponding line including the entry under [server]
#
@@ -1417,8 +1655,8 @@
# NOTE
#
# To accept connections on well known ports such as 80 (HTTP) or
# 443 (HTTPS), most operating systems will require rippled to
# run with administrator privileges, or else rippled will not start.
# 443 (HTTPS), most operating systems will require xahaud to
# run with administrator privileges, or else xahaud will not start.
[server]
port_rpc_admin_local
@@ -1429,24 +1667,23 @@ port_ws_admin_local
#ssl_cert = /etc/ssl/certs/server.crt
[port_rpc_admin_local]
port = 5005
port = 5009
ip = 127.0.0.1
admin = 127.0.0.1
protocol = http
[port_peer]
port = 51235
port = 21337
ip = 0.0.0.0
# alternatively, to accept connections on IPv4 + IPv6, use:
#ip = ::
protocol = peer
[port_ws_admin_local]
port = 6006
port = 6009
ip = 127.0.0.1
admin = 127.0.0.1
protocol = ws
send_queue_limit = 500
[port_grpc]
port = 50051
@@ -1454,16 +1691,15 @@ ip = 127.0.0.1
secure_gateway = 127.0.0.1
#[port_ws_public]
#port = 6005
#port = 6008
#ip = 127.0.0.1
#protocol = wss
#send_queue_limit = 500
#-------------------------------------------------------------------------------
# This is primary persistent datastore for rippled. This includes transaction
# This is the primary persistent datastore for xahaud. This includes transaction
# metadata, account states, and ledger headers. Helpful information can be
# found at https://xrpl.org/capacity-planning.html#node-db-type
# found at https://xahau.network/docs/infrastructure/system-requirements
# type=NuDB is recommended for non-validators with fast SSDs. Validators or
# nodes with slow / spinning disks should use RocksDB. Caution: Spinning disks are
# not recommended. They do not perform well enough to consistently remain
@@ -1474,32 +1710,59 @@ secure_gateway = 127.0.0.1
# when the node has approximately two times the "online_delete" value of
# ledgers. No external administrative command is required to initiate
# deletion.
[ledger_history]
256
[node_db]
type=NuDB
path=/var/lib/rippled/db/nudb
online_delete=512
path=/opt/xahaud/db/nudb
online_delete=256
advisory_delete=0
[database_path]
/var/lib/rippled/db
/opt/xahaud/db
# To use Postgres, uncomment this section and fill in the appropriate connection
# info. Postgres can only be used in Reporting Mode.
# To disable writing to the transaction database, uncomment this section, and
# set use_tx_tables=0
# [ledger_tx_tables]
# conninfo = postgres://[username]:[password]@[ip]/[database]
# use_tx_tables=1
# This needs to be an absolute directory reference, not a relative one.
# Modify this value as required.
[debug_logfile]
/var/log/rippled/debug.log
/var/log/xahaud/debug.log
# To use the XRP test network
# (see https://xrpl.org/connect-your-rippled-to-the-xrp-test-net.html),
[sntp_servers]
time.windows.com
time.apple.com
time.nist.gov
pool.ntp.org
# Use the following [ips] section for the main network:
[ips]
bacab.alloy.ee 21337
hubs.xahau.as16089.net 21337
# To use the Xahau Test Network
# (see https://xahau.network/docs/infrastructure/installing-xahaud),
# use the following [ips] section:
# [ips]
# r.altnet.rippletest.net 51235
# 79.110.60.121 21338
# 79.110.60.122 21338
# 79.110.60.124 21338
# 79.110.60.125 21338
# File containing trusted validator keys or validator list publishers.
# Unless an absolute path is specified, it will be considered relative to the
# folder in which the rippled.cfg file is located.
# folder in which the xahaud.cfg file is located.
[validators_file]
validators.txt
validators-xahau.txt
# Turn down default logging to save disk space in the long run.
# Valid values here are trace, debug, info, warning, error, and fatal
@@ -1511,3 +1774,22 @@ validators.txt
# set ssl_verify to 0.
[ssl_verify]
1
# Define which network xahaud is connecting to
# 21337 for the Main Xahau Network
# 21338 for the Test Xahau Network
[network_id]
21337
# 21338
# To run in Reporting Mode, uncomment this section and fill in the appropriate
# connection info for one or more ETL sources.
# [reporting]
# etl_source
#
#
# [etl_source]
# source_grpc_port=50051
# source_ws_port=6005
# source_ip=127.0.0.1

View File

@@ -1,4 +1,4 @@
# standalone: ./rippled -a --ledgerfile config/genesis.json --conf config/rippled-standalone.cfg
# standalone: ./xahaud -a --ledgerfile config/genesis.json --conf config/xahaud-standalone.cfg
[server]
port_rpc_admin_local
port_ws_public
@@ -21,7 +21,7 @@ ip = 0.0.0.0
protocol = ws
# [port_peer]
# port = 51235
# port = 21337
# ip = 0.0.0.0
# protocol = peer
@@ -69,7 +69,8 @@ time.nist.gov
pool.ntp.org
[ips]
r.ripple.com 51235
bacab.alloy.ee 21337
hubs.xahau.as16089.net 21337
[validators_file]
validators-example.txt
@@ -94,7 +95,7 @@ validators-example.txt
1000000
[network_id]
21338
21337
[amendments]
740352F2412A9909880C23A559FCECEDA3BE2126FED62FC7660D628A06927F11 Flow

View File

@@ -17,7 +17,9 @@ target_compile_features (common INTERFACE cxx_std_20)
target_compile_definitions (common
INTERFACE
$<$<CONFIG:Debug>:DEBUG _DEBUG>
$<$<AND:$<BOOL:${profile}>,$<NOT:$<BOOL:${assert}>>>:NDEBUG>)
$<$<AND:$<BOOL:${profile}>,$<NOT:$<BOOL:${assert}>>>:NDEBUG>
# TODO: Remove once we have migrated functions from OpenSSL 1.x to 3.x.
OPENSSL_SUPPRESS_DEPRECATED)
# ^^^^ NOTE: CMAKE release builds already have NDEBUG
# defined, so no need to add it explicitly except for
# this special case of (profile ON) and (assert OFF)
@@ -120,6 +122,7 @@ else ()
target_link_libraries (common
INTERFACE
-rdynamic
$<$<BOOL:${is_linux}>:-Wl,-z,relro,-z,now,--build-id>
# link to static libc/c++ iff:
# * static option set and
# * NOT APPLE (AppleClang does not support static libc/c++) and
@@ -130,6 +133,17 @@ else ()
>)
endif ()
# Antithesis instrumentation will only be built and deployed using machines running Linux.
if (voidstar)
if (NOT CMAKE_BUILD_TYPE STREQUAL "Debug")
message(FATAL_ERROR "Antithesis instrumentation requires Debug build type, aborting...")
elseif (NOT is_linux)
message(FATAL_ERROR "Antithesis instrumentation requires Linux, aborting...")
elseif (NOT (is_clang AND CMAKE_CXX_COMPILER_VERSION VERSION_GREATER_EQUAL 16.0))
message(FATAL_ERROR "Antithesis instrumentation requires Clang version 16 or later, aborting...")
endif ()
endif ()
if (use_mold)
# use mold linker if available
execute_process (
@@ -137,6 +151,16 @@ if (use_mold)
ERROR_QUIET OUTPUT_VARIABLE LD_VERSION)
if ("${LD_VERSION}" MATCHES "mold")
target_link_libraries (common INTERFACE -fuse-ld=mold)
else ()
# Checking for mold linker (< GCC 12.1.0)
execute_process (
COMMAND ${CMAKE_CXX_COMPILER} -B/usr/libexec/mold -Wl,--version
OUTPUT_VARIABLE LD_VERSION_OUT
ERROR_VARIABLE LD_VERSION_ERR)
set(LD_VERSION "${LD_VERSION_OUT}${LD_VERSION_ERR}")
if ("${LD_VERSION}" MATCHES "mold")
target_link_libraries (common INTERFACE -B/usr/libexec/mold)
endif ()
endif ()
unset (LD_VERSION)
elseif (use_gold AND is_gcc)

View File

@@ -9,6 +9,7 @@ include(target_protobuf_sources)
# define a bunch of `static const` variables with the same names,
# so we just build them as a separate library.
add_library(xrpl.libpb)
set_target_properties(xrpl.libpb PROPERTIES UNITY_BUILD OFF)
target_protobuf_sources(xrpl.libpb xrpl/proto
LANGUAGE cpp
IMPORT_DIRS include/xrpl/proto
@@ -47,11 +48,74 @@ target_link_libraries(xrpl.libpb
gRPC::grpc++
)
# TODO: Clean up the number of library targets later.
add_library(xrpl.imports.main INTERFACE)
target_link_libraries(xrpl.imports.main
INTERFACE
LibArchive::LibArchive
OpenSSL::Crypto
Ripple::boost
wasmedge::wasmedge
Ripple::opts
Ripple::syslibs
absl::random_random
date::date
ed25519::ed25519
secp256k1::secp256k1
xrpl.libpb
xxHash::xxhash
$<$<BOOL:${voidstar}>:antithesis-sdk-cpp>
)
include(add_module)
include(target_link_modules)
# Level 01
add_module(xrpl beast)
target_link_libraries(xrpl.libxrpl.beast PUBLIC
xrpl.imports.main
xrpl.libpb
)
# Conditionally add enhanced logging source when BEAST_ENHANCED_LOGGING is enabled
if(DEFINED BEAST_ENHANCED_LOGGING AND BEAST_ENHANCED_LOGGING)
target_sources(xrpl.libxrpl.beast PRIVATE
src/libxrpl/beast/utility/src/beast_EnhancedLogging.cpp)
endif()
# Level 02
add_module(xrpl basics)
target_link_libraries(xrpl.libxrpl.basics PUBLIC xrpl.libxrpl.beast)
# Level 03
add_module(xrpl json)
target_link_libraries(xrpl.libxrpl.json PUBLIC xrpl.libxrpl.basics)
add_module(xrpl crypto)
target_link_libraries(xrpl.libxrpl.crypto PUBLIC xrpl.libxrpl.basics)
add_module(xrpl hook)
target_link_libraries(xrpl.libxrpl.hook PUBLIC xrpl.libxrpl.basics)
# Level 04
add_module(xrpl protocol)
target_link_libraries(xrpl.libxrpl.protocol PUBLIC
xrpl.libxrpl.crypto
xrpl.libxrpl.hook
xrpl.libxrpl.json
)
# Level 05
add_module(xrpl resource)
target_link_libraries(xrpl.libxrpl.resource PUBLIC xrpl.libxrpl.protocol)
add_module(xrpl server)
target_link_libraries(xrpl.libxrpl.server PUBLIC xrpl.libxrpl.protocol)
add_library(xrpl.libxrpl)
set_target_properties(xrpl.libxrpl PROPERTIES OUTPUT_NAME xrpl)
if(unity)
set_target_properties(xrpl.libxrpl PROPERTIES UNITY_BUILD ON)
endif()
# Try to find the ACL library
find_library(ACL_LIBRARY NAMES acl)
@@ -70,43 +134,28 @@ file(GLOB_RECURSE sources CONFIGURE_DEPENDS
)
target_sources(xrpl.libxrpl PRIVATE ${sources})
target_include_directories(xrpl.libxrpl
PUBLIC
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
$<INSTALL_INTERFACE:include>)
target_compile_definitions(xrpl.libxrpl
PUBLIC
BOOST_ASIO_USE_TS_EXECUTOR_AS_DEFAULT
BOOST_CONTAINER_FWD_BAD_DEQUE
HAS_UNCAUGHT_EXCEPTIONS=1)
target_compile_options(xrpl.libxrpl
PUBLIC
$<$<BOOL:${is_gcc}>:-Wno-maybe-uninitialized>
target_link_modules(xrpl PUBLIC
basics
beast
crypto
hook
json
protocol
resource
server
)
target_link_libraries(xrpl.libxrpl
PUBLIC
LibArchive::LibArchive
OpenSSL::Crypto
Ripple::boost
wasmedge::wasmedge
Ripple::opts
Ripple::syslibs
absl::random_random
date::date
ed25519::ed25519
secp256k1::secp256k1
xrpl.libpb
xxHash::xxhash
)
# All headers in libxrpl are in modules.
# Uncomment this stanza if you have not yet moved new headers into a module.
# target_include_directories(xrpl.libxrpl
# PRIVATE
# $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
# PUBLIC
# $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
# $<INSTALL_INTERFACE:include>)
if(xrpld)
add_executable(rippled)
if(unity)
set_target_properties(rippled PROPERTIES UNITY_BUILD ON)
endif()
if(tests)
target_compile_definitions(rippled PUBLIC ENABLE_TESTS)
endif()
@@ -132,6 +181,11 @@ if(xrpld)
Ripple::opts
Ripple::libs
xrpl.libxrpl
# Workaround for a Conan 1.x bug that prevents static linking of libstdc++
# when a dependency (snappy) modifies system_libs. See the comment in
# external/snappy/conanfile.py for a full explanation.
# This is likely not strictly necessary, but listed explicitly as a good practice.
m
)
exclude_if_included(rippled)
# define a macro for tests that might need to
@@ -140,6 +194,19 @@ if(xrpld)
target_compile_definitions(rippled PRIVATE RIPPLED_RUNNING_IN_CI)
endif ()
if(voidstar)
target_compile_options(rippled
PRIVATE
-fsanitize-coverage=trace-pc-guard
)
# rippled requires access to antithesis-sdk-cpp implementation file
# antithesis_instrumentation.h, which is not exported as INTERFACE
target_include_directories(rippled
PRIVATE
${CMAKE_SOURCE_DIR}/external/antithesis-sdk
)
endif()
# any files that don't play well with unity should be added here
if(tests)
set_source_files_properties(

View File

@@ -2,14 +2,26 @@
install stuff
#]===================================================================]
include(create_symbolic_link)
install (
TARGETS
common
opts
ripple_syslibs
ripple_boost
xrpl.imports.main
xrpl.libpb
xrpl.libxrpl.basics
xrpl.libxrpl.beast
xrpl.libxrpl.hook
xrpl.libxrpl.crypto
xrpl.libxrpl.json
xrpl.libxrpl.protocol
xrpl.libxrpl.resource
xrpl.libxrpl.server
xrpl.libxrpl
antithesis-sdk-cpp
EXPORT RippleExports
LIBRARY DESTINATION lib
ARCHIVE DESTINATION lib
@@ -21,12 +33,12 @@ install(
DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}"
)
if(NOT WIN32)
install(
CODE "file(CREATE_LINK xrpl \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}/ripple SYMBOLIC)"
)
endif()
install(CODE "
set(CMAKE_MODULE_PATH \"${CMAKE_MODULE_PATH}\")
include(create_symbolic_link)
create_symbolic_link(xrpl \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}/ripple)
")
install (EXPORT RippleExports
FILE RippleTargets.cmake
@@ -55,12 +67,12 @@ if (is_root_project AND TARGET rippled)
copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/rippled-example.cfg\" etc rippled.cfg)
copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/validators-example.txt\" etc validators.txt)
")
if(NOT WIN32)
install(
CODE "file(CREATE_LINK rippled${suffix} \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_BINDIR}/xrpld${suffix} SYMBOLIC)"
)
endif()
install(CODE "
set(CMAKE_MODULE_PATH \"${CMAKE_MODULE_PATH}\")
include(create_symbolic_link)
create_symbolic_link(rippled${suffix} \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_BINDIR}/xrpld${suffix})
")
endif ()
install (

View File

@@ -7,6 +7,9 @@ add_library (Ripple::opts ALIAS opts)
target_compile_definitions (opts
INTERFACE
BOOST_ASIO_DISABLE_HANDLER_TYPE_REQUIREMENTS
BOOST_ASIO_USE_TS_EXECUTOR_AS_DEFAULT
BOOST_CONTAINER_FWD_BAD_DEQUE
HAS_UNCAUGHT_EXCEPTIONS=1
$<$<BOOL:${boost_show_deprecated}>:
BOOST_ASIO_NO_DEPRECATED
BOOST_FILESYSTEM_NO_DEPRECATED
@@ -18,10 +21,12 @@ target_compile_definitions (opts
>
$<$<BOOL:${beast_no_unit_test_inline}>:BEAST_NO_UNIT_TEST_INLINE=1>
$<$<BOOL:${beast_disable_autolink}>:BEAST_DONT_AUTOLINK_TO_WIN32_LIBRARIES=1>
$<$<BOOL:${single_io_service_thread}>:RIPPLE_SINGLE_IO_SERVICE_THREAD=1>)
$<$<BOOL:${single_io_service_thread}>:RIPPLE_SINGLE_IO_SERVICE_THREAD=1>
$<$<BOOL:${voidstar}>:ENABLE_VOIDSTAR>)
target_compile_options (opts
INTERFACE
$<$<AND:$<BOOL:${is_gcc}>,$<COMPILE_LANGUAGE:CXX>>:-Wsuggest-override>
$<$<BOOL:${is_gcc}>:-Wno-maybe-uninitialized>
$<$<BOOL:${perf}>:-fno-omit-frame-pointer>
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-g --coverage -fprofile-abs-path>
$<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-g --coverage>

View File

@@ -1,33 +0,0 @@
#[===================================================================[
NIH prefix path..this is where we will download
and build any ExternalProjects, and they will hopefully
survive across build directory deletion (manual cleans)
#]===================================================================]
string (REGEX REPLACE "[ \\/%]+" "_" gen_for_path ${CMAKE_GENERATOR})
string (TOLOWER ${gen_for_path} gen_for_path)
# HACK: trying to shorten paths for windows CI (which hits 260 MAXPATH easily)
# @see: https://issues.jenkins-ci.org/browse/JENKINS-38706?focusedCommentId=339847
string (REPLACE "visual_studio" "vs" gen_for_path ${gen_for_path})
if (NOT DEFINED NIH_CACHE_ROOT)
if (DEFINED ENV{NIH_CACHE_ROOT})
set (NIH_CACHE_ROOT $ENV{NIH_CACHE_ROOT})
else ()
set (NIH_CACHE_ROOT "${CMAKE_CURRENT_SOURCE_DIR}/.nih_c")
endif ()
endif ()
set (nih_cache_path
"${NIH_CACHE_ROOT}/${gen_for_path}/${CMAKE_CXX_COMPILER_ID}_${CMAKE_CXX_COMPILER_VERSION}")
if (NOT is_multiconfig)
set (nih_cache_path "${nih_cache_path}/${CMAKE_BUILD_TYPE}")
endif ()
file(TO_CMAKE_PATH "${nih_cache_path}" nih_cache_path)
message (STATUS "NIH-EP cache path: ${nih_cache_path}")
## two convenience variables:
set (ep_lib_prefix ${CMAKE_STATIC_LIBRARY_PREFIX})
set (ep_lib_suffix ${CMAKE_STATIC_LIBRARY_SUFFIX})
# this is a setting for FetchContent and needs to be
# a cache variable
# https://cmake.org/cmake/help/latest/module/FetchContent.html#populating-the-content
set (FETCHCONTENT_BASE_DIR ${nih_cache_path} CACHE STRING "" FORCE)

View File

@@ -17,6 +17,10 @@ if(unity)
if(NOT is_ci)
set(CMAKE_UNITY_BUILD_BATCH_SIZE 15 CACHE STRING "")
endif()
set(CMAKE_UNITY_BUILD ON CACHE BOOL "Do a unity build")
endif()
if(is_clang AND is_linux)
option(voidstar "Enable Antithesis instrumentation." OFF)
endif()
if(is_gcc OR is_clang)
option(coverage "Generates coverage info." OFF)

37
cmake/add_module.cmake Normal file
View File

@@ -0,0 +1,37 @@
include(isolate_headers)
# Create an OBJECT library target named
#
# ${PROJECT_NAME}.lib${parent}.${name}
#
# with sources in src/lib${parent}/${name}
# and headers in include/${parent}/${name}
# that cannot include headers from other directories in include/
# unless they come through linked libraries.
#
# add_module(parent a)
# add_module(parent b)
# target_link_libraries(project.libparent.b PUBLIC project.libparent.a)
function(add_module parent name)
set(target ${PROJECT_NAME}.lib${parent}.${name})
add_library(${target} OBJECT)
file(GLOB_RECURSE sources CONFIGURE_DEPENDS
"${CMAKE_CURRENT_SOURCE_DIR}/src/lib${parent}/${name}/*.cpp"
)
target_sources(${target} PRIVATE ${sources})
target_include_directories(${target} PUBLIC
"$<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}>"
)
isolate_headers(
${target}
"${CMAKE_CURRENT_SOURCE_DIR}/include"
"${CMAKE_CURRENT_SOURCE_DIR}/include/${parent}/${name}"
PUBLIC
)
isolate_headers(
${target}
"${CMAKE_CURRENT_SOURCE_DIR}/src"
"${CMAKE_CURRENT_SOURCE_DIR}/src/lib${parent}/${name}"
PRIVATE
)
endfunction()

View File

@@ -1,54 +0,0 @@
find_package(Boost 1.83 REQUIRED
COMPONENTS
chrono
container
context
coroutine
date_time
filesystem
json
program_options
regex
system
thread
)
add_library(ripple_boost INTERFACE)
add_library(Ripple::boost ALIAS ripple_boost)
if(XCODE)
target_include_directories(ripple_boost BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
target_compile_options(ripple_boost INTERFACE --system-header-prefix="boost/")
else()
target_include_directories(ripple_boost SYSTEM BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
endif()
target_link_libraries(ripple_boost
INTERFACE
Boost::boost
Boost::chrono
Boost::container
Boost::coroutine
Boost::date_time
Boost::filesystem
Boost::json
Boost::program_options
Boost::regex
Boost::system
Boost::iostreams
Boost::thread)
if(Boost_COMPILER)
target_link_libraries(ripple_boost INTERFACE Boost::disable_autolinking)
endif()
if(san AND is_clang)
# TODO: gcc does not support -fsanitize-blacklist...can we do something else
# for gcc ?
if(NOT Boost_INCLUDE_DIRS AND TARGET Boost::headers)
get_target_property(Boost_INCLUDE_DIRS Boost::headers INTERFACE_INCLUDE_DIRECTORIES)
endif()
message(STATUS "Adding [${Boost_INCLUDE_DIRS}] to sanitizer blacklist")
file(WRITE ${CMAKE_CURRENT_BINARY_DIR}/san_bl.txt "src:${Boost_INCLUDE_DIRS}/*")
target_compile_options(opts
INTERFACE
# ignore boost headers for sanitizing
-fsanitize-blacklist=${CMAKE_CURRENT_BINARY_DIR}/san_bl.txt)
endif()

View File

@@ -0,0 +1,20 @@
# file(CREATE_SYMLINK) only works on Windows with administrator privileges.
# https://stackoverflow.com/a/61244115/618906
function(create_symbolic_link target link)
if(WIN32)
if(NOT IS_SYMLINK "${link}")
if(NOT IS_ABSOLUTE "${target}")
# Relative links do not work on Windows.
set(target "${link}/../${target}")
endif()
file(TO_NATIVE_PATH "${target}" target)
file(TO_NATIVE_PATH "${link}" link)
execute_process(COMMAND cmd.exe /c mklink /J "${link}" "${target}")
endif()
else()
file(CREATE_LINK "${target}" "${link}" SYMBOLIC)
endif()
if(NOT IS_SYMLINK "${link}")
message(ERROR "failed to create symlink: <${link}>")
endif()
endfunction()

View File

@@ -1,52 +1,4 @@
#[===================================================================[
NIH dep: boost
#]===================================================================]
if((NOT DEFINED BOOST_ROOT) AND(DEFINED ENV{BOOST_ROOT}))
set(BOOST_ROOT $ENV{BOOST_ROOT})
endif()
if((NOT DEFINED BOOST_LIBRARYDIR) AND(DEFINED ENV{BOOST_LIBRARYDIR}))
set(BOOST_LIBRARYDIR $ENV{BOOST_LIBRARYDIR})
endif()
file(TO_CMAKE_PATH "${BOOST_ROOT}" BOOST_ROOT)
if(WIN32 OR CYGWIN)
# Workaround for MSVC having two boost versions - x86 and x64 on same PC in stage folders
if((NOT DEFINED BOOST_LIBRARYDIR) AND (DEFINED BOOST_ROOT))
if(IS_DIRECTORY ${BOOST_ROOT}/stage64/lib)
set(BOOST_LIBRARYDIR ${BOOST_ROOT}/stage64/lib)
elseif(IS_DIRECTORY ${BOOST_ROOT}/stage/lib)
set(BOOST_LIBRARYDIR ${BOOST_ROOT}/stage/lib)
elseif(IS_DIRECTORY ${BOOST_ROOT}/lib)
set(BOOST_LIBRARYDIR ${BOOST_ROOT}/lib)
else()
message(WARNING "Did not find expected boost library dir. "
"Defaulting to ${BOOST_ROOT}")
set(BOOST_LIBRARYDIR ${BOOST_ROOT})
endif()
endif()
endif()
message(STATUS "BOOST_ROOT: ${BOOST_ROOT}")
message(STATUS "BOOST_LIBRARYDIR: ${BOOST_LIBRARYDIR}")
# uncomment the following as needed to debug FindBoost issues:
#set(Boost_DEBUG ON)
#[=========================================================[
boost dynamic libraries don't trivially support @rpath
linking right now (cmake's default), so just force
static linking for macos, or if requested on linux by flag
#]=========================================================]
if(static)
set(Boost_USE_STATIC_LIBS ON)
endif()
set(Boost_USE_MULTITHREADED ON)
if(static AND NOT APPLE)
set(Boost_USE_STATIC_RUNTIME ON)
else()
set(Boost_USE_STATIC_RUNTIME OFF)
endif()
# TBD:
# Boost_USE_DEBUG_RUNTIME: When ON, uses Boost libraries linked against the
find_package(Boost 1.86 REQUIRED
find_package(Boost 1.83 REQUIRED
COMPONENTS
chrono
container
@@ -58,12 +10,12 @@ find_package(Boost 1.86 REQUIRED
program_options
regex
system
iostreams
thread)
thread
)
add_library(ripple_boost INTERFACE)
add_library(Ripple::boost ALIAS ripple_boost)
if(is_xcode)
if(XCODE)
target_include_directories(ripple_boost BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
target_compile_options(ripple_boost INTERFACE --system-header-prefix="boost/")
else()
@@ -83,6 +35,7 @@ target_link_libraries(ripple_boost
Boost::program_options
Boost::regex
Boost::system
Boost::iostreams
Boost::thread)
if(Boost_COMPILER)
target_link_libraries(ripple_boost INTERFACE Boost::disable_autolinking)

View File

@@ -1,28 +0,0 @@
#[===================================================================[
NIH dep: ed25519-donna
#]===================================================================]
add_library (ed25519-donna STATIC
external/ed25519-donna/ed25519.c)
target_include_directories (ed25519-donna
PUBLIC
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/external>
$<INSTALL_INTERFACE:include>
PRIVATE
${CMAKE_CURRENT_SOURCE_DIR}/external/ed25519-donna)
#[=========================================================[
NOTE for macos:
https://github.com/floodyberry/ed25519-donna/issues/29
our source for ed25519-donna-portable.h has been
patched to workaround this.
#]=========================================================]
target_link_libraries (ed25519-donna PUBLIC OpenSSL::SSL)
add_library (NIH::ed25519-donna ALIAS ed25519-donna)
target_link_libraries (ripple_libs INTERFACE NIH::ed25519-donna)
#[===========================[
headers installation
#]===========================]
install (
FILES
external/ed25519-donna/ed25519.h
DESTINATION include/ed25519-donna)

File diff suppressed because it is too large

View File

@@ -1,47 +0,0 @@
# - Try to find jemalloc
# Once done this will define
# JEMALLOC_FOUND - System has jemalloc
# JEMALLOC_INCLUDE_DIRS - The jemalloc include directories
# JEMALLOC_LIBRARIES - The libraries needed to use jemalloc
if(NOT USE_BUNDLED_JEMALLOC)
find_package(PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_check_modules(PC_JEMALLOC QUIET jemalloc)
endif()
else()
set(PC_JEMALLOC_INCLUDEDIR)
set(PC_JEMALLOC_INCLUDE_DIRS)
set(PC_JEMALLOC_LIBDIR)
set(PC_JEMALLOC_LIBRARY_DIRS)
set(LIMIT_SEARCH NO_DEFAULT_PATH)
endif()
set(JEMALLOC_DEFINITIONS ${PC_JEMALLOC_CFLAGS_OTHER})
find_path(JEMALLOC_INCLUDE_DIR jemalloc/jemalloc.h
PATHS ${PC_JEMALLOC_INCLUDEDIR} ${PC_JEMALLOC_INCLUDE_DIRS}
${LIMIT_SEARCH})
# If we're asked to use static linkage, add libjemalloc.a as a preferred library name.
if(JEMALLOC_USE_STATIC)
list(APPEND JEMALLOC_NAMES
"${CMAKE_STATIC_LIBRARY_PREFIX}jemalloc${CMAKE_STATIC_LIBRARY_SUFFIX}")
endif()
list(APPEND JEMALLOC_NAMES jemalloc)
find_library(JEMALLOC_LIBRARY NAMES ${JEMALLOC_NAMES}
HINTS ${PC_JEMALLOC_LIBDIR} ${PC_JEMALLOC_LIBRARY_DIRS}
${LIMIT_SEARCH})
set(JEMALLOC_LIBRARIES ${JEMALLOC_LIBRARY})
set(JEMALLOC_INCLUDE_DIRS ${JEMALLOC_INCLUDE_DIR})
include(FindPackageHandleStandardArgs)
# handle the QUIETLY and REQUIRED arguments and set JEMALLOC_FOUND to TRUE
# if all listed variables are TRUE
find_package_handle_standard_args(JeMalloc DEFAULT_MSG
JEMALLOC_LIBRARY JEMALLOC_INCLUDE_DIR)
mark_as_advanced(JEMALLOC_INCLUDE_DIR JEMALLOC_LIBRARY)

View File

@@ -1,22 +0,0 @@
find_package (PkgConfig REQUIRED)
pkg_search_module (libarchive_PC QUIET libarchive>=3.4.3)
if(static)
set(LIBARCHIVE_LIB libarchive.a)
else()
set(LIBARCHIVE_LIB archive)
endif()
find_library (archive
NAMES ${LIBARCHIVE_LIB}
HINTS
${libarchive_PC_LIBDIR}
${libarchive_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (LIBARCHIVE_INCLUDE_DIR
NAMES archive.h
HINTS
${libarchive_PC_INCLUDEDIR}
${libarchive_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)

View File

@@ -1,24 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (lz4_PC QUIET liblz4>=1.9)
endif ()
if(static)
set(LZ4_LIB liblz4.a)
else()
set(LZ4_LIB lz4.so)
endif()
find_library (lz4
NAMES ${LZ4_LIB}
HINTS
${lz4_PC_LIBDIR}
${lz4_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (LZ4_INCLUDE_DIR
NAMES lz4.h
HINTS
${lz4_PC_INCLUDEDIR}
${lz4_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)

View File

@@ -1,24 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (secp256k1_PC QUIET libsecp256k1)
endif ()
if(static)
set(SECP256K1_LIB libsecp256k1.a)
else()
set(SECP256K1_LIB secp256k1)
endif()
find_library(secp256k1
NAMES ${SECP256K1_LIB}
HINTS
${secp256k1_PC_LIBDIR}
${secp256k1_PC_LIBRARY_PATHS}
NO_DEFAULT_PATH)
find_path (SECP256K1_INCLUDE_DIR
NAMES secp256k1.h
HINTS
${secp256k1_PC_INCLUDEDIR}
${secp256k1_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)

View File

@@ -1,24 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (snappy_PC QUIET snappy>=1.1.7)
endif ()
if(static)
set(SNAPPY_LIB libsnappy.a)
else()
set(SNAPPY_LIB libsnappy.so)
endif()
find_library (snappy
NAMES ${SNAPPY_LIB}
HINTS
${snappy_PC_LIBDIR}
${snappy_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (SNAPPY_INCLUDE_DIR
NAMES snappy.h
HINTS
${snappy_PC_INCLUDEDIR}
${snappy_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)

View File

@@ -1,19 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
# TBD - currently no soci pkgconfig
#pkg_search_module (soci_PC QUIET libsoci_core>=3.2)
endif ()
if(static)
set(SOCI_LIB libsoci.a)
else()
set(SOCI_LIB libsoci_core.so)
endif()
find_library (soci
NAMES ${SOCI_LIB})
find_path (SOCI_INCLUDE_DIR
NAMES soci/soci.h)
message("SOCI FOUND AT: ${SOCI_LIB}")

View File

@@ -1,24 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (sqlite_PC QUIET sqlite3>=3.26.0)
endif ()
if(static)
set(SQLITE_LIB libsqlite3.a)
else()
set(SQLITE_LIB sqlite3.so)
endif()
find_library (sqlite3
NAMES ${SQLITE_LIB}
HINTS
${sqlite_PC_LIBDIR}
${sqlite_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (SQLITE_INCLUDE_DIR
NAMES sqlite3.h
HINTS
${sqlite_PC_INCLUDEDIR}
${sqlite_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)

View File

@@ -1,163 +0,0 @@
#[===================================================================[
NIH dep: libarchive
#]===================================================================]
option (local_libarchive "use local build of libarchive." OFF)
add_library (archive_lib UNKNOWN IMPORTED GLOBAL)
if (NOT local_libarchive)
if (NOT WIN32)
find_package(libarchive_pc REQUIRED)
endif ()
if (archive)
message (STATUS "Found libarchive using pkg-config. Using ${archive}.")
set_target_properties (archive_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${archive}
IMPORTED_LOCATION_RELEASE
${archive}
INTERFACE_INCLUDE_DIRECTORIES
${LIBARCHIVE_INCLUDE_DIR})
# pkg-config can return extra info for static lib linking
# this is probably needed/useful generally, but apply
# to APPLE for now (mostly for homebrew)
if (APPLE AND static AND libarchive_PC_STATIC_LIBRARIES)
message(STATUS "NOTE: libarchive static libs: ${libarchive_PC_STATIC_LIBRARIES}")
# also, APPLE seems to need iconv...maybe linux does too (TBD)
target_link_libraries (archive_lib
INTERFACE iconv ${libarchive_PC_STATIC_LIBRARIES})
endif ()
else ()
## now try searching using the minimal find module that cmake provides
find_package(LibArchive 3.4.3 QUIET)
if (LibArchive_FOUND)
if (static)
# find module doesn't find static libs currently, so we re-search
get_filename_component(_loc ${LibArchive_LIBRARY} DIRECTORY)
find_library(_la_static
NAMES libarchive.a archive_static.lib archive.lib
PATHS ${_loc})
if (_la_static)
set (_la_lib ${_la_static})
else ()
message (WARNING "unable to find libarchive static lib - switching to local build")
set (local_libarchive ON CACHE BOOL "" FORCE)
endif ()
else ()
set (_la_lib ${LibArchive_LIBRARY})
endif ()
if (NOT local_libarchive)
message (STATUS "Found libarchive using module/config. Using ${_la_lib}.")
set_target_properties (archive_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${_la_lib}
IMPORTED_LOCATION_RELEASE
${_la_lib}
INTERFACE_INCLUDE_DIRECTORIES
${LibArchive_INCLUDE_DIRS})
endif ()
else ()
set (local_libarchive ON CACHE BOOL "" FORCE)
endif ()
endif ()
endif()
if (local_libarchive)
set (lib_post "")
if (MSVC)
set (lib_post "_static")
endif ()
ExternalProject_Add (libarchive
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/libarchive/libarchive.git
GIT_TAG v3.4.3
CMAKE_ARGS
# passing the compiler seems to be needed for windows CI, sadly
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DENABLE_LZ4=ON
-ULZ4_*
-DLZ4_INCLUDE_DIR=$<JOIN:$<TARGET_PROPERTY:lz4_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
# because we are building a static lib, this lz4 library doesn't
# actually matter since you can't generally link static libs to other static
# libs. The include files are needed, but the library itself is not (until
# we link our application, at which point we use the lz4 we built above).
# nonetheless, we need to provide a library to libarchive else it will
# NOT include lz4 support when configuring
-DLZ4_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_RELEASE>>
-DENABLE_WERROR=OFF
-DENABLE_TAR=OFF
-DENABLE_TAR_SHARED=OFF
-DENABLE_INSTALL=ON
-DENABLE_NETTLE=OFF
-DENABLE_OPENSSL=OFF
-DENABLE_LZO=OFF
-DENABLE_LZMA=OFF
-DENABLE_ZLIB=OFF
-DENABLE_BZip2=OFF
-DENABLE_LIBXML2=OFF
-DENABLE_EXPAT=OFF
-DENABLE_PCREPOSIX=OFF
-DENABLE_LibGCC=OFF
-DENABLE_CNG=OFF
-DENABLE_CPIO=OFF
-DENABLE_CPIO_SHARED=OFF
-DENABLE_CAT=OFF
-DENABLE_CAT_SHARED=OFF
-DENABLE_XATTR=OFF
-DENABLE_ACL=OFF
-DENABLE_ICONV=OFF
-DENABLE_TEST=OFF
-DENABLE_COVERAGE=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LIST_SEPARATOR ::
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--target archive_static
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/libarchive/$<CONFIG>/${ep_lib_prefix}archive${lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/libarchive
>
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS lz4_lib
BUILD_BYPRODUCTS
<BINARY_DIR>/libarchive/${ep_lib_prefix}archive${lib_post}${ep_lib_suffix}
<BINARY_DIR>/libarchive/${ep_lib_prefix}archive${lib_post}_d${ep_lib_suffix}
)
ExternalProject_Get_Property (libarchive BINARY_DIR)
ExternalProject_Get_Property (libarchive SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (libarchive)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/libarchive)
set_target_properties (archive_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/libarchive/${ep_lib_prefix}archive${lib_post}_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/libarchive/${ep_lib_prefix}archive${lib_post}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/libarchive
INTERFACE_COMPILE_DEFINITIONS
LIBARCHIVE_STATIC)
endif()
add_dependencies (archive_lib libarchive)
target_link_libraries (archive_lib INTERFACE lz4_lib)
target_link_libraries (ripple_libs INTERFACE archive_lib)
exclude_if_included (libarchive)
exclude_if_included (archive_lib)

View File

@@ -1,79 +0,0 @@
#[===================================================================[
NIH dep: lz4
#]===================================================================]
add_library (lz4_lib STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(lz4)
endif()
if(lz4)
set_target_properties (lz4_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${lz4}
IMPORTED_LOCATION_RELEASE
${lz4}
INTERFACE_INCLUDE_DIRECTORIES
${LZ4_INCLUDE_DIR})
else()
ExternalProject_Add (lz4
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/lz4/lz4.git
GIT_TAG v1.9.2
SOURCE_SUBDIR contrib/cmake_unofficial
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DBUILD_STATIC_LIBS=ON
-DBUILD_SHARED_LIBS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--target lz4_static
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}lz4$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
TEST_COMMAND ""
INSTALL_COMMAND ""
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}lz4${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}lz4_d${ep_lib_suffix}
)
ExternalProject_Get_Property (lz4 BINARY_DIR)
ExternalProject_Get_Property (lz4 SOURCE_DIR)
file (MAKE_DIRECTORY ${SOURCE_DIR}/lz4)
set_target_properties (lz4_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}lz4_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}lz4${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/lib)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (lz4)
endif ()
add_dependencies (lz4_lib lz4)
target_link_libraries (ripple_libs INTERFACE lz4_lib)
exclude_if_included (lz4)
endif()
exclude_if_included (lz4_lib)

View File

@@ -1,31 +0,0 @@
#[===================================================================[
NIH dep: nudb
NuDB is header-only, thus is an INTERFACE lib in CMake.
TODO: move the library definition into NuDB repo and add
proper targets and export/install
#]===================================================================]
if (is_root_project) # NuDB not needed in the case of xrpl_core inclusion build
add_library (nudb INTERFACE)
FetchContent_Declare(
nudb_src
GIT_REPOSITORY https://github.com/CPPAlliance/NuDB.git
GIT_TAG 2.0.5
)
FetchContent_GetProperties(nudb_src)
if(NOT nudb_src_POPULATED)
message (STATUS "Pausing to download NuDB...")
FetchContent_Populate(nudb_src)
endif()
file(TO_CMAKE_PATH "${nudb_src_SOURCE_DIR}" nudb_src_SOURCE_DIR)
# specify as system includes so as to avoid warnings
target_include_directories (nudb SYSTEM INTERFACE ${nudb_src_SOURCE_DIR}/include)
target_link_libraries (nudb
INTERFACE
Boost::thread
Boost::system)
add_library (NIH::nudb ALIAS nudb)
target_link_libraries (ripple_libs INTERFACE NIH::nudb)
endif ()

View File

@@ -1,48 +0,0 @@
#[===================================================================[
NIH dep: openssl
#]===================================================================]
#[===============================================[
OPENSSL_ROOT_DIR is the only variable that
FindOpenSSL honors for locating, so convert any
OPENSSL_ROOT vars to this
#]===============================================]
if (NOT DEFINED OPENSSL_ROOT_DIR)
if (DEFINED ENV{OPENSSL_ROOT})
set (OPENSSL_ROOT_DIR $ENV{OPENSSL_ROOT})
elseif (HOMEBREW)
execute_process (COMMAND ${HOMEBREW} --prefix openssl
OUTPUT_VARIABLE OPENSSL_ROOT_DIR
OUTPUT_STRIP_TRAILING_WHITESPACE)
endif ()
file (TO_CMAKE_PATH "${OPENSSL_ROOT_DIR}" OPENSSL_ROOT_DIR)
endif ()
if (static)
set (OPENSSL_USE_STATIC_LIBS ON)
endif ()
set (OPENSSL_MSVC_STATIC_RT ON)
find_package (OpenSSL 1.1.1 REQUIRED)
target_link_libraries (ripple_libs
INTERFACE
OpenSSL::SSL
OpenSSL::Crypto)
# disable SSLv2...this can also be done when building/configuring OpenSSL
set_target_properties(OpenSSL::SSL PROPERTIES
INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2)
#[=========================================================[
https://gitlab.kitware.com/cmake/cmake/issues/16885
depending on how openssl is built, it might depend
on zlib. In fact, the openssl find package should
figure this out for us, but it does not currently...
so let's add zlib ourselves to the lib list
TODO: investigate linking to static zlib for static
build option
#]=========================================================]
find_package (ZLIB)
set (has_zlib FALSE)
if (TARGET ZLIB::ZLIB)
set_target_properties(OpenSSL::Crypto PROPERTIES
INTERFACE_LINK_LIBRARIES ZLIB::ZLIB)
set (has_zlib TRUE)
endif ()

View File

@@ -1,70 +0,0 @@
if(reporting)
find_package(PostgreSQL)
if(NOT PostgreSQL_FOUND)
message("find_package did not find postgres")
find_library(postgres NAMES pq libpq libpq-dev pq-dev postgresql-devel)
find_path(libpq-fe NAMES libpq-fe.h PATH_SUFFIXES postgresql pgsql include)
if(NOT libpq-fe_FOUND OR NOT postgres_FOUND)
message("No system installed Postgres found. Will build")
add_library(postgres SHARED IMPORTED GLOBAL)
add_library(pgport SHARED IMPORTED GLOBAL)
add_library(pgcommon SHARED IMPORTED GLOBAL)
ExternalProject_Add(postgres_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/postgres/postgres.git
GIT_TAG REL_14_5
CONFIGURE_COMMAND ./configure --without-readline > /dev/null
BUILD_COMMAND ${CMAKE_COMMAND} -E env --unset=MAKELEVEL make
UPDATE_COMMAND ""
BUILD_IN_SOURCE 1
INSTALL_COMMAND ""
BUILD_BYPRODUCTS
<BINARY_DIR>/src/interfaces/libpq/${ep_lib_prefix}pq.a
<BINARY_DIR>/src/common/${ep_lib_prefix}pgcommon.a
<BINARY_DIR>/src/port/${ep_lib_prefix}pgport.a
LOG_BUILD TRUE
)
ExternalProject_Get_Property (postgres_src SOURCE_DIR)
ExternalProject_Get_Property (postgres_src BINARY_DIR)
set (postgres_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${postgres_src_SOURCE_DIR})
list(APPEND INCLUDE_DIRS
${SOURCE_DIR}/src/include
${SOURCE_DIR}/src/interfaces/libpq
)
set_target_properties(postgres PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/src/interfaces/libpq/${ep_lib_prefix}pq.a
INTERFACE_INCLUDE_DIRECTORIES
"${INCLUDE_DIRS}"
)
set_target_properties(pgcommon PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/src/common/${ep_lib_prefix}pgcommon.a
INTERFACE_INCLUDE_DIRECTORIES
"${INCLUDE_DIRS}"
)
set_target_properties(pgport PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/src/port/${ep_lib_prefix}pgport.a
INTERFACE_INCLUDE_DIRECTORIES
"${INCLUDE_DIRS}"
)
add_dependencies(postgres postgres_src)
add_dependencies(pgcommon postgres_src)
add_dependencies(pgport postgres_src)
file(TO_CMAKE_PATH "${postgres_src_SOURCE_DIR}" postgres_src_SOURCE_DIR)
target_link_libraries(ripple_libs INTERFACE postgres pgcommon pgport)
else()
message("Found system installed Postgres via find_libary")
target_include_directories(ripple_libs INTERFACE ${libpq-fe})
target_link_libraries(ripple_libs INTERFACE ${postgres})
endif()
else()
message("Found system installed Postgres via find_package")
target_include_directories(ripple_libs INTERFACE ${PostgreSQL_INCLUDE_DIRS})
target_link_libraries(ripple_libs INTERFACE ${PostgreSQL_LIBRARIES})
endif()
endif()

View File

@@ -1,156 +0,0 @@
#[===================================================================[
import protobuf (lib and compiler) and create a lib
from our proto message definitions. If the system protobuf
is not found, fallback on EP to download and build a version
from official source.
#]===================================================================]
if (static)
set (Protobuf_USE_STATIC_LIBS ON)
endif ()
find_package (Protobuf 3.8)
if (is_multiconfig)
set(protobuf_protoc_lib ${Protobuf_PROTOC_LIBRARIES})
else ()
string(TOUPPER ${CMAKE_BUILD_TYPE} upper_cmake_build_type)
set(protobuf_protoc_lib ${Protobuf_PROTOC_LIBRARY_${upper_cmake_build_type}})
endif ()
if (local_protobuf OR NOT (Protobuf_FOUND AND Protobuf_PROTOC_EXECUTABLE AND protobuf_protoc_lib))
include (GNUInstallDirs)
message (STATUS "using local protobuf build.")
set(protobuf_reqs Protobuf_PROTOC_EXECUTABLE protobuf_protoc_lib)
foreach(lib ${protobuf_reqs})
if(NOT ${lib})
message(STATUS "Couldn't find ${lib}")
endif()
endforeach()
if (WIN32)
# protobuf prepends lib even on windows
set (pbuf_lib_pre "lib")
else ()
set (pbuf_lib_pre ${ep_lib_prefix})
endif ()
# for the external project build of protobuf, we currently ignore the
# static option and always build static libs here. This is consistent
# with our other EP builds. Dynamic libs in an EP would add complexity
# because we'd need to get them into the runtime path, and probably
# install them.
ExternalProject_Add (protobuf_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/protocolbuffers/protobuf.git
GIT_TAG v3.8.0
SOURCE_SUBDIR cmake
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
-DCMAKE_INSTALL_PREFIX=<BINARY_DIR>/_installed_
-Dprotobuf_BUILD_TESTS=OFF
-Dprotobuf_BUILD_EXAMPLES=OFF
-Dprotobuf_BUILD_PROTOC_BINARIES=ON
-Dprotobuf_MSVC_STATIC_RUNTIME=ON
-DBUILD_SHARED_LIBS=OFF
-Dprotobuf_BUILD_SHARED_LIBS=OFF
-DCMAKE_DEBUG_POSTFIX=_d
-Dprotobuf_DEBUG_POSTFIX=_d
-Dprotobuf_WITH_ZLIB=$<IF:$<BOOL:${has_zlib}>,ON,OFF>
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E env --unset=DESTDIR ${CMAKE_COMMAND} --build . --config $<CONFIG> --target install
BUILD_BYPRODUCTS
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf${ep_lib_suffix}
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf_d${ep_lib_suffix}
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc${ep_lib_suffix}
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc_d${ep_lib_suffix}
<BINARY_DIR>/_installed_/bin/protoc${CMAKE_EXECUTABLE_SUFFIX}
)
ExternalProject_Get_Property (protobuf_src BINARY_DIR)
ExternalProject_Get_Property (protobuf_src SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (protobuf_src)
endif ()
exclude_if_included (protobuf_src)
if (NOT TARGET protobuf::libprotobuf)
add_library (protobuf::libprotobuf STATIC IMPORTED GLOBAL)
endif ()
file (MAKE_DIRECTORY ${BINARY_DIR}/_installed_/include)
set_target_properties (protobuf::libprotobuf PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (protobuf::libprotobuf protobuf_src)
exclude_if_included (protobuf::libprotobuf)
if (NOT TARGET protobuf::libprotoc)
add_library (protobuf::libprotoc STATIC IMPORTED GLOBAL)
endif ()
set_target_properties (protobuf::libprotoc PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (protobuf::libprotoc protobuf_src)
exclude_if_included (protobuf::libprotoc)
if (NOT TARGET protobuf::protoc)
add_executable (protobuf::protoc IMPORTED)
exclude_if_included (protobuf::protoc)
endif ()
set_target_properties (protobuf::protoc PROPERTIES
IMPORTED_LOCATION "${BINARY_DIR}/_installed_/bin/protoc${CMAKE_EXECUTABLE_SUFFIX}")
add_dependencies (protobuf::protoc protobuf_src)
else ()
if (NOT TARGET protobuf::protoc)
if (EXISTS "${Protobuf_PROTOC_EXECUTABLE}")
add_executable (protobuf::protoc IMPORTED)
set_target_properties (protobuf::protoc PROPERTIES
IMPORTED_LOCATION "${Protobuf_PROTOC_EXECUTABLE}")
else ()
message (FATAL_ERROR "Protobuf import failed")
endif ()
endif ()
endif ()
set(output_dir ${CMAKE_BINARY_DIR}/proto_gen)
file(MAKE_DIRECTORY ${output_dir})
set(ccbd ${CMAKE_CURRENT_BINARY_DIR})
set(CMAKE_CURRENT_BINARY_DIR ${output_dir})
protobuf_generate_cpp(PROTO_SRCS PROTO_HDRS src/ripple/proto/ripple.proto)
set(CMAKE_CURRENT_BINARY_DIR ${ccbd})
target_include_directories(xrpl_core SYSTEM PUBLIC
# The generated implementation imports the header relative to the output
# directory.
$<BUILD_INTERFACE:${output_dir}>
$<BUILD_INTERFACE:${output_dir}/src>
)
target_sources(xrpl_core PRIVATE ${output_dir}/src/ripple/proto/ripple.pb.cc)
install(
FILES ${output_dir}/src/ripple/proto/ripple.pb.h
DESTINATION include/ripple/proto)
target_link_libraries(xrpl_core PUBLIC protobuf::libprotobuf)
target_compile_options(xrpl_core
PUBLIC
$<$<BOOL:${is_xcode}>:
--system-header-prefix="google/protobuf"
-Wno-deprecated-dynamic-exception-spec
>
)

View File

@@ -1,177 +0,0 @@
#[===================================================================[
NIH dep: rocksdb
#]===================================================================]
add_library (rocksdb_lib UNKNOWN IMPORTED GLOBAL)
set_target_properties (rocksdb_lib
PROPERTIES INTERFACE_COMPILE_DEFINITIONS RIPPLE_ROCKSDB_AVAILABLE=1)
option (local_rocksdb "use local build of rocksdb." OFF)
if (NOT local_rocksdb)
find_package (RocksDB 6.27 QUIET CONFIG)
if (TARGET RocksDB::rocksdb)
message (STATUS "Found RocksDB using config.")
get_target_property (_rockslib_l RocksDB::rocksdb IMPORTED_LOCATION_DEBUG)
if (_rockslib_l)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_DEBUG ${_rockslib_l})
endif ()
get_target_property (_rockslib_l RocksDB::rocksdb IMPORTED_LOCATION_RELEASE)
if (_rockslib_l)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_RELEASE ${_rockslib_l})
endif ()
get_target_property (_rockslib_l RocksDB::rocksdb IMPORTED_LOCATION)
if (_rockslib_l)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION ${_rockslib_l})
endif ()
get_target_property (_rockslib_i RocksDB::rocksdb INTERFACE_INCLUDE_DIRECTORIES)
if (_rockslib_i)
set_target_properties (rocksdb_lib PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${_rockslib_i})
endif ()
target_link_libraries (ripple_libs INTERFACE RocksDB::rocksdb)
else ()
# using a find module with rocksdb is difficult because
# you have no idea how it was configured (transitive dependencies).
# the code below will generally find rocksdb using the module, but
# will then result in linker errors for static linkage since the
# transitive dependencies are unknown. force local build here for now, but leave the code as
# a placeholder for future investigation.
if (static)
set (local_rocksdb ON CACHE BOOL "" FORCE)
# TBD if there is some way to extract transitive deps..then:
#set (RocksDB_USE_STATIC ON)
else ()
find_package (RocksDB 6.27 MODULE)
if (ROCKSDB_FOUND)
if (RocksDB_LIBRARY_DEBUG)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_DEBUG ${RocksDB_LIBRARY_DEBUG})
endif ()
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_RELEASE ${RocksDB_LIBRARIES})
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION ${RocksDB_LIBRARIES})
set_target_properties (rocksdb_lib PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${RocksDB_INCLUDE_DIRS})
else ()
set (local_rocksdb ON CACHE BOOL "" FORCE)
endif ()
endif ()
endif ()
endif ()
if (local_rocksdb)
message (STATUS "Using local build of RocksDB.")
ExternalProject_Add (rocksdb
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/facebook/rocksdb.git
GIT_TAG v6.27.3
PATCH_COMMAND
# only used by windows build
${CMAKE_COMMAND} -E copy_if_different
${CMAKE_CURRENT_SOURCE_DIR}/cmake/rocks_thirdparty.inc
<SOURCE_DIR>/thirdparty.inc
COMMAND
# fixup their build version file to keep the values
# from changing always
${CMAKE_COMMAND} -E copy_if_different
${CMAKE_CURRENT_SOURCE_DIR}/cmake/rocksdb_build_version.cc.in
<SOURCE_DIR>/util/build_version.cc.in
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DBUILD_SHARED_LIBS=OFF
-DCMAKE_POSITION_INDEPENDENT_CODE=ON
-DWITH_JEMALLOC=$<IF:$<BOOL:${jemalloc}>,ON,OFF>
-DWITH_SNAPPY=ON
-DWITH_LZ4=ON
-DWITH_ZLIB=OFF
-DUSE_RTTI=ON
-DWITH_ZSTD=OFF
-DWITH_GFLAGS=OFF
-DWITH_BZ2=OFF
-ULZ4_*
-Ulz4_*
-Dlz4_INCLUDE_DIRS=$<JOIN:$<TARGET_PROPERTY:lz4_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
-Dlz4_LIBRARIES=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_RELEASE>>
-Dlz4_FOUND=ON
-USNAPPY_*
-Usnappy_*
-USnappy_*
-Dsnappy_INCLUDE_DIRS=$<JOIN:$<TARGET_PROPERTY:snappy_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
-Dsnappy_LIBRARIES=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_RELEASE>>
-Dsnappy_FOUND=ON
-DSnappy_INCLUDE_DIRS=$<JOIN:$<TARGET_PROPERTY:snappy_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
-DSnappy_LIBRARIES=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_RELEASE>>
-DSnappy_FOUND=ON
-DWITH_MD_LIBRARY=OFF
-DWITH_RUNTIME_DEBUG=$<IF:$<CONFIG:Debug>,ON,OFF>
-DFAIL_ON_WARNINGS=OFF
-DWITH_ASAN=OFF
-DWITH_TSAN=OFF
-DWITH_UBSAN=OFF
-DWITH_NUMA=OFF
-DWITH_TBB=OFF
-DWITH_WINDOWS_UTF8_FILENAMES=OFF
-DWITH_XPRESS=OFF
-DPORTABLE=ON
-DFORCE_SSE42=OFF
-DDISABLE_STALL_NOTIF=OFF
-DOPTDBG=ON
-DROCKSDB_LITE=OFF
-DWITH_FALLOCATE=ON
-DWITH_LIBRADOS=OFF
-DWITH_JNI=OFF
-DROCKSDB_INSTALL_ON_WINDOWS=OFF
-DWITH_TESTS=OFF
-DWITH_TOOLS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -MP /DNDEBUG"
>
$<$<NOT:$<BOOL:${MSVC}>>:
"-DCMAKE_CXX_FLAGS=-DNDEBUG"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}rocksdb$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
LIST_SEPARATOR ::
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS snappy_lib lz4_lib
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}rocksdb${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}rocksdb_d${ep_lib_suffix}
)
ExternalProject_Get_Property (rocksdb BINARY_DIR)
ExternalProject_Get_Property (rocksdb SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (rocksdb)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/include)
set_target_properties (rocksdb_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}rocksdb_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}rocksdb${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies (rocksdb_lib rocksdb)
exclude_if_included (rocksdb)
endif ()
target_link_libraries (rocksdb_lib
INTERFACE
snappy_lib
lz4_lib
$<$<BOOL:${MSVC}>:rpcrt4>)
exclude_if_included (rocksdb_lib)
target_link_libraries (ripple_libs INTERFACE rocksdb_lib)

View File

@@ -1,58 +0,0 @@
#[===================================================================[
NIH dep: secp256k1
#]===================================================================]
add_library (secp256k1_lib STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(secp256k1)
endif()
if(secp256k1)
set_target_properties (secp256k1_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${secp256k1}
IMPORTED_LOCATION_RELEASE
${secp256k1}
INTERFACE_INCLUDE_DIRECTORIES
${SECP256K1_INCLUDE_DIR})
add_library (secp256k1 ALIAS secp256k1_lib)
add_library (NIH::secp256k1 ALIAS secp256k1_lib)
else()
set(INSTALL_SECP256K1 true)
add_library (secp256k1 STATIC
external/secp256k1/src/secp256k1.c)
target_compile_definitions (secp256k1
PRIVATE
USE_NUM_NONE
USE_FIELD_10X26
USE_FIELD_INV_BUILTIN
USE_SCALAR_8X32
USE_SCALAR_INV_BUILTIN)
target_include_directories (secp256k1
PUBLIC
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/external>
$<INSTALL_INTERFACE:include>
PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/external/secp256k1)
target_compile_options (secp256k1
PRIVATE
$<$<BOOL:${MSVC}>:-wd4319>
$<$<NOT:$<BOOL:${MSVC}>>:
-Wno-deprecated-declarations
-Wno-unused-function
>
$<$<BOOL:${is_gcc}>:-Wno-nonnull-compare>)
target_link_libraries (ripple_libs INTERFACE NIH::secp256k1)
#[===========================[
headers installation
#]===========================]
install (
FILES
external/secp256k1/include/secp256k1.h
DESTINATION include/secp256k1/include)
add_library (NIH::secp256k1 ALIAS secp256k1)
endif()

View File

@@ -1,77 +0,0 @@
#[===================================================================[
NIH dep: snappy
#]===================================================================]
add_library (snappy_lib STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(snappy)
endif()
if(snappy)
set_target_properties (snappy_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${snappy}
IMPORTED_LOCATION_RELEASE
${snappy}
INTERFACE_INCLUDE_DIRECTORIES
${SNAPPY_INCLUDE_DIR})
else()
ExternalProject_Add (snappy
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/google/snappy.git
GIT_TAG 1.1.7
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DBUILD_SHARED_LIBS=OFF
-DCMAKE_POSITION_INDEPENDENT_CODE=ON
-DSNAPPY_BUILD_TESTS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
"-DCMAKE_CXX_FLAGS_DEBUG=-MTd"
"-DCMAKE_CXX_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}snappy$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E copy_if_different <BINARY_DIR>/config.h <BINARY_DIR>/snappy-stubs-public.h <SOURCE_DIR>
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}snappy${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}snappy_d${ep_lib_suffix}
)
ExternalProject_Get_Property (snappy BINARY_DIR)
ExternalProject_Get_Property (snappy SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (snappy)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/snappy)
set_target_properties (snappy_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}snappy_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}snappy${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR})
endif()
add_dependencies (snappy_lib snappy)
target_link_libraries (ripple_libs INTERFACE snappy_lib)
exclude_if_included (snappy)
exclude_if_included (snappy_lib)

View File

@@ -1,165 +0,0 @@
#[===================================================================[
NIH dep: soci
#]===================================================================]
foreach (_comp core empty sqlite3)
add_library ("soci_${_comp}" STATIC IMPORTED GLOBAL)
endforeach ()
if (NOT WIN32)
find_package(soci)
endif()
if (soci)
foreach (_comp core empty sqlite3)
set_target_properties ("soci_${_comp}" PROPERTIES
IMPORTED_LOCATION_DEBUG
${soci}
IMPORTED_LOCATION_RELEASE
${soci}
INTERFACE_INCLUDE_DIRECTORIES
${SOCI_INCLUDE_DIR})
endforeach ()
else()
set (soci_lib_pre ${ep_lib_prefix})
set (soci_lib_post "")
if (WIN32)
# for some reason soci on windows still prepends lib (non-standard)
set (soci_lib_pre lib)
# this version in the name might change if/when we change versions of soci
set (soci_lib_post "_4_0")
endif ()
get_target_property (_boost_incs Boost::date_time INTERFACE_INCLUDE_DIRECTORIES)
get_target_property (_boost_dt Boost::date_time IMPORTED_LOCATION)
if (NOT _boost_dt)
get_target_property (_boost_dt Boost::date_time IMPORTED_LOCATION_RELEASE)
endif ()
if (NOT _boost_dt)
get_target_property (_boost_dt Boost::date_time IMPORTED_LOCATION_DEBUG)
endif ()
ExternalProject_Add (soci
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/SOCI/soci.git
GIT_TAG 04e1870294918d20761736743bb6136314c42dd5
# We had an issue with soci integer range checking for boost::optional
# and needed to remove the exception that SOCI throws in this case.
# This is *probably* a bug in SOCI, but has never been investigated more
# nor reported to the maintainers.
# This cmake script comments out the lines in question.
# This patch process is likely fragile and should be reviewed carefully
# whenever we update the GIT_TAG above.
PATCH_COMMAND
${CMAKE_COMMAND} -D RIPPLED_SOURCE=${CMAKE_CURRENT_SOURCE_DIR}
-P ${CMAKE_CURRENT_SOURCE_DIR}/cmake/soci_patch.cmake
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${CMAKE_TOOLCHAIN_FILE}>:-DCMAKE_TOOLCHAIN_FILE=${CMAKE_TOOLCHAIN_FILE}>
$<$<BOOL:${VCPKG_TARGET_TRIPLET}>:-DVCPKG_TARGET_TRIPLET=${VCPKG_TARGET_TRIPLET}>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
-DCMAKE_PREFIX_PATH=${CMAKE_BINARY_DIR}/sqlite3
-DCMAKE_MODULE_PATH=${CMAKE_CURRENT_SOURCE_DIR}/cmake
-DCMAKE_INCLUDE_PATH=$<JOIN:$<TARGET_PROPERTY:sqlite,INTERFACE_INCLUDE_DIRECTORIES>,::>
-DCMAKE_LIBRARY_PATH=${sqlite_BINARY_DIR}
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DSOCI_CXX_C11=ON
-DSOCI_STATIC=ON
-DSOCI_LIBDIR=lib
-DSOCI_SHARED=OFF
-DSOCI_TESTS=OFF
# hacks to workaround the fact that soci doesn't currently use
# boost imported targets in its cmake. If they switch to
# proper imported targets, this next line can be removed
# (as well as the get_property above that sets _boost_incs)
-DBoost_INCLUDE_DIRS=$<JOIN:${_boost_incs},::>
-DBoost_INCLUDE_DIR=$<JOIN:${_boost_incs},::>
-DBOOST_ROOT=${BOOST_ROOT}
-DWITH_BOOST=ON
-DBoost_FOUND=ON
-DBoost_NO_BOOST_CMAKE=ON
-DBoost_DATE_TIME_FOUND=ON
-DSOCI_HAVE_BOOST=ON
-DSOCI_HAVE_BOOST_DATE_TIME=ON
-DBoost_DATE_TIME_LIBRARY=${_boost_dt}
-DSOCI_DB2=OFF
-DSOCI_FIREBIRD=OFF
-DSOCI_MYSQL=OFF
-DSOCI_ODBC=OFF
-DSOCI_ORACLE=OFF
-DSOCI_POSTGRESQL=OFF
-DSOCI_SQLITE3=ON
-DSQLITE3_INCLUDE_DIR=$<JOIN:$<TARGET_PROPERTY:sqlite,INTERFACE_INCLUDE_DIRECTORIES>,::>
-DSQLITE3_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:sqlite,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:sqlite,IMPORTED_LOCATION_RELEASE>>
$<$<BOOL:${APPLE}>:-DCMAKE_FIND_FRAMEWORK=LAST>
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
"-DCMAKE_CXX_FLAGS_DEBUG=-MTd"
"-DCMAKE_CXX_FLAGS_RELEASE=-MT"
>
$<$<NOT:$<BOOL:${MSVC}>>:
"-DCMAKE_CXX_FLAGS=-Wno-deprecated-declarations"
>
# SEE: https://github.com/SOCI/soci/issues/640
$<$<AND:$<BOOL:${is_gcc}>,$<VERSION_GREATER_EQUAL:${CMAKE_CXX_COMPILER_VERSION},8>>:
"-DCMAKE_CXX_FLAGS=-Wno-deprecated-declarations -Wno-error=format-overflow -Wno-format-overflow -Wno-error=format-truncation"
>
LIST_SEPARATOR ::
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/lib/$<CONFIG>/${soci_lib_pre}soci_core${soci_lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/lib/$<CONFIG>/${soci_lib_pre}soci_empty${soci_lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/lib/$<CONFIG>/${soci_lib_pre}soci_sqlite3${soci_lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/lib
>
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS sqlite
BUILD_BYPRODUCTS
<BINARY_DIR>/lib/${soci_lib_pre}soci_core${soci_lib_post}${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_core${soci_lib_post}_d${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_empty${soci_lib_post}${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_empty${soci_lib_post}_d${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_sqlite3${soci_lib_post}${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_sqlite3${soci_lib_post}_d${ep_lib_suffix}
)
ExternalProject_Get_Property (soci BINARY_DIR)
ExternalProject_Get_Property (soci SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (soci)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/include)
file (MAKE_DIRECTORY ${BINARY_DIR}/include)
foreach (_comp core empty sqlite3)
set_target_properties ("soci_${_comp}" PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/lib/${soci_lib_pre}soci_${_comp}${soci_lib_post}_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/lib/${soci_lib_pre}soci_${_comp}${soci_lib_post}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
"${SOURCE_DIR}/include;${BINARY_DIR}/include")
add_dependencies ("soci_${_comp}" soci) # something has to depend on the ExternalProject to trigger it
target_link_libraries (ripple_libs INTERFACE "soci_${_comp}")
if (NOT _comp STREQUAL "core")
target_link_libraries ("soci_${_comp}" INTERFACE soci_core)
endif ()
endforeach ()
endif()
foreach (_comp core empty sqlite3)
exclude_if_included ("soci_${_comp}")
endforeach ()
exclude_if_included (soci)

View File

@@ -1,93 +0,0 @@
#[===================================================================[
NIH dep: sqlite
#]===================================================================]
add_library (sqlite STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(sqlite)
endif()
if(sqlite3)
set_target_properties (sqlite PROPERTIES
IMPORTED_LOCATION_DEBUG
${sqlite3}
IMPORTED_LOCATION_RELEASE
${sqlite3}
INTERFACE_INCLUDE_DIRECTORIES
${SQLITE_INCLUDE_DIR})
else()
ExternalProject_Add (sqlite3
PREFIX ${nih_cache_path}
# sqlite doesn't use git, but it provides versioned tarballs
URL https://www.sqlite.org/2018/sqlite-amalgamation-3260000.zip
http://www.sqlite.org/2018/sqlite-amalgamation-3260000.zip
https://www2.sqlite.org/2018/sqlite-amalgamation-3260000.zip
http://www2.sqlite.org/2018/sqlite-amalgamation-3260000.zip
# ^^^ version is apparent in the URL: 3260000 => 3.26.0
URL_HASH SHA256=de5dcab133aa339a4cf9e97c40aa6062570086d6085d8f9ad7bc6ddf8a52096e
# Don't need to worry about MITM attacks too much because the download
# is checked against a strong hash
TLS_VERIFY false
# we wrote a very simple CMake file to build sqlite
# so that's what we copy here so that we can build with
CMake. sqlite doesn't generally provide a build system
# for the single amalgamation source file.
PATCH_COMMAND
${CMAKE_COMMAND} -E copy_if_different
${CMAKE_CURRENT_SOURCE_DIR}/cmake/CMake_sqlite3.txt
<SOURCE_DIR>/CMakeLists.txt
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}sqlite3$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
TEST_COMMAND ""
INSTALL_COMMAND ""
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}sqlite3${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}sqlite3_d${ep_lib_suffix}
)
ExternalProject_Get_Property (sqlite3 BINARY_DIR)
ExternalProject_Get_Property (sqlite3 SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (sqlite3)
endif ()
set_target_properties (sqlite PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}sqlite3_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}sqlite3${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR})
add_dependencies (sqlite sqlite3)
exclude_if_included (sqlite3)
endif()
target_link_libraries (sqlite INTERFACE $<$<NOT:$<BOOL:${MSVC}>>:dl>)
target_link_libraries (ripple_libs INTERFACE sqlite)
exclude_if_included (sqlite)
set(sqlite_BINARY_DIR ${BINARY_DIR})

View File

@@ -1,84 +1 @@
#[===================================================================[
NIH dep: wasmedge: web assembly runtime for hooks.
#]===================================================================]
find_package(Curses)
if(CURSES_FOUND)
include_directories(${CURSES_INCLUDE_DIR})
target_link_libraries(ripple_libs INTERFACE ${CURSES_LIBRARY})
else()
message(WARNING "CURSES library not found... (only important for mac builds)")
endif()
find_package(LLVM REQUIRED CONFIG)
message(STATUS "Found LLVM ${LLVM_PACKAGE_VERSION}")
message(STATUS "Using LLVMConfig.cmake in: ${LLVM_DIR}")
ExternalProject_Add (wasmedge_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/WasmEdge/WasmEdge.git
GIT_TAG 0.11.2
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
-DWASMEDGE_BUILD_SHARED_LIB=OFF
-DWASMEDGE_BUILD_STATIC_LIB=ON
-DWASMEDGE_BUILD_AOT_RUNTIME=ON
-DWASMEDGE_FORCE_DISABLE_LTO=ON
-DWASMEDGE_LINK_LLVM_STATIC=ON
-DWASMEDGE_LINK_TOOLS_STATIC=ON
-DWASMEDGE_BUILD_PLUGINS=OFF
-DCMAKE_POSITION_INDEPENDENT_CODE=ON
-DLLVM_DIR=${LLVM_DIR}
-DLLVM_LIBRARY_DIR=${LLVM_LIBRARY_DIR}
-DLLVM_ENABLE_TERMINFO=OFF
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP -march=native"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_CONFIGURE ON
LOG_BUILD ON
COMMAND
pwd
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
TEST_COMMAND ""
INSTALL_COMMAND ""
BUILD_BYPRODUCTS
<BINARY_DIR>/lib/api/libwasmedge.a
)
add_library (wasmedge STATIC IMPORTED GLOBAL)
ExternalProject_Get_Property (wasmedge_src BINARY_DIR)
ExternalProject_Get_Property (wasmedge_src SOURCE_DIR)
set (wasmedge_src_BINARY_DIR "${BINARY_DIR}")
add_dependencies (wasmedge wasmedge_src)
execute_process(
COMMAND
mkdir -p "${wasmedge_src_BINARY_DIR}/include/api"
)
set_target_properties (wasmedge PROPERTIES
IMPORTED_LOCATION_DEBUG
"${wasmedge_src_BINARY_DIR}/lib/api/libwasmedge.a"
IMPORTED_LOCATION_RELEASE
"${wasmedge_src_BINARY_DIR}/lib/api/libwasmedge.a"
INTERFACE_INCLUDE_DIRECTORIES
"${wasmedge_src_BINARY_DIR}/include/api/"
)
target_link_libraries (ripple_libs INTERFACE wasmedge)
#RH NOTE: some compilers / versions of some libraries need these, most don't
find_library(XAR_LIBRARY NAMES xar)
if(XAR_LIBRARY)
target_link_libraries(ripple_libs INTERFACE ${XAR_LIBRARY})
else()
message(WARNING "xar library not found... (only important for mac builds)")
endif()
add_library (wasmedge::wasmedge ALIAS wasmedge)
find_package(wasmedge REQUIRED)

View File

@@ -1,167 +0,0 @@
if(reporting)
find_library(cassandra NAMES cassandra)
if(NOT cassandra)
message("System installed Cassandra cpp driver not found. Will build")
find_library(zlib NAMES zlib1g-dev zlib-devel zlib z)
if(NOT zlib)
message("zlib not found. will build")
add_library(zlib STATIC IMPORTED GLOBAL)
ExternalProject_Add(zlib_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/madler/zlib.git
GIT_TAG v1.2.12
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <BINARY_DIR>/${ep_lib_prefix}z.a
LOG_BUILD TRUE
LOG_CONFIGURE TRUE
)
ExternalProject_Get_Property (zlib_src SOURCE_DIR)
ExternalProject_Get_Property (zlib_src BINARY_DIR)
set (zlib_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${zlib_src_SOURCE_DIR}/include)
set_target_properties (zlib PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/${ep_lib_prefix}z.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(zlib zlib_src)
file(TO_CMAKE_PATH "${zlib_src_SOURCE_DIR}" zlib_src_SOURCE_DIR)
endif()
find_library(krb5 NAMES krb5-dev libkrb5-dev)
if(NOT krb5)
message("krb5 not found. will build")
add_library(krb5 STATIC IMPORTED GLOBAL)
ExternalProject_Add(krb5_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/krb5/krb5.git
GIT_TAG krb5-1.20-final
UPDATE_COMMAND ""
CONFIGURE_COMMAND autoreconf src && CFLAGS=-fcommon ./src/configure --enable-static --disable-shared > /dev/null
BUILD_IN_SOURCE 1
BUILD_COMMAND make
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <SOURCE_DIR>/lib/${ep_lib_prefix}krb5.a
LOG_BUILD TRUE
)
ExternalProject_Get_Property (krb5_src SOURCE_DIR)
ExternalProject_Get_Property (krb5_src BINARY_DIR)
set (krb5_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${krb5_src_SOURCE_DIR}/include)
set_target_properties (krb5 PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/lib/${ep_lib_prefix}krb5.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(krb5 krb5_src)
file(TO_CMAKE_PATH "${krb5_src_SOURCE_DIR}" krb5_src_SOURCE_DIR)
endif()
find_library(libuv1 NAMES uv1 libuv1 libuv1-dev libuv1:amd64)
if(NOT libuv1)
message("libuv1 not found, will build")
add_library(libuv1 STATIC IMPORTED GLOBAL)
ExternalProject_Add(libuv_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/libuv/libuv.git
GIT_TAG v1.44.2
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <BINARY_DIR>/${ep_lib_prefix}uv_a.a
LOG_BUILD TRUE
LOG_CONFIGURE TRUE
)
ExternalProject_Get_Property (libuv_src SOURCE_DIR)
ExternalProject_Get_Property (libuv_src BINARY_DIR)
set (libuv_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${libuv_src_SOURCE_DIR}/include)
set_target_properties (libuv1 PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/${ep_lib_prefix}uv_a.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(libuv1 libuv_src)
file(TO_CMAKE_PATH "${libuv_src_SOURCE_DIR}" libuv_src_SOURCE_DIR)
endif()
add_library (cassandra STATIC IMPORTED GLOBAL)
ExternalProject_Add(cassandra_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/datastax/cpp-driver.git
GIT_TAG 2.16.2
CMAKE_ARGS
-DLIBUV_ROOT_DIR=${BINARY_DIR}
-DLIBUV_LIBRARY=${BINARY_DIR}/libuv_a.a
-DLIBUV_INCLUDE_DIR=${SOURCE_DIR}/include
-DCASS_BUILD_STATIC=ON
-DCASS_BUILD_SHARED=OFF
-DOPENSSL_ROOT_DIR=/opt/local/openssl
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <BINARY_DIR>/${ep_lib_prefix}cassandra_static.a
LOG_BUILD TRUE
LOG_CONFIGURE TRUE
)
ExternalProject_Get_Property (cassandra_src SOURCE_DIR)
ExternalProject_Get_Property (cassandra_src BINARY_DIR)
set (cassandra_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${cassandra_src_SOURCE_DIR}/include)
set_target_properties (cassandra PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/${ep_lib_prefix}cassandra_static.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(cassandra cassandra_src)
if(NOT libuv1)
ExternalProject_Add_StepDependencies(cassandra_src build libuv1)
target_link_libraries(cassandra INTERFACE libuv1)
else()
target_link_libraries(cassandra INTERFACE ${libuv1})
endif()
if(NOT krb5)
ExternalProject_Add_StepDependencies(cassandra_src build krb5)
target_link_libraries(cassandra INTERFACE krb5)
else()
target_link_libraries(cassandra INTERFACE ${krb5})
endif()
if(NOT zlib)
ExternalProject_Add_StepDependencies(cassandra_src build zlib)
target_link_libraries(cassandra INTERFACE zlib)
else()
target_link_libraries(cassandra INTERFACE ${zlib})
endif()
file(TO_CMAKE_PATH "${cassandra_src_SOURCE_DIR}" cassandra_src_SOURCE_DIR)
target_link_libraries(ripple_libs INTERFACE cassandra)
else()
message("Found system installed cassandra cpp driver")
find_path(cassandra_includes NAMES cassandra.h REQUIRED)
target_link_libraries (ripple_libs INTERFACE ${cassandra})
target_include_directories(ripple_libs INTERFACE ${cassandra_includes})
endif()
exclude_if_included (cassandra)
endif()

View File

@@ -1,18 +0,0 @@
#[===================================================================[
NIH dep: date
the main library is header-only, thus is an INTERFACE lib in CMake.
NOTE: this has been accepted into c++20 so can likely be replaced
when we update to that standard
#]===================================================================]
find_package (date QUIET)
if (NOT TARGET date::date)
FetchContent_Declare(
hh_date_src
GIT_REPOSITORY https://github.com/HowardHinnant/date.git
GIT_TAG fc4cf092f9674f2670fb9177edcdee870399b829
)
FetchContent_MakeAvailable(hh_date_src)
endif ()

View File

@@ -1,392 +0,0 @@
# currently linking to unsecure versions...if we switch, we'll
# need to add ssl as a link dependency to the grpc targets
option (use_secure_grpc "use TLS version of grpc libs." OFF)
if (use_secure_grpc)
set (grpc_suffix "")
else ()
set (grpc_suffix "_unsecure")
endif ()
find_package (gRPC 1.23 CONFIG QUIET)
if (TARGET gRPC::gpr AND NOT local_grpc)
get_target_property (_grpc_l gRPC::gpr IMPORTED_LOCATION_DEBUG)
if (NOT _grpc_l)
get_target_property (_grpc_l gRPC::gpr IMPORTED_LOCATION_RELEASE)
endif ()
if (NOT _grpc_l)
get_target_property (_grpc_l gRPC::gpr IMPORTED_LOCATION)
endif ()
message (STATUS "Found cmake config for gRPC. Using ${_grpc_l}.")
else ()
find_package (PkgConfig QUIET)
if (PKG_CONFIG_FOUND)
pkg_check_modules (grpc QUIET "grpc${grpc_suffix}>=1.25" "grpc++${grpc_suffix}" gpr)
endif ()
if (grpc_FOUND)
message (STATUS "Found gRPC using pkg-config. Using ${grpc_gpr_PREFIX}.")
endif ()
add_executable (gRPC::grpc_cpp_plugin IMPORTED)
exclude_if_included (gRPC::grpc_cpp_plugin)
if (grpc_FOUND AND NOT local_grpc)
# use installed grpc (via pkg-config)
macro (add_imported_grpc libname_)
if (static)
set (_search "${CMAKE_STATIC_LIBRARY_PREFIX}${libname_}${CMAKE_STATIC_LIBRARY_SUFFIX}")
else ()
set (_search "${CMAKE_SHARED_LIBRARY_PREFIX}${libname_}${CMAKE_SHARED_LIBRARY_SUFFIX}")
endif()
find_library(_found_${libname_}
NAMES ${_search}
HINTS ${grpc_LIBRARY_DIRS})
if (_found_${libname_})
message (STATUS "importing ${libname_} as ${_found_${libname_}}")
else ()
message (FATAL_ERROR "using pkg-config for grpc, can't find ${_search}")
endif ()
add_library ("gRPC::${libname_}" STATIC IMPORTED GLOBAL)
set_target_properties ("gRPC::${libname_}" PROPERTIES IMPORTED_LOCATION ${_found_${libname_}})
if (grpc_INCLUDE_DIRS)
set_target_properties ("gRPC::${libname_}" PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${grpc_INCLUDE_DIRS})
endif ()
target_link_libraries (ripple_libs INTERFACE "gRPC::${libname_}")
exclude_if_included ("gRPC::${libname_}")
endmacro ()
set_target_properties (gRPC::grpc_cpp_plugin PROPERTIES
IMPORTED_LOCATION "${grpc_gpr_PREFIX}/bin/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}")
pkg_check_modules (cares QUIET libcares)
if (cares_FOUND)
if (static)
set (_search "${CMAKE_STATIC_LIBRARY_PREFIX}cares${CMAKE_STATIC_LIBRARY_SUFFIX}")
set (_prefix cares_STATIC)
set (_static STATIC)
else ()
set (_search "${CMAKE_SHARED_LIBRARY_PREFIX}cares${CMAKE_SHARED_LIBRARY_SUFFIX}")
set (_prefix cares)
set (_static)
endif()
find_library(_location NAMES ${_search} HINTS ${cares_LIBRARY_DIRS})
if (NOT _location)
message (FATAL_ERROR "using pkg-config for grpc, can't find c-ares")
endif ()
if(${_location} MATCHES "\\.a$")
add_library(c-ares::cares STATIC IMPORTED GLOBAL)
else()
add_library(c-ares::cares SHARED IMPORTED GLOBAL)
endif()
set_target_properties (c-ares::cares PROPERTIES
IMPORTED_LOCATION ${_location}
INTERFACE_INCLUDE_DIRECTORIES "${${_prefix}_INCLUDE_DIRS}"
INTERFACE_LINK_OPTIONS "${${_prefix}_LDFLAGS}"
)
exclude_if_included (c-ares::cares)
else ()
message (FATAL_ERROR "using pkg-config for grpc, can't find c-ares")
endif ()
else ()
#[===========================[
c-ares (grpc requires)
#]===========================]
ExternalProject_Add (c-ares_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/c-ares/c-ares.git
GIT_TAG cares-1_15_0
CMAKE_ARGS
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DCMAKE_INSTALL_PREFIX=<BINARY_DIR>/_installed_
-DCARES_SHARED=OFF
-DCARES_STATIC=ON
-DCARES_STATIC_PIC=ON
-DCARES_INSTALL=ON
-DCARES_MSVC_STATIC_RUNTIME=ON
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E env --unset=DESTDIR ${CMAKE_COMMAND} --build . --config $<CONFIG> --target install
BUILD_BYPRODUCTS
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}cares${ep_lib_suffix}
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}cares_d${ep_lib_suffix}
)
exclude_if_included (c-ares_src)
ExternalProject_Get_Property (c-ares_src BINARY_DIR)
set (cares_binary_dir "${BINARY_DIR}")
add_library (c-ares::cares STATIC IMPORTED GLOBAL)
file (MAKE_DIRECTORY ${BINARY_DIR}/_installed_/include)
set_target_properties (c-ares::cares PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}cares_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}cares${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (c-ares::cares c-ares_src)
exclude_if_included (c-ares::cares)
if (NOT has_zlib)
#[===========================[
zlib (grpc requires)
#]===========================]
if (MSVC)
set (zlib_debug_postfix "d") # zlib cmake sets this internally for MSVC, so we really don't have a choice
set (zlib_base "zlibstatic")
else ()
set (zlib_debug_postfix "_d")
set (zlib_base "z")
endif ()
ExternalProject_Add (zlib_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/madler/zlib.git
GIT_TAG v1.2.11
CMAKE_ARGS
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=${zlib_debug_postfix}
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DCMAKE_INSTALL_PREFIX=<BINARY_DIR>/_installed_
-DBUILD_SHARED_LIBS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E env --unset=DESTDIR ${CMAKE_COMMAND} --build . --config $<CONFIG> --target install
BUILD_BYPRODUCTS
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}${zlib_base}${ep_lib_suffix}
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}${zlib_base}${zlib_debug_postfix}${ep_lib_suffix}
)
exclude_if_included (zlib_src)
ExternalProject_Get_Property (zlib_src BINARY_DIR)
set (zlib_binary_dir "${BINARY_DIR}")
add_library (ZLIB::ZLIB STATIC IMPORTED GLOBAL)
file (MAKE_DIRECTORY ${BINARY_DIR}/_installed_/include)
set_target_properties (ZLIB::ZLIB PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}${zlib_base}${zlib_debug_postfix}${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}${zlib_base}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (ZLIB::ZLIB zlib_src)
exclude_if_included (ZLIB::ZLIB)
endif ()
#[===========================[
grpc
#]===========================]
ExternalProject_Add (grpc_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/grpc/grpc.git
GIT_TAG v1.25.0
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
-DCMAKE_CXX_STANDARD=17
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${CMAKE_TOOLCHAIN_FILE}>:-DCMAKE_TOOLCHAIN_FILE=${CMAKE_TOOLCHAIN_FILE}>
$<$<BOOL:${VCPKG_TARGET_TRIPLET}>:-DVCPKG_TARGET_TRIPLET=${VCPKG_TARGET_TRIPLET}>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DgRPC_BUILD_TESTS=OFF
-DgRPC_BENCHMARK_PROVIDER=""
-DgRPC_BUILD_CSHARP_EXT=OFF
-DgRPC_MSVC_STATIC_RUNTIME=ON
-DgRPC_INSTALL=OFF
-DgRPC_CARES_PROVIDER=package
-Dc-ares_DIR=${cares_binary_dir}/_installed_/lib/cmake/c-ares
-DgRPC_SSL_PROVIDER=package
-DOPENSSL_ROOT_DIR=${OPENSSL_ROOT_DIR}
-DgRPC_PROTOBUF_PROVIDER=package
-DProtobuf_USE_STATIC_LIBS=$<IF:$<AND:$<BOOL:${Protobuf_FOUND}>,$<NOT:$<BOOL:${static}>>>,OFF,ON>
-DProtobuf_INCLUDE_DIR=$<JOIN:$<TARGET_PROPERTY:protobuf::libprotobuf,INTERFACE_INCLUDE_DIRECTORIES>,:_:>
-DProtobuf_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:protobuf::libprotobuf,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:protobuf::libprotobuf,IMPORTED_LOCATION_RELEASE>>
-DProtobuf_PROTOC_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:protobuf::libprotoc,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:protobuf::libprotoc,IMPORTED_LOCATION_RELEASE>>
-DProtobuf_PROTOC_EXECUTABLE=$<TARGET_PROPERTY:protobuf::protoc,IMPORTED_LOCATION>
-DgRPC_ZLIB_PROVIDER=package
$<$<NOT:$<BOOL:${has_zlib}>>:-DZLIB_ROOT=${zlib_binary_dir}/_installed_>
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}grpc${grpc_suffix}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}grpc++${grpc_suffix}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}address_sorting$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}gpr$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}
<BINARY_DIR>
>
LIST_SEPARATOR :_:
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS c-ares_src
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}grpc${grpc_suffix}${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}grpc${grpc_suffix}_d${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}grpc++${grpc_suffix}${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}grpc++${grpc_suffix}_d${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}address_sorting${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}address_sorting_d${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}gpr${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}gpr_d${ep_lib_suffix}
<BINARY_DIR>/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}
)
if (TARGET protobuf_src)
ExternalProject_Add_StepDependencies(grpc_src build protobuf_src)
endif ()
exclude_if_included (grpc_src)
ExternalProject_Get_Property (grpc_src BINARY_DIR)
ExternalProject_Get_Property (grpc_src SOURCE_DIR)
set (grpc_binary_dir "${BINARY_DIR}")
set (grpc_source_dir "${SOURCE_DIR}")
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (grpc_src)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/include)
macro (add_imported_grpc libname_)
add_library ("gRPC::${libname_}" STATIC IMPORTED GLOBAL)
set_target_properties ("gRPC::${libname_}" PROPERTIES
IMPORTED_LOCATION_DEBUG
${grpc_binary_dir}/${ep_lib_prefix}${libname_}_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${grpc_binary_dir}/${ep_lib_prefix}${libname_}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${grpc_source_dir}/include)
add_dependencies ("gRPC::${libname_}" grpc_src)
target_link_libraries (ripple_libs INTERFACE "gRPC::${libname_}")
exclude_if_included ("gRPC::${libname_}")
endmacro ()
set_target_properties (gRPC::grpc_cpp_plugin PROPERTIES
IMPORTED_LOCATION "${grpc_binary_dir}/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}")
add_dependencies (gRPC::grpc_cpp_plugin grpc_src)
endif ()
add_imported_grpc (gpr)
add_imported_grpc ("grpc${grpc_suffix}")
add_imported_grpc ("grpc++${grpc_suffix}")
add_imported_grpc (address_sorting)
target_link_libraries ("gRPC::grpc${grpc_suffix}" INTERFACE c-ares::cares gRPC::gpr gRPC::address_sorting ZLIB::ZLIB)
target_link_libraries ("gRPC::grpc++${grpc_suffix}" INTERFACE "gRPC::grpc${grpc_suffix}" gRPC::gpr)
endif ()
#[=================================[
generate protobuf sources for
grpc defs and bundle into a
static lib
#]=================================]
set(output_dir "${CMAKE_BINARY_DIR}/proto_gen_grpc")
set(GRPC_GEN_DIR "${output_dir}/ripple/proto")
file(MAKE_DIRECTORY ${GRPC_GEN_DIR})
set(GRPC_PROTO_SRCS)
set(GRPC_PROTO_HDRS)
set(GRPC_PROTO_ROOT "${CMAKE_CURRENT_SOURCE_DIR}/src/ripple/proto/org")
file(GLOB_RECURSE GRPC_DEFINITION_FILES "${GRPC_PROTO_ROOT}/*.proto")
foreach(file ${GRPC_DEFINITION_FILES})
# /home/user/rippled/src/ripple/proto/org/.../v1/get_ledger.proto
get_filename_component(_abs_file ${file} ABSOLUTE)
# /home/user/rippled/src/ripple/proto/org/.../v1
get_filename_component(_abs_dir ${_abs_file} DIRECTORY)
# get_ledger
get_filename_component(_basename ${file} NAME_WE)
# /home/user/rippled/src/ripple/proto
get_filename_component(_proto_inc ${GRPC_PROTO_ROOT} DIRECTORY) # updir one level
# org/.../v1/get_ledger.proto
file(RELATIVE_PATH _rel_root_file ${_proto_inc} ${_abs_file})
# org/.../v1
get_filename_component(_rel_root_dir ${_rel_root_file} DIRECTORY)
# src/ripple/proto/org/.../v1
file(RELATIVE_PATH _rel_dir ${CMAKE_CURRENT_SOURCE_DIR} ${_abs_dir})
set(src_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.cc")
set(src_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.cc")
set(hdr_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.h")
set(hdr_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.h")
add_custom_command(
OUTPUT ${src_1} ${src_2} ${hdr_1} ${hdr_2}
COMMAND protobuf::protoc
ARGS --grpc_out=${GRPC_GEN_DIR}
--cpp_out=${GRPC_GEN_DIR}
--plugin=protoc-gen-grpc=$<TARGET_FILE:gRPC::grpc_cpp_plugin>
-I ${_proto_inc} -I ${_rel_dir}
${_abs_file}
DEPENDS ${_abs_file} protobuf::protoc gRPC::grpc_cpp_plugin
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Running gRPC C++ protocol buffer compiler on ${file}"
VERBATIM)
set_source_files_properties(${src_1} ${src_2} ${hdr_1} ${hdr_2} PROPERTIES
GENERATED TRUE
SKIP_UNITY_BUILD_INCLUSION ON
)
list(APPEND GRPC_PROTO_SRCS ${src_1} ${src_2})
list(APPEND GRPC_PROTO_HDRS ${hdr_1} ${hdr_2})
endforeach()
target_include_directories(xrpl_core SYSTEM PUBLIC
$<BUILD_INTERFACE:${output_dir}>
$<BUILD_INTERFACE:${output_dir}/ripple/proto>
# The generated sources include headers relative to this path. Fix it later.
$<INSTALL_INTERFACE:include/ripple/proto>
)
target_sources(xrpl_core PRIVATE ${GRPC_PROTO_SRCS})
install(
DIRECTORY ${output_dir}/ripple
DESTINATION include/
FILES_MATCHING PATTERN "*.h"
)
target_link_libraries(xrpl_core PUBLIC
"gRPC::grpc++"
# libgrpc is missing references.
absl::random_random
)
target_compile_options(xrpl_core
PRIVATE
$<$<BOOL:${MSVC}>:-wd4065>
$<$<NOT:$<BOOL:${MSVC}>>:-Wno-deprecated-declarations>
PUBLIC
$<$<BOOL:${MSVC}>:-wd4996>
$<$<BOOL:${is_xcode}>:
--system-header-prefix="google/protobuf"
-Wno-deprecated-dynamic-exception-spec
>)
# target_link_libraries (ripple_libs INTERFACE Ripple::grpc_pbufs)
# exclude_if_included (grpc_pbufs)

View File

@@ -0,0 +1,48 @@
include(create_symbolic_link)
# Consider include directory B nested under prefix A:
#
# /path/to/A/then/to/B/...
#
# Call C the relative path from A to B.
# C is what we want to write in `#include` directives:
#
# #include <then/to/B/...>
#
# Examples, all from the `jobqueue` module:
#
# - Library public headers:
# B = /include/xrpl/jobqueue
# A = /include/
# C = xrpl/jobqueue
#
# - Library private headers:
# B = /src/libxrpl/jobqueue
# A = /src/
# C = libxrpl/jobqueue
#
# - Test private headers:
# B = /tests/jobqueue
# A = /
# C = tests/jobqueue
#
# To isolate headers from each other,
# we want to create a symlink Y that points to B,
# within a subdirectory X of the `CMAKE_BINARY_DIR`,
# that has the same relative path C between X and Y,
# and then add X as an include directory of the target,
# sometimes `PUBLIC` and sometimes `PRIVATE`.
# The Cs are all guaranteed to be unique.
# We can guarantee a unique X per target by using
# `${CMAKE_CURRENT_BINARY_DIR}/modules/${target}`.
#
# isolate_headers(target A B scope)
function(isolate_headers target A B scope)
file(RELATIVE_PATH C "${A}" "${B}")
set(X "${CMAKE_CURRENT_BINARY_DIR}/modules/${target}")
set(Y "${X}/${C}")
cmake_path(GET Y PARENT_PATH parent)
file(MAKE_DIRECTORY "${parent}")
create_symbolic_link("${B}" "${Y}")
target_include_directories(${target} ${scope} "$<BUILD_INTERFACE:${X}>")
endfunction()
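
For illustration, a call for the `jobqueue` public-header case described in the comment might look like the sketch below. The target name and paths are assumptions drawn from the comment above, not from any build file in this diff; it only assumes `isolate_headers` (and `create_symbolic_link`) have been included.

```
# Hypothetical: expose the public headers of the jobqueue module.
# A = <source>/include, B = <source>/include/xrpl/jobqueue, so C = xrpl/jobqueue
# and consumers can write:  #include <xrpl/jobqueue/...>
isolate_headers(
  xrpl.libxrpl.jobqueue                         # target (assumed name)
  "${CMAKE_SOURCE_DIR}/include"                 # A: the prefix
  "${CMAKE_SOURCE_DIR}/include/xrpl/jobqueue"   # B: the nested include directory
  PUBLIC                                        # scope passed to target_include_directories
)
```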

View File

@@ -0,0 +1,24 @@
# Link a library to its modules (see: `add_module`)
# and remove the module sources from the library's sources.
#
# add_module(parent a)
# add_module(parent b)
# target_link_libraries(project.libparent.b PUBLIC project.libparent.a)
# add_library(project.libparent)
# target_link_modules(parent PUBLIC a b)
function(target_link_modules parent scope)
set(library ${PROJECT_NAME}.lib${parent})
foreach(name ${ARGN})
set(module ${library}.${name})
get_target_property(sources ${library} SOURCES)
list(LENGTH sources before)
get_target_property(dupes ${module} SOURCES)
list(LENGTH dupes expected)
list(REMOVE_ITEM sources ${dupes})
list(LENGTH sources after)
math(EXPR actual "${before} - ${after}")
message(STATUS "${module} with ${expected} sources took ${actual} sources from ${library}")
set_target_properties(${library} PROPERTIES SOURCES "${sources}")
target_link_libraries(${library} ${scope} ${module})
endforeach()
endfunction()
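
A minimal sketch of the effect, with hypothetical target and file names (it assumes `PROJECT_NAME` is `demo` and that the module target was created beforehand, for example by the `add_module` helper referenced in the comment):

```
# demo.libparent initially lists the util module's sources alongside its own.
add_library(demo.libparent.util STATIC src/libparent/util/strings.cpp)
add_library(demo.libparent STATIC
  src/libparent/parent.cpp
  src/libparent/util/strings.cpp)   # duplicate of the module's source
# Removes strings.cpp from demo.libparent's SOURCES and links the module
# instead, so each translation unit is compiled by exactly one target.
target_link_modules(parent PUBLIC util)
```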

View File

@@ -21,22 +21,23 @@ class Xrpl(ConanFile):
'tests': [True, False],
'unity': [True, False],
'xrpld': [True, False],
'with_wasmedge': [True, False],
'tool_requires_b2': [True, False],
}
requires = [
'date/3.0.1',
'date/3.0.3',
'grpc/1.50.1',
'libarchive/3.6.2',
'libarchive/3.7.6',
'nudb/2.0.8',
'openssl/1.1.1u',
'soci/4.0.3',
'openssl/3.6.0',
'soci/4.0.3@xahaud/stable',
'xxhash/0.8.2',
'wasmedge/0.11.2',
'zlib/1.2.13',
'zlib/1.3.1',
]
tool_requires = [
'protobuf/3.21.9',
'protobuf/3.21.12',
]
default_options = {
@@ -49,9 +50,11 @@ class Xrpl(ConanFile):
'static': True,
'tests': False,
'unity': False,
'with_wasmedge': True,
'tool_requires_b2': False,
'xrpld': False,
'date/*:header_only': True,
'date/*:header_only': False,
'grpc/*:shared': False,
'grpc/*:secure': True,
'libarchive/*:shared': False,
@@ -95,15 +98,28 @@ class Xrpl(ConanFile):
match = next(m for m in matches if m)
self.version = match.group(1)
def build_requirements(self):
self.tool_requires('grpc/1.50.1')
# Explicitly require b2 (e.g. for building from source for glibc compatibility)
if self.options.tool_requires_b2:
self.tool_requires('b2/5.3.2')
def configure(self):
if self.settings.compiler == 'apple-clang':
self.options['boost'].visibility = 'global'
self.options['boost/*'].visibility = 'global'
def requirements(self):
self.requires('boost/1.86.0', force=True)
self.requires('lz4/1.9.3', force=True)
# Force boost version for all dependencies to avoid conflicts
self.requires('boost/1.86.0', override=True)
self.requires('lz4/1.10.0', force=True)
self.requires('protobuf/3.21.9', force=True)
self.requires('sqlite3/3.42.0', force=True)
# Force sqlite3 version to avoid conflicts with soci
self.requires('sqlite3/3.47.0', override=True)
# Force our custom snappy build for all dependencies
self.requires('snappy/1.1.10@xahaud/stable', override=True)
if self.options.with_wasmedge:
self.requires('wasmedge/0.11.2@xahaud/stable')
if self.options.jemalloc:
self.requires('jemalloc/5.3.0')
if self.options.rocksdb:

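The sketch below shows how the new recipe options might be exercised from the command line. The exact option-scoping prefix differs between Conan 1.x and 2.x and the output-folder handling is omitted, so treat this as an assumption-laden illustration rather than the project's documented workflow.

```
# Illustrative only: install dependencies with b2 pulled in as a tool
# requirement and WasmEdge disabled. Option names come from the recipe above;
# a package-scoped prefix may be needed depending on the Conan version.
conan install . --build=missing \
  -o tool_requires_b2=True \
  -o with_wasmedge=False
```
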
View File

@@ -8,4 +8,4 @@ if [[ "$GITHUB_REPOSITORY" == "" ]]; then
fi
echo "Mounting $(pwd)/io in ubuntu and running unit tests"
docker run --rm -i -v $(pwd):/io -e BUILD_CORES=$BUILD_CORES ubuntu sh -c '/io/release-build/xahaud --unittest-jobs $BUILD_CORES -u'
docker run --rm -i -v $(pwd):/io --platform=linux/amd64 -e BUILD_CORES=$BUILD_CORES ubuntu sh -c '/io/release-build/xahaud --unittest-jobs $BUILD_CORES -u'

View File

@@ -11,11 +11,11 @@ platforms: Linux, macOS, or Windows.
Package ecosystems vary across Linux distributions,
so there is no one set of instructions that will work for every Linux user.
These instructions are written for Ubuntu 22.04.
They are largely copied from the [script][1] used to configure our Docker
They are largely copied from the [script][1] used to configure a Docker
container for continuous integration.
That script handles many more responsibilities.
These instructions are just the bare minimum to build one configuration of
rippled.
xahaud.
You can check that codebase for other Linux distributions and versions.
If you cannot find yours there,
then we hope that these instructions can at least guide you in the right

177
docs/build/install.md vendored

@@ -1,159 +1,30 @@
This document contains instructions for installing rippled.
The APT package manager is common on Debian-based Linux distributions like
Ubuntu,
while the YUM package manager is common on Red Hat-based Linux distributions
like CentOS.
Installing from source is an option for all platforms,
and the only supported option for installing custom builds.
Comprehensive instructions for installing and running xahaud are available on the [https://Xahau.Network](https://xahau.network/docs/infrastructure/installing-xahaud) documentation website.
## Create the Runtime Environment
xahaud can be [built from source](../../BUILD.md) or installed using the binary files available from [https://build.xahau.tech](https://build.xahau.tech/). After obtaining a working xahaud binary, users will need to provide a suitable runtime environment. The following setup can be used for Linux or Docker environments.
## From source
From a source build, you can install rippled and libxrpl using CMake's
`--install` mode:
1. Create or download two configuration files: the main xahaud.cfg configuration file and a second validators-xahau.txt file defining which validators or UNL list publishers are trusted. The default location for these files in this xahaud repository is `cfg/`.
2. Provide a directory structure consistent with the contents of xahaud.cfg. This includes a location for log files, such as `/var/log/xahaud/`, and for database files, such as `/opt/xahaud/db/`. Configuration files are, by default, sourced from `/etc/xahaud/`; a symbolic link can be used if you wish to store configuration files elsewhere.
3. If desired, create a xahaud user and group, and change ownership of the binary and directories (see the sketch after this list). Servers operating as validators should use the most restrictive permissions possible for `xahaud.cfg`, as the validation token is stored there.
4. If desired, create a systemd service file, `/etc/systemd/system/xahaud.service`, enabling xahaud to run as a daemon. Alternatively, run `/path/to/binary/xahaud --conf=/path/to/xahaud.cfg` directly.
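The following is a minimal sketch of steps 2 and 3 above; the paths and the `xahaud` user/group follow the examples given in those steps and can be adapted to the deployment.
```
# Directories referenced by xahaud.cfg
sudo mkdir -p /etc/xahaud /var/log/xahaud /opt/xahaud/db
# Dedicated system user and group, then hand over ownership
sudo useradd --system --user-group --no-create-home xahaud
sudo chown -R xahaud:xahaud /var/log/xahaud /opt/xahaud
# Once xahaud.cfg is in place, restrict access to it (it stores the validation token)
sudo chown xahaud:xahaud /etc/xahaud/xahaud.cfg
sudo chmod 600 /etc/xahaud/xahaud.cfg
```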
## Example systemd Service File
```
cmake --install . --prefix /opt/local
[Unit]
Description=Xahaud Daemon
After=network-online.target
Wants=network-online.target
[Service]
Type=simple
ExecStart=/path/to/xahaud --silent --conf /path/to/xahaud.cfg
Restart=on-failure
User=xahaud
Group=xahaud
LimitNOFILE=65536
[Install]
WantedBy=multi-user.target
```
The default [prefix][1] is typically `/usr/local` on Linux and macOS and
`C:/Program Files/rippled` on Windows.
[1]: https://cmake.org/cmake/help/latest/variable/CMAKE_INSTALL_PREFIX.html
## With the APT package manager
1. Update repositories:
sudo apt update -y
2. Install utilities:
sudo apt install -y apt-transport-https ca-certificates wget gnupg
3. Add Ripple's package-signing GPG key to your list of trusted keys:
sudo mkdir /usr/local/share/keyrings/
wget -q -O - "https://repos.ripple.com/repos/api/gpg/key/public" | gpg --dearmor > ripple-key.gpg
sudo mv ripple-key.gpg /usr/local/share/keyrings
4. Check the fingerprint of the newly-added key:
gpg /usr/local/share/keyrings/ripple-key.gpg
The output should include an entry for Ripple such as the following:
gpg: WARNING: no command supplied. Trying to guess what you mean ...
pub rsa3072 2019-02-14 [SC] [expires: 2026-02-17]
C0010EC205B35A3310DC90DE395F97FFCCAFD9A2
uid TechOps Team at Ripple <techops+rippled@ripple.com>
sub rsa3072 2019-02-14 [E] [expires: 2026-02-17]
In particular, make sure that the fingerprint matches. (In the above example, the fingerprint is on the third line, starting with `C001`.)
4. Add the appropriate Ripple repository for your operating system version:
echo "deb [signed-by=/usr/local/share/keyrings/ripple-key.gpg] https://repos.ripple.com/repos/rippled-deb focal stable" | \
sudo tee -a /etc/apt/sources.list.d/ripple.list
The above example is appropriate for **Ubuntu 20.04 Focal Fossa**. For other operating systems, replace the word `focal` with one of the following:
- `jammy` for **Ubuntu 22.04 Jammy Jellyfish**
- `bionic` for **Ubuntu 18.04 Bionic Beaver**
- `bullseye` for **Debian 11 Bullseye**
- `buster` for **Debian 10 Buster**
If you want access to development or pre-release versions of `rippled`, use one of the following instead of `stable`:
- `unstable` - Pre-release builds ([`release` branch](https://github.com/ripple/rippled/tree/release))
- `nightly` - Experimental/development builds ([`develop` branch](https://github.com/ripple/rippled/tree/develop))
**Warning:** Unstable and nightly builds may be broken at any time. Do not use these builds for production servers.
5. Fetch the Ripple repository.
sudo apt -y update
6. Install the `rippled` software package:
sudo apt -y install rippled
7. Check the status of the `rippled` service:
systemctl status rippled.service
The `rippled` service should start automatically. If not, you can start it manually:
sudo systemctl start rippled.service
8. Optional: allow `rippled` to bind to privileged ports.
This allows you to serve incoming API requests on port 80 or 443. (If you want to do so, you must also update the config file's port settings.)
sudo setcap 'cap_net_bind_service=+ep' /opt/ripple/bin/rippled
## With the YUM package manager
1. Install the Ripple RPM repository:
Choose the appropriate RPM repository for the stability of releases you want:
- `stable` for the latest production release (`master` branch)
- `unstable` for pre-release builds (`release` branch)
- `nightly` for experimental/development builds (`develop` branch)
*Stable*
cat << REPOFILE | sudo tee /etc/yum.repos.d/ripple.repo
[ripple-stable]
name=XRP Ledger Packages
enabled=1
gpgcheck=0
repo_gpgcheck=1
baseurl=https://repos.ripple.com/repos/rippled-rpm/stable/
gpgkey=https://repos.ripple.com/repos/rippled-rpm/stable/repodata/repomd.xml.key
REPOFILE
*Unstable*
cat << REPOFILE | sudo tee /etc/yum.repos.d/ripple.repo
[ripple-unstable]
name=XRP Ledger Packages
enabled=1
gpgcheck=0
repo_gpgcheck=1
baseurl=https://repos.ripple.com/repos/rippled-rpm/unstable/
gpgkey=https://repos.ripple.com/repos/rippled-rpm/unstable/repodata/repomd.xml.key
REPOFILE
*Nightly*
cat << REPOFILE | sudo tee /etc/yum.repos.d/ripple.repo
[ripple-nightly]
name=XRP Ledger Packages
enabled=1
gpgcheck=0
repo_gpgcheck=1
baseurl=https://repos.ripple.com/repos/rippled-rpm/nightly/
gpgkey=https://repos.ripple.com/repos/rippled-rpm/nightly/repodata/repomd.xml.key
REPOFILE
2. Fetch the latest repo updates:
sudo yum -y update
3. Install the new `rippled` package:
sudo yum install -y rippled
4. Configure the `rippled` service to start on boot:
sudo systemctl enable rippled.service
5. Start the `rippled` service:
sudo systemctl start rippled.service
After the systemd service file is installed, reload systemd with `systemctl daemon-reload`. xahaud can then be enabled and started with `systemctl enable --now xahaud`.
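For example (a minimal sketch, assuming the unit file shown above was saved as `/etc/systemd/system/xahaud.service`):
```
sudo systemctl daemon-reload
sudo systemctl enable --now xahaud
systemctl status xahaud   # confirm the daemon is running
```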

1
external/README.md vendored

@@ -4,6 +4,7 @@ The Conan recipes include patches we have not yet pushed upstream.
| Folder | Upstream | Description |
|:----------------|:---------------------------------------------|:------------|
| `antithesis-sdk`| [Project](https://github.com/antithesishq/antithesis-sdk-cpp/) | [Antithesis](https://antithesis.com/docs/using_antithesis/sdk/cpp/overview.html) SDK for C++ |
| `ed25519-donna` | [Project](https://github.com/floodyberry/ed25519-donna) | [Ed25519](http://ed25519.cr.yp.to/) digital signatures |
| `rocksdb` | [Recipe](https://github.com/conan-io/conan-center-index/tree/master/recipes/rocksdb) | Fast key/value database. (Supports rotational disks better than NuDB.) |
| `secp256k1` | [Project](https://github.com/bitcoin-core/secp256k1) | ECDSA digital signatures using the **secp256k1** curve |

3
external/antithesis-sdk/.clang-format vendored Normal file

@@ -0,0 +1,3 @@
---
DisableFormat: true
SortIncludes: false

17
external/antithesis-sdk/CMakeLists.txt vendored Normal file

@@ -0,0 +1,17 @@
cmake_minimum_required(VERSION 3.25)
# Note, version set explicitly by rippled project
project(antithesis-sdk-cpp VERSION 0.4.4 LANGUAGES CXX)
add_library(antithesis-sdk-cpp INTERFACE antithesis_sdk.h)
# Note, both sections below created by rippled project
target_include_directories(antithesis-sdk-cpp INTERFACE
$<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}>
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}>
)
install(
FILES antithesis_sdk.h
DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}"
)

BIN
external/antithesis-sdk/LICENSE vendored Normal file

Binary file not shown.

8
external/antithesis-sdk/README.md vendored Normal file

@@ -0,0 +1,8 @@
# Antithesis C++ SDK
This library provides methods for C++ programs to configure the [Antithesis](https://antithesis.com) platform. It contains three kinds of functionality:
* Assertion macros that allow you to define test properties about your software or workload.
* Randomness functions for requesting both structured and unstructured randomness from the Antithesis platform.
* Lifecycle functions that inform the Antithesis environment that particular test phases or milestones have been reached.
For general usage guidance see the [Antithesis C++ SDK Documentation](https://antithesis.com/docs/using_antithesis/sdk/cpp/overview/)


@@ -0,0 +1,113 @@
#pragma once
/*
This header file enables code coverage instrumentation. It is distributed with the Antithesis C++ SDK.
This header file can be used in both C and C++ programs. (The rest of the SDK works only for C++ programs.)
You should include it in a single .cpp or .c file.
The instructions (such as required compiler flags) and usage guidance are found at https://antithesis.com/docs/using_antithesis/sdk/cpp/overview/.
*/
#include <unistd.h>
#include <string.h>
#include <dlfcn.h>
#include <stdint.h>
#include <stdio.h>
#ifndef __cplusplus
#include <stdbool.h>
#include <stddef.h>
#endif
// If the libvoidstar(determ) library is present,
// pass thru trace_pc_guard related callbacks to it
typedef void (*trace_pc_guard_init_fn)(uint32_t *start, uint32_t *stop);
typedef void (*trace_pc_guard_fn)(uint32_t *guard, uint64_t edge);
static trace_pc_guard_init_fn trace_pc_guard_init = NULL;
static trace_pc_guard_fn trace_pc_guard = NULL;
static bool did_check_libvoidstar = false;
static bool has_libvoidstar = false;
static __attribute__((no_sanitize("coverage"))) void debug_message_out(const char *msg) {
(void)printf("%s\n", msg);
return;
}
extern
#ifdef __cplusplus
"C"
#endif
__attribute__((no_sanitize("coverage"))) void antithesis_load_libvoidstar() {
#ifdef __cplusplus
constexpr
#endif
const char* LIB_PATH = "/usr/lib/libvoidstar.so";
if (did_check_libvoidstar) {
return;
}
debug_message_out("TRYING TO LOAD libvoidstar");
did_check_libvoidstar = true;
void* shared_lib = dlopen(LIB_PATH, RTLD_NOW);
if (!shared_lib) {
debug_message_out("Can not load the Antithesis native library");
return;
}
void* trace_pc_guard_init_sym = dlsym(shared_lib, "__sanitizer_cov_trace_pc_guard_init");
if (!trace_pc_guard_init_sym) {
debug_message_out("Can not forward calls to libvoidstar for __sanitizer_cov_trace_pc_guard_init");
return;
}
void* trace_pc_guard_sym = dlsym(shared_lib, "__sanitizer_cov_trace_pc_guard_internal");
if (!trace_pc_guard_sym) {
debug_message_out("Can not forward calls to libvoidstar for __sanitizer_cov_trace_pc_guard");
return;
}
trace_pc_guard_init = (trace_pc_guard_init_fn)(trace_pc_guard_init_sym);
trace_pc_guard = (trace_pc_guard_fn)(trace_pc_guard_sym);
has_libvoidstar = true;
debug_message_out("LOADED libvoidstar");
}
// The following symbols are indeed reserved identifiers, since we're implementing functions defined
// in the compiler runtime. Not clear how to get Clang on board with that besides narrowly suppressing
// the warning in this case. The sample code on the CoverageSanitizer documentation page fails this
// warning!
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wreserved-identifier"
extern
#ifdef __cplusplus
"C"
#endif
void __sanitizer_cov_trace_pc_guard_init(uint32_t *start, uint32_t *stop) {
debug_message_out("SDK forwarding to libvoidstar for __sanitizer_cov_trace_pc_guard_init()");
if (!did_check_libvoidstar) {
antithesis_load_libvoidstar();
}
if (has_libvoidstar) {
trace_pc_guard_init(start, stop);
}
return;
}
extern
#ifdef __cplusplus
"C"
#endif
void __sanitizer_cov_trace_pc_guard( uint32_t *guard ) {
if (has_libvoidstar) {
uint64_t edge = (uint64_t)(__builtin_return_address(0));
trace_pc_guard(guard, edge);
} else {
if (guard) {
*guard = 0;
}
}
return;
}
#pragma clang diagnostic pop

1105
external/antithesis-sdk/antithesis_sdk.h vendored Normal file

File diff suppressed because it is too large


@@ -89,13 +89,13 @@ class RocksDBConan(ConanFile):
if self.options.with_snappy:
self.requires("snappy/1.1.10")
if self.options.with_lz4:
self.requires("lz4/1.9.4")
self.requires("lz4/1.10.0")
if self.options.with_zlib:
self.requires("zlib/[>=1.2.11 <2]")
if self.options.with_zstd:
self.requires("zstd/1.5.5")
self.requires("zstd/1.5.6")
if self.options.get_safe("with_tbb"):
self.requires("onetbb/2021.10.0")
self.requires("onetbb/2021.12.0")
if self.options.with_jemalloc:
self.requires("jemalloc/5.3.0")


@@ -10,8 +10,8 @@ env:
MAKEFLAGS: -j4
BUILD: check
### secp256k1 config
ECMULTWINDOW: auto
ECMULTGENPRECISION: auto
ECMULTWINDOW: 15
ECMULTGENKB: 22
ASM: no
WIDEMUL: auto
WITH_VALGRIND: yes
@@ -20,20 +20,18 @@ env:
EXPERIMENTAL: no
ECDH: no
RECOVERY: no
EXTRAKEYS: no
SCHNORRSIG: no
MUSIG: no
ELLSWIFT: no
### test options
SECP256K1_TEST_ITERS:
SECP256K1_TEST_ITERS: 64
BENCH: yes
SECP256K1_BENCH_ITERS: 2
CTIMETESTS: yes
# Compile and run the tests
EXAMPLES: yes
# https://cirrus-ci.org/pricing/#compute-credits
credits_snippet: &CREDITS
# Don't use any credits for now.
use_compute_credits: false
cat_logs_snippet: &CAT_LOGS
always:
cat_tests_log_script:
@@ -53,357 +51,51 @@ cat_logs_snippet: &CAT_LOGS
cat_ci_env_script:
- env
merge_base_script_snippet: &MERGE_BASE
merge_base_script:
- if [ "$CIRRUS_PR" = "" ]; then exit 0; fi
- git fetch --depth=1 $CIRRUS_REPO_CLONE_URL "pull/${CIRRUS_PR}/merge"
- git checkout FETCH_HEAD # Use merged changes to detect silent merge conflicts
linux_container_snippet: &LINUX_CONTAINER
container:
dockerfile: ci/linux-debian.Dockerfile
# Reduce number of CPUs to be able to do more builds in parallel.
cpu: 1
# Gives us more CPUs for free if they're available.
greedy: true
# More than enough for our scripts.
memory: 1G
task:
name: "x86_64: Linux (Debian stable)"
<< : *LINUX_CONTAINER
matrix: &ENV_MATRIX
- env: {WIDEMUL: int64, RECOVERY: yes}
- env: {WIDEMUL: int64, ECDH: yes, SCHNORRSIG: yes}
- env: {WIDEMUL: int128}
- env: {WIDEMUL: int128_struct}
- env: {WIDEMUL: int128, RECOVERY: yes, SCHNORRSIG: yes}
- env: {WIDEMUL: int128, ECDH: yes, SCHNORRSIG: yes}
- env: {WIDEMUL: int128, ASM: x86_64}
- env: { RECOVERY: yes, SCHNORRSIG: yes}
- env: {CTIMETESTS: no, RECOVERY: yes, ECDH: yes, SCHNORRSIG: yes, CPPFLAGS: -DVERIFY}
- env: {BUILD: distcheck, WITH_VALGRIND: no, CTIMETESTS: no, BENCH: no}
- env: {CPPFLAGS: -DDETERMINISTIC}
- env: {CFLAGS: -O0, CTIMETESTS: no}
- env: { ECMULTGENPRECISION: 2, ECMULTWINDOW: 2 }
- env: { ECMULTGENPRECISION: 8, ECMULTWINDOW: 4 }
matrix:
- env:
CC: gcc
- env:
CC: clang
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
task:
name: "i686: Linux (Debian stable)"
<< : *LINUX_CONTAINER
env:
HOST: i686-linux-gnu
ECDH: yes
RECOVERY: yes
SCHNORRSIG: yes
matrix:
- env:
CC: i686-linux-gnu-gcc
- env:
CC: clang --target=i686-pc-linux-gnu -isystem /usr/i686-linux-gnu/include
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
task:
name: "arm64: macOS Ventura"
macos_instance:
image: ghcr.io/cirruslabs/macos-ventura-base:latest
env:
HOMEBREW_NO_AUTO_UPDATE: 1
HOMEBREW_NO_INSTALL_CLEANUP: 1
# Cirrus gives us a fixed number of 4 virtual CPUs. Not that we even have that many jobs at the moment...
MAKEFLAGS: -j5
matrix:
<< : *ENV_MATRIX
env:
ASM: no
WITH_VALGRIND: no
CTIMETESTS: no
matrix:
- env:
CC: gcc
- env:
CC: clang
brew_script:
- brew install automake libtool gcc
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
<< : *CREDITS
task:
name: "s390x (big-endian): Linux (Debian stable, QEMU)"
<< : *LINUX_CONTAINER
env:
WRAPPER_CMD: qemu-s390x
SECP256K1_TEST_ITERS: 16
HOST: s390x-linux-gnu
WITH_VALGRIND: no
ECDH: yes
RECOVERY: yes
SCHNORRSIG: yes
CTIMETESTS: no
<< : *MERGE_BASE
test_script:
# https://sourceware.org/bugzilla/show_bug.cgi?id=27008
- rm /etc/ld.so.cache
- ./ci/cirrus.sh
<< : *CAT_LOGS
task:
name: "ARM32: Linux (Debian stable, QEMU)"
<< : *LINUX_CONTAINER
env:
WRAPPER_CMD: qemu-arm
SECP256K1_TEST_ITERS: 16
HOST: arm-linux-gnueabihf
WITH_VALGRIND: no
ECDH: yes
RECOVERY: yes
SCHNORRSIG: yes
CTIMETESTS: no
matrix:
- env: {}
- env: {EXPERIMENTAL: yes, ASM: arm32}
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
task:
name: "ARM64: Linux (Debian stable, QEMU)"
<< : *LINUX_CONTAINER
env:
WRAPPER_CMD: qemu-aarch64
SECP256K1_TEST_ITERS: 16
HOST: aarch64-linux-gnu
WITH_VALGRIND: no
ECDH: yes
RECOVERY: yes
SCHNORRSIG: yes
CTIMETESTS: no
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
task:
name: "ppc64le: Linux (Debian stable, QEMU)"
<< : *LINUX_CONTAINER
env:
WRAPPER_CMD: qemu-ppc64le
SECP256K1_TEST_ITERS: 16
HOST: powerpc64le-linux-gnu
WITH_VALGRIND: no
ECDH: yes
RECOVERY: yes
SCHNORRSIG: yes
CTIMETESTS: no
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
task:
<< : *LINUX_CONTAINER
env:
WRAPPER_CMD: wine
WITH_VALGRIND: no
ECDH: yes
RECOVERY: yes
SCHNORRSIG: yes
CTIMETESTS: no
matrix:
- name: "x86_64 (mingw32-w64): Windows (Debian stable, Wine)"
env:
HOST: x86_64-w64-mingw32
- name: "i686 (mingw32-w64): Windows (Debian stable, Wine)"
env:
HOST: i686-w64-mingw32
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
task:
<< : *LINUX_CONTAINER
env:
WRAPPER_CMD: wine
WERROR_CFLAGS: -WX
WITH_VALGRIND: no
ECDH: yes
RECOVERY: yes
EXPERIMENTAL: yes
SCHNORRSIG: yes
CTIMETESTS: no
# Use a MinGW-w64 host to tell ./configure we're building for Windows.
# This will detect some MinGW-w64 tools but then make will need only
# the MSVC tools CC, AR and NM as specified below.
HOST: x86_64-w64-mingw32
CC: /opt/msvc/bin/x64/cl
AR: /opt/msvc/bin/x64/lib
NM: /opt/msvc/bin/x64/dumpbin -symbols -headers
# Set non-essential options that affect the CLI messages here.
# (They depend on the user's taste, so we don't want to set them automatically in configure.ac.)
CFLAGS: -nologo -diagnostics:caret
LDFLAGS: -Xlinker -Xlinker -Xlinker -nologo
matrix:
- name: "x86_64 (MSVC): Windows (Debian stable, Wine)"
- name: "x86_64 (MSVC): Windows (Debian stable, Wine, int128_struct)"
env:
WIDEMUL: int128_struct
- name: "x86_64 (MSVC): Windows (Debian stable, Wine, int128_struct with __(u)mulh)"
env:
WIDEMUL: int128_struct
CPPFLAGS: -DSECP256K1_MSVC_MULH_TEST_OVERRIDE
- name: "i686 (MSVC): Windows (Debian stable, Wine)"
env:
HOST: i686-w64-mingw32
CC: /opt/msvc/bin/x86/cl
AR: /opt/msvc/bin/x86/lib
NM: /opt/msvc/bin/x86/dumpbin -symbols -headers
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
# Sanitizers
task:
<< : *LINUX_CONTAINER
env:
ECDH: yes
RECOVERY: yes
SCHNORRSIG: yes
CTIMETESTS: no
matrix:
- name: "Valgrind (memcheck)"
container:
cpu: 2
env:
# The `--error-exitcode` is required to make the test fail if valgrind found errors, otherwise it'll return 0 (https://www.valgrind.org/docs/manual/manual-core.html)
WRAPPER_CMD: "valgrind --error-exitcode=42"
SECP256K1_TEST_ITERS: 2
- name: "UBSan, ASan, LSan"
container:
memory: 2G
env:
CFLAGS: "-fsanitize=undefined,address -g"
UBSAN_OPTIONS: "print_stacktrace=1:halt_on_error=1"
ASAN_OPTIONS: "strict_string_checks=1:detect_stack_use_after_return=1:detect_leaks=1"
LSAN_OPTIONS: "use_unaligned=1"
SECP256K1_TEST_ITERS: 32
# Try to cover many configurations with just a tiny matrix.
matrix:
- env:
ASM: auto
- env:
ASM: no
ECMULTGENPRECISION: 2
ECMULTWINDOW: 2
matrix:
- env:
CC: clang
- env:
HOST: i686-linux-gnu
CC: i686-linux-gnu-gcc
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
# Memory sanitizers
task:
<< : *LINUX_CONTAINER
name: "MSan"
env:
ECDH: yes
RECOVERY: yes
SCHNORRSIG: yes
CTIMETESTS: yes
CC: clang
SECP256K1_TEST_ITERS: 32
ASM: no
WITH_VALGRIND: no
container:
memory: 2G
matrix:
- env:
CFLAGS: "-fsanitize=memory -g"
- env:
ECMULTGENPRECISION: 2
ECMULTWINDOW: 2
CFLAGS: "-fsanitize=memory -g -O3"
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
task:
name: "C++ -fpermissive (entire project)"
<< : *LINUX_CONTAINER
env:
CC: g++
CFLAGS: -fpermissive -g
CPPFLAGS: -DSECP256K1_CPLUSPLUS_TEST_OVERRIDE
WERROR_CFLAGS:
ECDH: yes
RECOVERY: yes
SCHNORRSIG: yes
<< : *MERGE_BASE
test_script:
- ./ci/cirrus.sh
<< : *CAT_LOGS
task:
name: "C++ (public headers)"
<< : *LINUX_CONTAINER
test_script:
- g++ -Werror include/*.h
- clang -Werror -x c++-header include/*.h
- /opt/msvc/bin/x64/cl.exe -c -WX -TP include/*.h
task:
name: "sage prover"
<< : *LINUX_CONTAINER
test_script:
- cd sage
- sage prove_group_implementations.sage
task:
name: "x86_64: Windows (VS 2022)"
windows_container:
image: cirrusci/windowsservercore:visualstudio2022
cpu: 4
memory: 3840MB
env:
PATH: '%CIRRUS_WORKING_DIR%\build\src\RelWithDebInfo;%PATH%'
x64_NATIVE_TOOLS: '"C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Auxiliary\Build\vcvars64.bat"'
# Ignore MSBuild warning MSB8029.
# See: https://learn.microsoft.com/en-us/visualstudio/msbuild/errors/msb8029?view=vs-2022
IgnoreWarnIntDirInTempDetected: 'true'
merge_script:
- PowerShell -NoLogo -Command if ($env:CIRRUS_PR -ne $null) { git fetch $env:CIRRUS_REPO_CLONE_URL pull/$env:CIRRUS_PR/merge; git reset --hard FETCH_HEAD; }
configure_script:
- '%x64_NATIVE_TOOLS%'
- cmake -E env CFLAGS="/WX" cmake -G "Visual Studio 17 2022" -A x64 -S . -B build -DSECP256K1_ENABLE_MODULE_RECOVERY=ON -DSECP256K1_BUILD_EXAMPLES=ON
linux_arm64_container_snippet: &LINUX_ARM64_CONTAINER
env_script:
- env | tee /tmp/env
build_script:
- '%x64_NATIVE_TOOLS%'
- cmake --build build --config RelWithDebInfo -- -property:UseMultiToolTask=true;CL_MPcount=5
check_script:
- '%x64_NATIVE_TOOLS%'
- ctest -C RelWithDebInfo --test-dir build -j 5
- build\src\RelWithDebInfo\bench_ecmult.exe
- build\src\RelWithDebInfo\bench_internal.exe
- build\src\RelWithDebInfo\bench.exe
- DOCKER_BUILDKIT=1 docker build --file "ci/linux-debian.Dockerfile" --tag="ci_secp256k1_arm"
- docker image prune --force # Cleanup stale layers
test_script:
- docker run --rm --mount "type=bind,src=./,dst=/ci_secp256k1" --env-file /tmp/env --replace --name "ci_secp256k1_arm" "ci_secp256k1_arm" bash -c "cd /ci_secp256k1/ && ./ci/ci.sh"
task:
name: "ARM64: Linux (Debian stable)"
persistent_worker:
labels:
type: arm64
env:
ECDH: yes
RECOVERY: yes
EXTRAKEYS: yes
SCHNORRSIG: yes
MUSIG: yes
ELLSWIFT: yes
matrix:
# Currently only gcc-snapshot, the other compilers are tested on GHA with QEMU
- env: { CC: 'gcc-snapshot' }
<< : *LINUX_ARM64_CONTAINER
<< : *CAT_LOGS
task:
name: "ARM64: Linux (Debian stable), Valgrind"
persistent_worker:
labels:
type: arm64
env:
ECDH: yes
RECOVERY: yes
EXTRAKEYS: yes
SCHNORRSIG: yes
MUSIG: yes
ELLSWIFT: yes
WRAPPER_CMD: 'valgrind --error-exitcode=42'
SECP256K1_TEST_ITERS: 2
matrix:
- env: { CC: 'gcc' }
- env: { CC: 'clang' }
- env: { CC: 'gcc-snapshot' }
- env: { CC: 'clang-snapshot' }
<< : *LINUX_ARM64_CONTAINER
<< : *CAT_LOGS


@@ -10,6 +10,8 @@ ctime_tests
ecdh_example
ecdsa_example
schnorr_example
ellswift_example
musig_example
*.exe
*.so
*.a


@@ -5,6 +5,83 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.6.0] - 2024-11-04
#### Added
- New module `musig` implements the MuSig2 multisignature scheme according to the [BIP 327 specification](https://github.com/bitcoin/bips/blob/master/bip-0327.mediawiki). See:
- Header file `include/secp256k1_musig.h` which defines the new API.
- Document `doc/musig.md` for further notes on API usage.
- Usage example `examples/musig.c`.
- New CMake variable `SECP256K1_APPEND_LDFLAGS` for appending linker flags to the build command.
#### Changed
- API functions now use a significantly more robust method to clear secrets from the stack before returning. However, secret clearing remains a best-effort security measure and cannot guarantee complete removal.
- Any type `secp256k1_foo` can now be forward-declared using `typedef struct secp256k1_foo secp256k1_foo;` (or also `struct secp256k1_foo;` in C++).
- Organized CMake build artifacts into dedicated directories (`bin/` for executables, `lib/` for libraries) to improve build output structure and Windows shared library compatibility.
#### Removed
- Removed the `secp256k1_scratch_space` struct and its associated functions `secp256k1_scratch_space_create` and `secp256k1_scratch_space_destroy` because the scratch space was unused in the API.
#### ABI Compatibility
The symbols `secp256k1_scratch_space_create` and `secp256k1_scratch_space_destroy` were removed.
Otherwise, the library maintains backward compatibility with versions 0.3.x through 0.5.x.
## [0.5.1] - 2024-08-01
#### Added
- Added usage example for an ElligatorSwift key exchange.
#### Changed
- The default size of the precomputed table for signing was changed from 22 KiB to 86 KiB. The size can be changed with the configure option `--ecmult-gen-kb` (`SECP256K1_ECMULT_GEN_KB` for CMake).
- "auto" is no longer an accepted value for the `--with-ecmult-window` and `--with-ecmult-gen-kb` configure options (this also applies to `SECP256K1_ECMULT_WINDOW_SIZE` and `SECP256K1_ECMULT_GEN_KB` in CMake). To achieve the same configuration as previously provided by the "auto" value, omit setting the configure option explicitly.
#### Fixed
- Fixed compilation when the extrakeys module is disabled.
#### ABI Compatibility
The ABI is backward compatible with versions 0.5.0, 0.4.x and 0.3.x.
## [0.5.0] - 2024-05-06
#### Added
- New function `secp256k1_ec_pubkey_sort` that sorts public keys using lexicographic (of compressed serialization) order.
#### Changed
- The implementation of the point multiplication algorithm used for signing and public key generation was changed, resulting in improved performance for those operations.
- The related configure option `--ecmult-gen-precision` was replaced with `--ecmult-gen-kb` (`SECP256K1_ECMULT_GEN_KB` for CMake).
- This changes the supported precomputed table sizes for these operations. The new supported sizes are 2 KiB, 22 KiB, or 86 KiB (while the old supported sizes were 32 KiB, 64 KiB, or 512 KiB).
#### ABI Compatibility
The ABI is backward compatible with versions 0.4.x and 0.3.x.
## [0.4.1] - 2023-12-21
#### Changed
- The point multiplication algorithm used for ECDH operations (module `ecdh`) was replaced with a slightly faster one.
- Optional handwritten x86_64 assembly for field operations was removed because modern C compilers are able to output more efficient assembly. This change results in a significant speedup of some library functions when handwritten x86_64 assembly is enabled (`--with-asm=x86_64` in GNU Autotools, `-DSECP256K1_ASM=x86_64` in CMake), which is the default on x86_64. Benchmarks with GCC 10.5.0 show a 10% speedup for `secp256k1_ecdsa_verify` and `secp256k1_schnorrsig_verify`.
#### ABI Compatibility
The ABI is backward compatible with versions 0.4.0 and 0.3.x.
## [0.4.0] - 2023-09-04
#### Added
- New module `ellswift` implements ElligatorSwift encoding for public keys and x-only Diffie-Hellman key exchange for them.
ElligatorSwift permits representing secp256k1 public keys as 64-byte arrays which cannot be distinguished from uniformly random. See:
- Header file `include/secp256k1_ellswift.h` which defines the new API.
- Document `doc/ellswift.md` which explains the mathematical background of the scheme.
- The [paper](https://eprint.iacr.org/2022/759) on which the scheme is based.
- We now test the library with unreleased development snapshots of GCC and Clang. This gives us an early chance to catch miscompilations and constant-time issues introduced by the compiler (such as those that led to the previous two releases).
#### Fixed
- Fixed symbol visibility in Windows DLL builds, where three internal library symbols were wrongly exported.
#### Changed
- When consuming libsecp256k1 as a static library on Windows, the user must now define the `SECP256K1_STATIC` macro before including `secp256k1.h`.
#### ABI Compatibility
This release is backward compatible with the ABI of 0.3.0, 0.3.1, and 0.3.2. Symbol visibility is now believed to be handled properly on supported platforms and is now considered to be part of the ABI. Please report any improperly exported symbols as a bug.
## [0.3.2] - 2023-05-13
We strongly recommend updating to 0.3.2 if you use or plan to use GCC >=13 to compile libsecp256k1. When in doubt, check the GCC version using `gcc -v`.
@@ -85,7 +162,11 @@ This version was in fact never released.
The number was given by the build system since the introduction of autotools in Jan 2014 (ea0fe5a5bf0c04f9cc955b2966b614f5f378c6f6).
Therefore, this version number does not uniquely identify a set of source files.
[unreleased]: https://github.com/bitcoin-core/secp256k1/compare/v0.3.2...HEAD
[0.6.0]: https://github.com/bitcoin-core/secp256k1/compare/v0.5.1...v0.6.0
[0.5.1]: https://github.com/bitcoin-core/secp256k1/compare/v0.5.0...v0.5.1
[0.5.0]: https://github.com/bitcoin-core/secp256k1/compare/v0.4.1...v0.5.0
[0.4.1]: https://github.com/bitcoin-core/secp256k1/compare/v0.4.0...v0.4.1
[0.4.0]: https://github.com/bitcoin-core/secp256k1/compare/v0.3.2...v0.4.0
[0.3.2]: https://github.com/bitcoin-core/secp256k1/compare/v0.3.1...v0.3.2
[0.3.1]: https://github.com/bitcoin-core/secp256k1/compare/v0.3.0...v0.3.1
[0.3.0]: https://github.com/bitcoin-core/secp256k1/compare/v0.2.0...v0.3.0


@@ -1,32 +1,29 @@
cmake_minimum_required(VERSION 3.13)
if(CMAKE_VERSION VERSION_GREATER_EQUAL 3.15)
# MSVC runtime library flags are selected by the CMAKE_MSVC_RUNTIME_LIBRARY abstraction.
cmake_policy(SET CMP0091 NEW)
# MSVC warning flags are not in CMAKE_<LANG>_FLAGS by default.
cmake_policy(SET CMP0092 NEW)
endif()
cmake_minimum_required(VERSION 3.16)
#=============================
# Project / Package metadata
#=============================
project(libsecp256k1
# The package (a.k.a. release) version is based on semantic versioning 2.0.0 of
# the API. All changes in experimental modules are treated as
# backwards-compatible and therefore at most increase the minor version.
VERSION 0.3.2
VERSION 0.6.0
DESCRIPTION "Optimized C library for ECDSA signatures and secret/public key operations on curve secp256k1."
HOMEPAGE_URL "https://github.com/bitcoin-core/secp256k1"
LANGUAGES C
)
enable_testing()
list(APPEND CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake)
if(CMAKE_VERSION VERSION_LESS 3.21)
get_directory_property(parent_directory PARENT_DIRECTORY)
if(parent_directory)
set(PROJECT_IS_TOP_LEVEL OFF CACHE INTERNAL "Emulates CMake 3.21+ behavior.")
set(${PROJECT_NAME}_IS_TOP_LEVEL OFF CACHE INTERNAL "Emulates CMake 3.21+ behavior.")
# Emulates CMake 3.21+ behavior.
if(CMAKE_SOURCE_DIR STREQUAL CMAKE_CURRENT_SOURCE_DIR)
set(PROJECT_IS_TOP_LEVEL ON)
set(${PROJECT_NAME}_IS_TOP_LEVEL ON)
else()
set(PROJECT_IS_TOP_LEVEL ON CACHE INTERNAL "Emulates CMake 3.21+ behavior.")
set(${PROJECT_NAME}_IS_TOP_LEVEL ON CACHE INTERNAL "Emulates CMake 3.21+ behavior.")
set(PROJECT_IS_TOP_LEVEL OFF)
set(${PROJECT_NAME}_IS_TOP_LEVEL OFF)
endif()
unset(parent_directory)
endif()
# The library version is based on libtool versioning of the ABI. The set of
@@ -34,15 +31,19 @@ endif()
# https://www.gnu.org/software/libtool/manual/html_node/Updating-version-info.html
# All changes in experimental modules are treated as if they don't affect the
# interface and therefore only increase the revision.
set(${PROJECT_NAME}_LIB_VERSION_CURRENT 2)
set(${PROJECT_NAME}_LIB_VERSION_REVISION 2)
set(${PROJECT_NAME}_LIB_VERSION_CURRENT 5)
set(${PROJECT_NAME}_LIB_VERSION_REVISION 0)
set(${PROJECT_NAME}_LIB_VERSION_AGE 0)
#=============================
# Language setup
#=============================
set(CMAKE_C_STANDARD 90)
set(CMAKE_C_EXTENSIONS OFF)
list(APPEND CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake)
#=============================
# Configurable options
#=============================
option(BUILD_SHARED_LIBS "Build shared libraries." ON)
option(SECP256K1_DISABLE_SHARED "Disable shared library. Overrides BUILD_SHARED_LIBS." OFF)
if(SECP256K1_DISABLE_SHARED)
@@ -51,24 +52,49 @@ endif()
option(SECP256K1_INSTALL "Enable installation." ${PROJECT_IS_TOP_LEVEL})
## Modules
# We declare all options before processing them, to make sure we can express
# dependencies while processing.
option(SECP256K1_ENABLE_MODULE_ECDH "Enable ECDH module." ON)
if(SECP256K1_ENABLE_MODULE_ECDH)
add_compile_definitions(ENABLE_MODULE_ECDH=1)
option(SECP256K1_ENABLE_MODULE_RECOVERY "Enable ECDSA pubkey recovery module." OFF)
option(SECP256K1_ENABLE_MODULE_EXTRAKEYS "Enable extrakeys module." ON)
option(SECP256K1_ENABLE_MODULE_SCHNORRSIG "Enable schnorrsig module." ON)
option(SECP256K1_ENABLE_MODULE_MUSIG "Enable musig module." ON)
option(SECP256K1_ENABLE_MODULE_ELLSWIFT "Enable ElligatorSwift module." ON)
# Processing must be done in a topological sorting of the dependency graph
# (dependent module first).
if(SECP256K1_ENABLE_MODULE_ELLSWIFT)
add_compile_definitions(ENABLE_MODULE_ELLSWIFT=1)
endif()
if(SECP256K1_ENABLE_MODULE_MUSIG)
if(DEFINED SECP256K1_ENABLE_MODULE_SCHNORRSIG AND NOT SECP256K1_ENABLE_MODULE_SCHNORRSIG)
message(FATAL_ERROR "Module dependency error: You have disabled the schnorrsig module explicitly, but it is required by the musig module.")
endif()
set(SECP256K1_ENABLE_MODULE_SCHNORRSIG ON)
add_compile_definitions(ENABLE_MODULE_MUSIG=1)
endif()
if(SECP256K1_ENABLE_MODULE_SCHNORRSIG)
if(DEFINED SECP256K1_ENABLE_MODULE_EXTRAKEYS AND NOT SECP256K1_ENABLE_MODULE_EXTRAKEYS)
message(FATAL_ERROR "Module dependency error: You have disabled the extrakeys module explicitly, but it is required by the schnorrsig module.")
endif()
set(SECP256K1_ENABLE_MODULE_EXTRAKEYS ON)
add_compile_definitions(ENABLE_MODULE_SCHNORRSIG=1)
endif()
if(SECP256K1_ENABLE_MODULE_EXTRAKEYS)
add_compile_definitions(ENABLE_MODULE_EXTRAKEYS=1)
endif()
option(SECP256K1_ENABLE_MODULE_RECOVERY "Enable ECDSA pubkey recovery module." OFF)
if(SECP256K1_ENABLE_MODULE_RECOVERY)
add_compile_definitions(ENABLE_MODULE_RECOVERY=1)
endif()
option(SECP256K1_ENABLE_MODULE_EXTRAKEYS "Enable extrakeys module." ON)
option(SECP256K1_ENABLE_MODULE_SCHNORRSIG "Enable schnorrsig module." ON)
if(SECP256K1_ENABLE_MODULE_SCHNORRSIG)
set(SECP256K1_ENABLE_MODULE_EXTRAKEYS ON)
add_compile_definitions(ENABLE_MODULE_SCHNORRSIG=1)
endif()
if(SECP256K1_ENABLE_MODULE_EXTRAKEYS)
add_compile_definitions(ENABLE_MODULE_EXTRAKEYS=1)
if(SECP256K1_ENABLE_MODULE_ECDH)
add_compile_definitions(ENABLE_MODULE_ECDH=1)
endif()
option(SECP256K1_USE_EXTERNAL_DEFAULT_CALLBACKS "Enable external default callback functions." OFF)
@@ -76,22 +102,25 @@ if(SECP256K1_USE_EXTERNAL_DEFAULT_CALLBACKS)
add_compile_definitions(USE_EXTERNAL_DEFAULT_CALLBACKS=1)
endif()
set(SECP256K1_ECMULT_WINDOW_SIZE "AUTO" CACHE STRING "Window size for ecmult precomputation for verification, specified as integer in range [2..24]. \"AUTO\" is a reasonable setting for desktop machines (currently 15). [default=AUTO]")
set_property(CACHE SECP256K1_ECMULT_WINDOW_SIZE PROPERTY STRINGS "AUTO" 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24)
set(SECP256K1_ECMULT_WINDOW_SIZE 15 CACHE STRING "Window size for ecmult precomputation for verification, specified as integer in range [2..24]. The default value is a reasonable setting for desktop machines (currently 15). [default=15]")
set_property(CACHE SECP256K1_ECMULT_WINDOW_SIZE PROPERTY STRINGS 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24)
include(CheckStringOptionValue)
check_string_option_value(SECP256K1_ECMULT_WINDOW_SIZE)
if(SECP256K1_ECMULT_WINDOW_SIZE STREQUAL "AUTO")
set(SECP256K1_ECMULT_WINDOW_SIZE 15)
endif()
add_compile_definitions(ECMULT_WINDOW_SIZE=${SECP256K1_ECMULT_WINDOW_SIZE})
set(SECP256K1_ECMULT_GEN_PREC_BITS "AUTO" CACHE STRING "Precision bits to tune the precomputed table size for signing, specified as integer 2, 4 or 8. \"AUTO\" is a reasonable setting for desktop machines (currently 4). [default=AUTO]")
set_property(CACHE SECP256K1_ECMULT_GEN_PREC_BITS PROPERTY STRINGS "AUTO" 2 4 8)
check_string_option_value(SECP256K1_ECMULT_GEN_PREC_BITS)
if(SECP256K1_ECMULT_GEN_PREC_BITS STREQUAL "AUTO")
set(SECP256K1_ECMULT_GEN_PREC_BITS 4)
set(SECP256K1_ECMULT_GEN_KB 86 CACHE STRING "The size of the precomputed table for signing in multiples of 1024 bytes (on typical platforms). Larger values result in possibly better signing or key generation performance at the cost of a larger table. Valid choices are 2, 22, 86. The default value is a reasonable setting for desktop machines (currently 86). [default=86]")
set_property(CACHE SECP256K1_ECMULT_GEN_KB PROPERTY STRINGS 2 22 86)
check_string_option_value(SECP256K1_ECMULT_GEN_KB)
if(SECP256K1_ECMULT_GEN_KB EQUAL 2)
add_compile_definitions(COMB_BLOCKS=2)
add_compile_definitions(COMB_TEETH=5)
elseif(SECP256K1_ECMULT_GEN_KB EQUAL 22)
add_compile_definitions(COMB_BLOCKS=11)
add_compile_definitions(COMB_TEETH=6)
elseif(SECP256K1_ECMULT_GEN_KB EQUAL 86)
add_compile_definitions(COMB_BLOCKS=43)
add_compile_definitions(COMB_TEETH=6)
endif()
add_compile_definitions(ECMULT_GEN_PREC_BITS=${SECP256K1_ECMULT_GEN_PREC_BITS})
set(SECP256K1_TEST_OVERRIDE_WIDE_MULTIPLY "OFF" CACHE STRING "Test-only override of the (autodetected by the C code) \"widemul\" setting. Legal values are: \"OFF\", \"int128_struct\", \"int128\" or \"int64\". [default=OFF]")
set_property(CACHE SECP256K1_TEST_OVERRIDE_WIDE_MULTIPLY PROPERTY STRINGS "OFF" "int128_struct" "int128" "int64")
@@ -102,7 +131,7 @@ if(SECP256K1_TEST_OVERRIDE_WIDE_MULTIPLY)
endif()
mark_as_advanced(FORCE SECP256K1_TEST_OVERRIDE_WIDE_MULTIPLY)
set(SECP256K1_ASM "AUTO" CACHE STRING "Assembly optimizations to use: \"AUTO\", \"OFF\", \"x86_64\" or \"arm32\" (experimental). [default=AUTO]")
set(SECP256K1_ASM "AUTO" CACHE STRING "Assembly to use: \"AUTO\", \"OFF\", \"x86_64\" or \"arm32\" (experimental). [default=AUTO]")
set_property(CACHE SECP256K1_ASM PROPERTY STRINGS "AUTO" "OFF" "x86_64" "arm32")
check_string_option_value(SECP256K1_ASM)
if(SECP256K1_ASM STREQUAL "arm32")
@@ -112,7 +141,7 @@ if(SECP256K1_ASM STREQUAL "arm32")
if(HAVE_ARM32_ASM)
add_compile_definitions(USE_EXTERNAL_ASM=1)
else()
message(FATAL_ERROR "ARM32 assembly optimization requested but not available.")
message(FATAL_ERROR "ARM32 assembly requested but not available.")
endif()
elseif(SECP256K1_ASM)
include(CheckX86_64Assembly)
@@ -123,14 +152,14 @@ elseif(SECP256K1_ASM)
elseif(SECP256K1_ASM STREQUAL "AUTO")
set(SECP256K1_ASM "OFF")
else()
message(FATAL_ERROR "x86_64 assembly optimization requested but not available.")
message(FATAL_ERROR "x86_64 assembly requested but not available.")
endif()
endif()
option(SECP256K1_EXPERIMENTAL "Allow experimental configuration options." OFF)
if(NOT SECP256K1_EXPERIMENTAL)
if(SECP256K1_ASM STREQUAL "arm32")
message(FATAL_ERROR "ARM32 assembly optimization is experimental. Use -DSECP256K1_EXPERIMENTAL=ON to allow.")
message(FATAL_ERROR "ARM32 assembly is experimental. Use -DSECP256K1_EXPERIMENTAL=ON to allow.")
endif()
endif()
@@ -167,7 +196,7 @@ else()
string(REGEX REPLACE "-DNDEBUG[ \t\r\n]*" "" CMAKE_C_FLAGS_RELEASE "${CMAKE_C_FLAGS_RELEASE}")
string(REGEX REPLACE "-DNDEBUG[ \t\r\n]*" "" CMAKE_C_FLAGS_MINSIZEREL "${CMAKE_C_FLAGS_MINSIZEREL}")
# Prefer -O2 optimization level. (-O3 is CMake's default for Release for many compilers.)
string(REGEX REPLACE "-O3[ \t\r\n]*" "-O2" CMAKE_C_FLAGS_RELEASE "${CMAKE_C_FLAGS_RELEASE}")
string(REGEX REPLACE "-O3( |$)" "-O2\\1" CMAKE_C_FLAGS_RELEASE "${CMAKE_C_FLAGS_RELEASE}")
endif()
# Define custom "Coverage" build type.
@@ -189,31 +218,37 @@ mark_as_advanced(
CMAKE_SHARED_LINKER_FLAGS_COVERAGE
)
get_property(is_multi_config GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
set(default_build_type "RelWithDebInfo")
if(is_multi_config)
set(CMAKE_CONFIGURATION_TYPES "${default_build_type}" "Release" "Debug" "MinSizeRel" "Coverage" CACHE STRING
"Supported configuration types."
FORCE
)
else()
set_property(CACHE CMAKE_BUILD_TYPE PROPERTY
STRINGS "${default_build_type}" "Release" "Debug" "MinSizeRel" "Coverage"
)
if(NOT CMAKE_BUILD_TYPE)
message(STATUS "Setting build type to \"${default_build_type}\" as none was specified")
set(CMAKE_BUILD_TYPE "${default_build_type}" CACHE STRING
"Choose the type of build."
if(PROJECT_IS_TOP_LEVEL)
get_property(is_multi_config GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
set(default_build_type "RelWithDebInfo")
if(is_multi_config)
set(CMAKE_CONFIGURATION_TYPES "${default_build_type}" "Release" "Debug" "MinSizeRel" "Coverage" CACHE STRING
"Supported configuration types."
FORCE
)
else()
set_property(CACHE CMAKE_BUILD_TYPE PROPERTY
STRINGS "${default_build_type}" "Release" "Debug" "MinSizeRel" "Coverage"
)
if(NOT CMAKE_BUILD_TYPE)
message(STATUS "Setting build type to \"${default_build_type}\" as none was specified")
set(CMAKE_BUILD_TYPE "${default_build_type}" CACHE STRING
"Choose the type of build."
FORCE
)
endif()
endif()
endif()
include(TryAppendCFlags)
if(MSVC)
# Keep the following commands ordered lexicographically.
try_append_c_flags(/W2) # Moderate warning level.
try_append_c_flags(/W3) # Production quality warning level.
try_append_c_flags(/wd4146) # Disable warning C4146 "unary minus operator applied to unsigned type, result still unsigned".
try_append_c_flags(/wd4244) # Disable warning C4244 "'conversion' conversion from 'type1' to 'type2', possible loss of data".
try_append_c_flags(/wd4267) # Disable warning C4267 "'var' : conversion from 'size_t' to 'type', possible loss of data".
# Eliminate deprecation warnings for the older, less secure functions.
add_compile_definitions(_CRT_SECURE_NO_WARNINGS)
else()
# Keep the following commands ordered lexicographically.
try_append_c_flags(-pedantic)
@@ -234,17 +269,41 @@ endif()
set(CMAKE_C_VISIBILITY_PRESET hidden)
# Ask CTest to create a "check" target (e.g., make check) as alias for the "test" target.
# CTEST_TEST_TARGET_ALIAS is not documented but supposed to be user-facing.
# See: https://gitlab.kitware.com/cmake/cmake/-/commit/816c9d1aa1f2b42d40c81a991b68c96eb12b6d2
set(CTEST_TEST_TARGET_ALIAS check)
include(CTest)
# We do not use CTest's BUILD_TESTING because a single toggle for all tests is too coarse for our needs.
mark_as_advanced(BUILD_TESTING)
if(SECP256K1_BUILD_BENCHMARK OR SECP256K1_BUILD_TESTS OR SECP256K1_BUILD_EXHAUSTIVE_TESTS OR SECP256K1_BUILD_CTIME_TESTS OR SECP256K1_BUILD_EXAMPLES)
enable_testing()
set(print_msan_notice)
if(SECP256K1_BUILD_CTIME_TESTS)
include(CheckMemorySanitizer)
check_memory_sanitizer(msan_enabled)
if(msan_enabled)
try_append_c_flags(-fno-sanitize-memory-param-retval)
set(print_msan_notice YES)
endif()
unset(msan_enabled)
endif()
set(SECP256K1_APPEND_CFLAGS "" CACHE STRING "Compiler flags that are appended to the command line after all other flags added by the build system. This variable is intended for debugging and special builds.")
if(SECP256K1_APPEND_CFLAGS)
# Appending to this low-level rule variable is the only way to
# guarantee that the flags appear at the end of the command line.
string(APPEND CMAKE_C_COMPILE_OBJECT " ${SECP256K1_APPEND_CFLAGS}")
endif()
set(SECP256K1_APPEND_LDFLAGS "" CACHE STRING "Linker flags that are appended to the command line after all other flags added by the build system. This variable is intended for debugging and special builds.")
if(SECP256K1_APPEND_LDFLAGS)
# Appending to this low-level rule variable is the only way to
# guarantee that the flags appear at the end of the command line.
string(APPEND CMAKE_C_CREATE_SHARED_LIBRARY " ${SECP256K1_APPEND_LDFLAGS}")
string(APPEND CMAKE_C_LINK_EXECUTABLE " ${SECP256K1_APPEND_LDFLAGS}")
endif()
if(NOT CMAKE_RUNTIME_OUTPUT_DIRECTORY)
set(CMAKE_RUNTIME_OUTPUT_DIRECTORY ${PROJECT_BINARY_DIR}/bin)
endif()
if(NOT CMAKE_LIBRARY_OUTPUT_DIRECTORY)
set(CMAKE_LIBRARY_OUTPUT_DIRECTORY ${PROJECT_BINARY_DIR}/lib)
endif()
if(NOT CMAKE_ARCHIVE_OUTPUT_DIRECTORY)
set(CMAKE_ARCHIVE_OUTPUT_DIRECTORY ${PROJECT_BINARY_DIR}/lib)
endif()
add_subdirectory(src)
if(SECP256K1_BUILD_EXAMPLES)
add_subdirectory(examples)
@@ -266,11 +325,13 @@ message(" ECDH ................................ ${SECP256K1_ENABLE_MODULE_ECDH}
message(" ECDSA pubkey recovery ............... ${SECP256K1_ENABLE_MODULE_RECOVERY}")
message(" extrakeys ........................... ${SECP256K1_ENABLE_MODULE_EXTRAKEYS}")
message(" schnorrsig .......................... ${SECP256K1_ENABLE_MODULE_SCHNORRSIG}")
message(" musig ............................... ${SECP256K1_ENABLE_MODULE_MUSIG}")
message(" ElligatorSwift ...................... ${SECP256K1_ENABLE_MODULE_ELLSWIFT}")
message("Parameters:")
message(" ecmult window size .................. ${SECP256K1_ECMULT_WINDOW_SIZE}")
message(" ecmult gen precision bits ........... ${SECP256K1_ECMULT_GEN_PREC_BITS}")
message(" ecmult gen table size ............... ${SECP256K1_ECMULT_GEN_KB} KiB")
message("Optional features:")
message(" assembly optimization ............... ${SECP256K1_ASM}")
message(" assembly ............................ ${SECP256K1_ASM}")
message(" external callbacks .................. ${SECP256K1_USE_EXTERNAL_DEFAULT_CALLBACKS}")
if(SECP256K1_TEST_OVERRIDE_WIDE_MULTIPLY)
message(" wide multiplication (test-only) ..... ${SECP256K1_TEST_OVERRIDE_WIDE_MULTIPLY}")
@@ -297,7 +358,7 @@ message("Valgrind .............................. ${SECP256K1_VALGRIND}")
get_directory_property(definitions COMPILE_DEFINITIONS)
string(REPLACE ";" " " definitions "${definitions}")
message("Preprocessor defined macros ........... ${definitions}")
message("C compiler ............................ ${CMAKE_C_COMPILER}")
message("C compiler ............................ ${CMAKE_C_COMPILER_ID} ${CMAKE_C_COMPILER_VERSION}, ${CMAKE_C_COMPILER}")
message("CFLAGS ................................ ${CMAKE_C_FLAGS}")
get_directory_property(compile_options COMPILE_OPTIONS)
string(REPLACE ";" " " compile_options "${compile_options}")
@@ -320,7 +381,20 @@ else()
message(" - LDFLAGS for executables ............ ${CMAKE_EXE_LINKER_FLAGS_DEBUG}")
message(" - LDFLAGS for shared libraries ....... ${CMAKE_SHARED_LINKER_FLAGS_DEBUG}")
endif()
message("\n")
if(SECP256K1_APPEND_CFLAGS)
message("SECP256K1_APPEND_CFLAGS ............... ${SECP256K1_APPEND_CFLAGS}")
endif()
if(SECP256K1_APPEND_LDFLAGS)
message("SECP256K1_APPEND_LDFLAGS .............. ${SECP256K1_APPEND_LDFLAGS}")
endif()
message("")
if(print_msan_notice)
message(
"Note:\n"
" MemorySanitizer detected, tried to add -fno-sanitize-memory-param-retval to compile options\n"
" to avoid false positives in ctime_tests. Pass -DSECP256K1_BUILD_CTIME_TESTS=OFF to avoid this.\n"
)
endif()
if(SECP256K1_EXPERIMENTAL)
message(
" ******\n"

108
external/secp256k1/CONTRIBUTING.md vendored Normal file

@@ -0,0 +1,108 @@
# Contributing to libsecp256k1
## Scope
libsecp256k1 is a library for elliptic curve cryptography on the curve secp256k1, not a general-purpose cryptography library.
The library primarily serves the needs of the Bitcoin Core project but provides additional functionality for the benefit of the wider Bitcoin ecosystem.
## Adding new functionality or modules
The libsecp256k1 project welcomes contributions in the form of new functionality or modules, provided they are within the project's scope.
It is the responsibility of the contributors to convince the maintainers that the proposed functionality is within the project's scope, high-quality and maintainable.
Contributors are recommended to provide the following in addition to the new code:
* **Specification:**
A specification can help significantly in reviewing the new code as it provides documentation and context.
It may justify various design decisions, give a motivation and outline security goals.
If the specification contains pseudocode, a reference implementation or test vectors, these can be used to compare with the proposed libsecp256k1 code.
* **Security Arguments:**
In addition to defining the security goals, it should be argued that the new functionality meets these goals.
Depending on the nature of the new functionality, a wide range of security arguments are acceptable, ranging from being "obviously secure" to rigorous proofs of security.
* **Relevance Arguments:**
The relevance of the new functionality for the Bitcoin ecosystem should be argued by outlining clear use cases.
These are not the only factors taken into account when considering whether to add new functionality.
The proposed new libsecp256k1 code must be of high quality, including API documentation and tests, as well as featuring a misuse-resistant API design.
We recommend reaching out to other contributors (see [Communication Channels](#communication-channels)) and getting feedback before implementing new functionality.
## Communication channels
Most communication about libsecp256k1 occurs on the GitHub repository: in issues, pull requests, or on the discussion board.
Additionally, there is an IRC channel dedicated to libsecp256k1, with biweekly meetings (see channel topic).
The channel is `#secp256k1` on Libera Chat.
The easiest way to participate on IRC is with the web client, [web.libera.chat](https://web.libera.chat/#secp256k1).
Chat history logs can be found at https://gnusha.org/secp256k1/.
## Contributor workflow & peer review
The Contributor Workflow & Peer Review in libsecp256k1 are similar to Bitcoin Core's workflow and review processes described in its [CONTRIBUTING.md](https://github.com/bitcoin/bitcoin/blob/master/CONTRIBUTING.md).
### Coding conventions
In addition, libsecp256k1 tries to maintain the following coding conventions:
* No runtime heap allocation (e.g., no `malloc`) unless explicitly requested by the caller (via `secp256k1_context_create` or `secp256k1_scratch_space_create`, for example). Moreover, it should be possible to use the library without any heap allocations.
* The tests should cover all lines and branches of the library (see [Test coverage](#coverage)).
* Operations involving secret data should be tested for being constant time with respect to the secrets (see [src/ctime_tests.c](src/ctime_tests.c)).
* Local variables containing secret data should be cleared explicitly to try to delete secrets from memory.
* Use `secp256k1_memcmp_var` instead of `memcmp` (see [#823](https://github.com/bitcoin-core/secp256k1/issues/823)).
* As a rule of thumb, the default values for configuration options should target standard desktop machines and align with Bitcoin Core's defaults, and the tests should mostly exercise the default configuration (see [#1549](https://github.com/bitcoin-core/secp256k1/issues/1549#issuecomment-2200559257)).
#### Style conventions
* Commits should be atomic and diffs should be easy to read. For this reason, do not mix any formatting fixes or code moves with actual code changes. Make sure each individual commit is hygienic: that it builds successfully on its own without warnings, errors, regressions, or test failures.
* New code should adhere to the style of existing, in particular surrounding, code. Other than that, we do not enforce strict rules for code formatting.
* The code conforms to C89. Most notably, that means that only `/* ... */` comments are allowed (no `//` line comments). Moreover, any declarations in a `{ ... }` block (e.g., a function) must appear at the beginning of the block before any statements. When you would like to declare a variable in the middle of a block, you can open a new block:
```C
void secp256k_foo(void) {
unsigned int x; /* declaration */
int y = 2*x; /* declaration */
x = 17; /* statement */
{
int a, b; /* declaration */
a = x + y; /* statement */
secp256k_bar(x, &b); /* statement */
}
}
```
* Use `unsigned int` instead of just `unsigned`.
* Use `void *ptr` instead of `void* ptr`.
* Arguments of the publicly-facing API must have a specific order defined in [include/secp256k1.h](include/secp256k1.h).
* User-facing comment lines in headers should be limited to 80 chars if possible.
* All identifiers in file scope should start with `secp256k1_`.
* Avoid trailing whitespace.
### Tests
#### Coverage
This library aims to have full coverage of reachable lines and branches.
To create a test coverage report, configure with `--enable-coverage` (use of GCC is necessary):
$ ./configure --enable-coverage
Run the tests:
$ make check
To create a report, `gcovr` is recommended, as it includes branch coverage reporting:
$ gcovr --exclude 'src/bench*' --print-summary
To create a HTML report with coloured and annotated source code:
$ mkdir -p coverage
$ gcovr --exclude 'src/bench*' --html --html-details -o coverage/coverage.html
#### Exhaustive tests
There are tests of several functions in which a small group replaces secp256k1.
These tests are *exhaustive* since they provide all elements and scalars of the small group as input arguments (see [src/tests_exhaustive.c](src/tests_exhaustive.c)).
### Benchmarks
See `src/bench*.c` for examples of benchmarks.


@@ -37,7 +37,6 @@ noinst_HEADERS += src/field_10x26_impl.h
noinst_HEADERS += src/field_5x52.h
noinst_HEADERS += src/field_5x52_impl.h
noinst_HEADERS += src/field_5x52_int128_impl.h
noinst_HEADERS += src/field_5x52_asm_impl.h
noinst_HEADERS += src/modinv32.h
noinst_HEADERS += src/modinv32_impl.h
noinst_HEADERS += src/modinv64.h
@@ -46,6 +45,7 @@ noinst_HEADERS += src/precomputed_ecmult.h
noinst_HEADERS += src/precomputed_ecmult_gen.h
noinst_HEADERS += src/assumptions.h
noinst_HEADERS += src/checkmem.h
noinst_HEADERS += src/testutil.h
noinst_HEADERS += src/util.h
noinst_HEADERS += src/int128.h
noinst_HEADERS += src/int128_impl.h
@@ -64,6 +64,8 @@ noinst_HEADERS += src/field.h
noinst_HEADERS += src/field_impl.h
noinst_HEADERS += src/bench.h
noinst_HEADERS += src/wycheproof/ecdsa_secp256k1_sha256_bitcoin_test.h
noinst_HEADERS += src/hsort.h
noinst_HEADERS += src/hsort_impl.h
noinst_HEADERS += contrib/lax_der_parsing.h
noinst_HEADERS += contrib/lax_der_parsing.c
noinst_HEADERS += contrib/lax_der_privatekey_parsing.h
@@ -153,7 +155,7 @@ endif
if USE_EXAMPLES
noinst_PROGRAMS += ecdsa_example
ecdsa_example_SOURCES = examples/ecdsa.c
ecdsa_example_CPPFLAGS = -I$(top_srcdir)/include
ecdsa_example_CPPFLAGS = -I$(top_srcdir)/include -DSECP256K1_STATIC
ecdsa_example_LDADD = libsecp256k1.la
ecdsa_example_LDFLAGS = -static
if BUILD_WINDOWS
@@ -163,7 +165,7 @@ TESTS += ecdsa_example
if ENABLE_MODULE_ECDH
noinst_PROGRAMS += ecdh_example
ecdh_example_SOURCES = examples/ecdh.c
ecdh_example_CPPFLAGS = -I$(top_srcdir)/include
ecdh_example_CPPFLAGS = -I$(top_srcdir)/include -DSECP256K1_STATIC
ecdh_example_LDADD = libsecp256k1.la
ecdh_example_LDFLAGS = -static
if BUILD_WINDOWS
@@ -174,7 +176,7 @@ endif
if ENABLE_MODULE_SCHNORRSIG
noinst_PROGRAMS += schnorr_example
schnorr_example_SOURCES = examples/schnorr.c
schnorr_example_CPPFLAGS = -I$(top_srcdir)/include
schnorr_example_CPPFLAGS = -I$(top_srcdir)/include -DSECP256K1_STATIC
schnorr_example_LDADD = libsecp256k1.la
schnorr_example_LDFLAGS = -static
if BUILD_WINDOWS
@@ -182,6 +184,28 @@ schnorr_example_LDFLAGS += -lbcrypt
endif
TESTS += schnorr_example
endif
if ENABLE_MODULE_ELLSWIFT
noinst_PROGRAMS += ellswift_example
ellswift_example_SOURCES = examples/ellswift.c
ellswift_example_CPPFLAGS = -I$(top_srcdir)/include -DSECP256K1_STATIC
ellswift_example_LDADD = libsecp256k1.la
ellswift_example_LDFLAGS = -static
if BUILD_WINDOWS
ellswift_example_LDFLAGS += -lbcrypt
endif
TESTS += ellswift_example
endif
if ENABLE_MODULE_MUSIG
noinst_PROGRAMS += musig_example
musig_example_SOURCES = examples/musig.c
musig_example_CPPFLAGS = -I$(top_srcdir)/include -DSECP256K1_STATIC
musig_example_LDADD = libsecp256k1.la
musig_example_LDFLAGS = -static
if BUILD_WINDOWS
musig_example_LDFLAGS += -lbcrypt
endif
TESTS += musig_example
endif
endif
### Precomputed tables
@@ -189,11 +213,11 @@ EXTRA_PROGRAMS = precompute_ecmult precompute_ecmult_gen
CLEANFILES = $(EXTRA_PROGRAMS)
precompute_ecmult_SOURCES = src/precompute_ecmult.c
precompute_ecmult_CPPFLAGS = $(SECP_CONFIG_DEFINES)
precompute_ecmult_CPPFLAGS = $(SECP_CONFIG_DEFINES) -DVERIFY
precompute_ecmult_LDADD = $(COMMON_LIB)
precompute_ecmult_gen_SOURCES = src/precompute_ecmult_gen.c
precompute_ecmult_gen_CPPFLAGS = $(SECP_CONFIG_DEFINES)
precompute_ecmult_gen_CPPFLAGS = $(SECP_CONFIG_DEFINES) -DVERIFY
precompute_ecmult_gen_LDADD = $(COMMON_LIB)
# See Automake manual, Section "Errors with distclean".
@@ -241,6 +265,7 @@ maintainer-clean-local: clean-testvectors
### Additional files to distribute
EXTRA_DIST = autogen.sh CHANGELOG.md SECURITY.md
EXTRA_DIST += doc/release-process.md doc/safegcd_implementation.md
EXTRA_DIST += doc/ellswift.md doc/musig.md
EXTRA_DIST += examples/EXAMPLES_COPYING
EXTRA_DIST += sage/gen_exhaustive_groups.sage
EXTRA_DIST += sage/gen_split_lambda_constants.sage
@@ -267,3 +292,11 @@ endif
if ENABLE_MODULE_SCHNORRSIG
include src/modules/schnorrsig/Makefile.am.include
endif
if ENABLE_MODULE_MUSIG
include src/modules/musig/Makefile.am.include
endif
if ENABLE_MODULE_ELLSWIFT
include src/modules/ellswift/Makefile.am.include
endif

View File

@@ -1,11 +1,10 @@
libsecp256k1
============
[![Build Status](https://api.cirrus-ci.com/github/bitcoin-core/secp256k1.svg?branch=master)](https://cirrus-ci.com/github/bitcoin-core/secp256k1)
![Dependencies: None](https://img.shields.io/badge/dependencies-none-success)
[![irc.libera.chat #secp256k1](https://img.shields.io/badge/irc.libera.chat-%23secp256k1-success)](https://web.libera.chat/#secp256k1)
Optimized C library for ECDSA signatures and secret/public key operations on curve secp256k1.
High-performance high-assurance C library for digital signatures and other cryptographic primitives on the secp256k1 elliptic curve.
This library is intended to be the highest quality publicly available library for cryptography on the secp256k1 curve. However, the primary focus of its development has been for usage in the Bitcoin system and usage unlike Bitcoin's may be less well tested, verified, or suffer from a less well thought out interface. Correct usage requires some care and consideration that the library is fit for your application's purpose.
@@ -21,6 +20,8 @@ Features:
* Optional module for public key recovery.
* Optional module for ECDH key exchange.
* Optional module for Schnorr signatures according to [BIP-340](https://github.com/bitcoin/bips/blob/master/bip-0340.mediawiki).
* Optional module for ElligatorSwift key exchange according to [BIP-324](https://github.com/bitcoin/bips/blob/master/bip-0324.mediawiki).
* Optional module for MuSig2 Schnorr multi-signatures according to [BIP-327](https://github.com/bitcoin/bips/blob/master/bip-0327.mediawiki).
Implementation details
----------------------
@@ -34,7 +35,7 @@ Implementation details
* Expose only higher level interfaces to minimize the API surface and improve application security. ("Be difficult to use insecurely.")
* Field operations
* Optimized implementation of arithmetic modulo the curve's field size (2^256 - 0x1000003D1).
* Using 5 52-bit limbs (including hand-optimized assembly for x86_64, by Diederik Huys).
* Using 5 52-bit limbs
* Using 10 26-bit limbs (including hand-optimized assembly for 32-bit ARM, by Wladimir J. van der Laan).
* This is an experimental feature that has not received enough scrutiny to satisfy the standard of quality of this library but is made available for testing and review by the community.
* Scalar operations
@@ -80,9 +81,9 @@ To maintain a pristine source tree, CMake encourages to perform an out-of-source
$ mkdir build && cd build
$ cmake ..
$ make
$ make check # run the test suite
$ sudo make install # optional
$ cmake --build .
$ ctest # run the test suite
$ sudo cmake --install . # optional
To compile optional modules (such as Schnorr signatures), you need to run `cmake` with additional flags (such as `-DSECP256K1_ENABLE_MODULE_SCHNORRSIG=ON`). Run `cmake .. -LH` to see the full list of available flags.
@@ -114,31 +115,10 @@ Usage examples can be found in the [examples](examples) directory. To compile th
* [ECDSA example](examples/ecdsa.c)
* [Schnorr signatures example](examples/schnorr.c)
* [Deriving a shared secret (ECDH) example](examples/ecdh.c)
* [ElligatorSwift key exchange example](examples/ellswift.c)
To compile the Schnorr signature and ECDH examples, you also need to configure with `--enable-module-schnorrsig` and `--enable-module-ecdh`.
Test coverage
-----------
This library aims to have full coverage of the reachable lines and branches.
To create a test coverage report, configure with `--enable-coverage` (use of GCC is necessary):
$ ./configure --enable-coverage
Run the tests:
$ make check
To create a report, `gcovr` is recommended, as it includes branch coverage reporting:
$ gcovr --exclude 'src/bench*' --print-summary
To create a HTML report with coloured and annotated source code:
$ mkdir -p coverage
$ gcovr --exclude 'src/bench*' --html --html-details -o coverage/coverage.html
Benchmark
------------
If configured with `--enable-benchmark` (which is the default), binaries for benchmarking the libsecp256k1 functions will be present in the root directory after the build.
@@ -155,3 +135,8 @@ Reporting a vulnerability
------------
See [SECURITY.md](SECURITY.md)
Contributing to libsecp256k1
------------
See [CONTRIBUTING.md](CONTRIBUTING.md)

View File

@@ -45,6 +45,22 @@ fi
AC_MSG_RESULT($has_valgrind)
])
AC_DEFUN([SECP_MSAN_CHECK], [
AC_MSG_CHECKING(whether MemorySanitizer is enabled)
AC_COMPILE_IFELSE([AC_LANG_SOURCE([[
#if defined(__has_feature)
# if __has_feature(memory_sanitizer)
/* MemorySanitizer is enabled. */
# elif
# error "MemorySanitizer is disabled."
# endif
#else
# error "__has_feature is not defined."
#endif
]])], [msan_enabled=yes], [msan_enabled=no])
AC_MSG_RESULT([$msan_enabled])
])
dnl SECP_TRY_APPEND_CFLAGS(flags, VAR)
dnl Append flags to VAR if CC accepts them.
AC_DEFUN([SECP_TRY_APPEND_CFLAGS], [

View File

@@ -4,19 +4,21 @@ set -eux
export LC_ALL=C
# Print relevant CI environment to allow reproducing the job outside of CI.
# Print commit and relevant CI environment to allow reproducing the job outside of CI.
git show --no-patch
print_environment() {
# Turn off -x because it messes up the output
set +x
# There are many ways to print variable names and their content. This one
# does not rely on bash.
for var in WERROR_CFLAGS MAKEFLAGS BUILD \
ECMULTWINDOW ECMULTGENPRECISION ASM WIDEMUL WITH_VALGRIND EXTRAFLAGS \
EXPERIMENTAL ECDH RECOVERY SCHNORRSIG \
ECMULTWINDOW ECMULTGENKB ASM WIDEMUL WITH_VALGRIND EXTRAFLAGS \
EXPERIMENTAL ECDH RECOVERY EXTRAKEYS MUSIG SCHNORRSIG ELLSWIFT \
SECP256K1_TEST_ITERS BENCH SECP256K1_BENCH_ITERS CTIMETESTS\
EXAMPLES \
HOST WRAPPER_CMD \
CC CFLAGS CPPFLAGS AR NM
CC CFLAGS CPPFLAGS AR NM \
UBSAN_OPTIONS ASAN_OPTIONS LSAN_OPTIONS
do
eval "isset=\${$var+x}"
if [ -n "$isset" ]; then
@@ -30,19 +32,15 @@ print_environment() {
}
print_environment
# Start persistent wineserver if necessary.
# This speeds up jobs with many invocations of wine (e.g., ./configure with MSVC) tremendously.
case "$WRAPPER_CMD" in
*wine*)
# Make sure to shutdown wineserver whenever we exit.
trap "wineserver -k || true" EXIT INT HUP
# This is apparently only reliable when we run a dummy command such as "hh.exe" afterwards.
wineserver -p && wine hh.exe
env >> test_env.log
# If gcc is requested, assert that it's in fact gcc (and not some symlinked Apple clang).
case "${CC:-undefined}" in
*gcc*)
$CC -v 2>&1 | grep -q "gcc version" || exit 1;
;;
esac
env >> test_env.log
if [ -n "${CC+x}" ]; then
# The MSVC compiler "cl" doesn't understand "-v"
$CC -v || true
@@ -54,22 +52,55 @@ if [ -n "$WRAPPER_CMD" ]; then
$WRAPPER_CMD --version
fi
# Workaround for https://bugs.kde.org/show_bug.cgi?id=452758 (fixed in valgrind 3.20.0).
case "${CC:-undefined}" in
clang*)
if [ "$CTIMETESTS" = "yes" ] && [ "$WITH_VALGRIND" = "yes" ]
then
export CFLAGS="${CFLAGS:+$CFLAGS }-gdwarf-4"
else
case "$WRAPPER_CMD" in
valgrind*)
export CFLAGS="${CFLAGS:+$CFLAGS }-gdwarf-4"
;;
esac
fi
;;
esac
./autogen.sh
./configure \
--enable-experimental="$EXPERIMENTAL" \
--with-test-override-wide-multiply="$WIDEMUL" --with-asm="$ASM" \
--with-ecmult-window="$ECMULTWINDOW" \
--with-ecmult-gen-precision="$ECMULTGENPRECISION" \
--with-ecmult-gen-kb="$ECMULTGENKB" \
--enable-module-ecdh="$ECDH" --enable-module-recovery="$RECOVERY" \
--enable-module-ellswift="$ELLSWIFT" \
--enable-module-extrakeys="$EXTRAKEYS" \
--enable-module-schnorrsig="$SCHNORRSIG" \
--enable-module-musig="$MUSIG" \
--enable-examples="$EXAMPLES" \
--enable-ctime-tests="$CTIMETESTS" \
--with-valgrind="$WITH_VALGRIND" \
--host="$HOST" $EXTRAFLAGS
# We have set "-j<n>" in MAKEFLAGS.
make
build_exit_code=0
make > make.log 2>&1 || build_exit_code=$?
cat make.log
if [ $build_exit_code -ne 0 ]; then
case "${CC:-undefined}" in
*snapshot*)
# Ignore internal compiler errors in gcc-snapshot and clang-snapshot
grep -e "internal compiler error:" -e "PLEASE submit a bug report" make.log
return $?;
;;
*)
return 1;
;;
esac
fi
# Print information about binaries so that we can see that the architecture is correct
file *tests* || true

View File

@@ -1,4 +1,17 @@
FROM debian:stable
FROM debian:stable-slim
SHELL ["/bin/bash", "-c"]
WORKDIR /root
# A too high maximum number of file descriptors (with the default value
# inherited from the docker host) can cause issues with some of our tools:
# - sanitizers hanging: https://github.com/google/sanitizers/issues/1662
# - valgrind crashing: https://stackoverflow.com/a/75293014
# This is not a problem on our CI hosts, but developers who run the image
# on their machines may run into this (e.g., on Arch Linux), so warn them.
# (Note that .bashrc is only executed in interactive bash shells.)
RUN echo 'if [[ $(ulimit -n) -gt 200000 ]]; then echo "WARNING: Very high value reported by \"ulimit -n\". Consider passing \"--ulimit nofile=32768\" to \"docker run\"."; fi' >> /root/.bashrc
RUN dpkg --add-architecture i386 && \
dpkg --add-architecture s390x && \
@@ -11,27 +24,56 @@ RUN dpkg --add-architecture i386 && \
RUN apt-get update && apt-get install --no-install-recommends -y \
git ca-certificates \
make automake libtool pkg-config dpkg-dev valgrind qemu-user \
gcc clang llvm libc6-dbg \
gcc clang llvm libclang-rt-dev libc6-dbg \
g++ \
gcc-i686-linux-gnu libc6-dev-i386-cross libc6-dbg:i386 libubsan1:i386 libasan6:i386 \
gcc-i686-linux-gnu libc6-dev-i386-cross libc6-dbg:i386 libubsan1:i386 libasan8:i386 \
gcc-s390x-linux-gnu libc6-dev-s390x-cross libc6-dbg:s390x \
gcc-arm-linux-gnueabihf libc6-dev-armhf-cross libc6-dbg:armhf \
gcc-aarch64-linux-gnu libc6-dev-arm64-cross libc6-dbg:arm64 \
gcc-powerpc64le-linux-gnu libc6-dev-ppc64el-cross libc6-dbg:ppc64el \
gcc-mingw-w64-x86-64-win32 wine64 wine \
gcc-mingw-w64-i686-win32 wine32 \
sagemath
python3 && \
if ! ( dpkg --print-architecture | grep --quiet "arm64" ) ; then \
apt-get install --no-install-recommends -y \
gcc-aarch64-linux-gnu libc6-dev-arm64-cross libc6-dbg:arm64 ;\
fi && \
apt-get clean && rm -rf /var/lib/apt/lists/*
WORKDIR /root
# The "wine" package provides a convience wrapper that we need
RUN apt-get update && apt-get install --no-install-recommends -y \
git ca-certificates wine64 wine python3-simplejson python3-six msitools winbind procps && \
git clone https://github.com/mstorsjo/msvc-wine && \
mkdir /opt/msvc && \
python3 msvc-wine/vsdownload.py --accept-license --dest /opt/msvc Microsoft.VisualStudio.Workload.VCTools && \
msvc-wine/install.sh /opt/msvc
# Build and install gcc snapshot
ARG GCC_SNAPSHOT_MAJOR=15
RUN apt-get update && apt-get install --no-install-recommends -y wget libgmp-dev libmpfr-dev libmpc-dev flex && \
mkdir gcc && cd gcc && \
wget --progress=dot:giga --https-only --recursive --accept '*.tar.xz' --level 1 --no-directories "https://gcc.gnu.org/pub/gcc/snapshots/LATEST-${GCC_SNAPSHOT_MAJOR}" && \
wget "https://gcc.gnu.org/pub/gcc/snapshots/LATEST-${GCC_SNAPSHOT_MAJOR}/sha512.sum" && \
sha512sum --check --ignore-missing sha512.sum && \
# We should have downloaded exactly one tar.xz file
ls && \
[ $(ls *.tar.xz | wc -l) -eq "1" ] && \
tar xf *.tar.xz && \
mkdir gcc-build && cd gcc-build && \
../*/configure --prefix=/opt/gcc-snapshot --enable-languages=c --disable-bootstrap --disable-multilib --without-isl && \
make -j $(nproc) && \
make install && \
cd ../.. && rm -rf gcc && \
ln -s /opt/gcc-snapshot/bin/gcc /usr/bin/gcc-snapshot && \
apt-get autoremove -y wget libgmp-dev libmpfr-dev libmpc-dev flex && \
apt-get clean && rm -rf /var/lib/apt/lists/*
# Install clang snapshot, see https://apt.llvm.org/
RUN \
# Setup GPG keys of LLVM repository
apt-get update && apt-get install --no-install-recommends -y wget && \
wget -qO- https://apt.llvm.org/llvm-snapshot.gpg.key | tee /etc/apt/trusted.gpg.d/apt.llvm.org.asc && \
# Add repository for this Debian release
. /etc/os-release && echo "deb http://apt.llvm.org/${VERSION_CODENAME} llvm-toolchain-${VERSION_CODENAME} main" >> /etc/apt/sources.list && \
apt-get update && \
# Determine the version number of the LLVM development branch
LLVM_VERSION=$(apt-cache search --names-only '^clang-[0-9]+$' | sort -V | tail -1 | cut -f1 -d" " | cut -f2 -d"-" ) && \
# Install
apt-get install --no-install-recommends -y "clang-${LLVM_VERSION}" && \
# Create symlink
ln -s "/usr/bin/clang-${LLVM_VERSION}" /usr/bin/clang-snapshot && \
# Clean up
apt-get autoremove -y wget && \
apt-get clean && rm -rf /var/lib/apt/lists/*
# Initialize the wine environment. Wait until the wineserver process has
# exited before closing the session, to avoid corrupting the wine prefix.
RUN wine64 wineboot --init && \
while (ps -A | grep wineserver) > /dev/null; do sleep 1; done

View File

@@ -1,6 +1,6 @@
function(check_arm32_assembly)
try_compile(HAVE_ARM32_ASM
${CMAKE_BINARY_DIR}/check_arm32_assembly
SOURCES ${CMAKE_SOURCE_DIR}/cmake/source_arm32.s
${PROJECT_BINARY_DIR}/check_arm32_assembly
SOURCES ${PROJECT_SOURCE_DIR}/cmake/source_arm32.s
)
endfunction()

View File

@@ -0,0 +1,18 @@
include_guard(GLOBAL)
include(CheckCSourceCompiles)
function(check_memory_sanitizer output)
set(CMAKE_TRY_COMPILE_TARGET_TYPE STATIC_LIBRARY)
check_c_source_compiles("
#if defined(__has_feature)
# if __has_feature(memory_sanitizer)
/* MemorySanitizer is enabled. */
# elif
# error \"MemorySanitizer is disabled.\"
# endif
#else
# error \"__has_feature is not defined.\"
#endif
" HAVE_MSAN)
set(${output} ${HAVE_MSAN} PARENT_SCOPE)
endfunction()

View File

@@ -0,0 +1,8 @@
function(generate_pkg_config_file in_file)
set(prefix ${CMAKE_INSTALL_PREFIX})
set(exec_prefix \${prefix})
set(libdir \${exec_prefix}/${CMAKE_INSTALL_LIBDIR})
set(includedir \${prefix}/${CMAKE_INSTALL_INCLUDEDIR})
set(PACKAGE_VERSION ${PROJECT_VERSION})
configure_file(${in_file} ${PROJECT_NAME}.pc @ONLY)
endfunction()

View File

@@ -4,8 +4,8 @@ AC_PREREQ([2.60])
# the API. All changes in experimental modules are treated as
# backwards-compatible and therefore at most increase the minor version.
define(_PKG_VERSION_MAJOR, 0)
define(_PKG_VERSION_MINOR, 3)
define(_PKG_VERSION_PATCH, 2)
define(_PKG_VERSION_MINOR, 6)
define(_PKG_VERSION_PATCH, 0)
define(_PKG_VERSION_IS_RELEASE, true)
# The library version is based on libtool versioning of the ABI. The set of
@@ -13,8 +13,8 @@ define(_PKG_VERSION_IS_RELEASE, true)
# https://www.gnu.org/software/libtool/manual/html_node/Updating-version-info.html
# All changes in experimental modules are treated as if they don't affect the
# interface and therefore only increase the revision.
define(_LIB_VERSION_CURRENT, 2)
define(_LIB_VERSION_REVISION, 2)
define(_LIB_VERSION_CURRENT, 5)
define(_LIB_VERSION_REVISION, 0)
define(_LIB_VERSION_AGE, 0)
AC_INIT([libsecp256k1],m4_join([.], _PKG_VERSION_MAJOR, _PKG_VERSION_MINOR, _PKG_VERSION_PATCH)m4_if(_PKG_VERSION_IS_RELEASE, [true], [], [-dev]),[https://github.com/bitcoin-core/secp256k1/issues],[libsecp256k1],[https://github.com/bitcoin-core/secp256k1])
@@ -121,13 +121,12 @@ AC_DEFUN([SECP_TRY_APPEND_DEFAULT_CFLAGS], [
# libtool makes the same assumption internally.
# Note that "/opt" and "-opt" are equivalent for MSVC; we use "-opt" because "/opt" looks like a path.
if test x"$GCC" != x"yes" && test x"$build_windows" = x"yes"; then
SECP_TRY_APPEND_CFLAGS([-W2 -wd4146], $1) # Moderate warning level, disable warning C4146 "unary minus operator applied to unsigned type, result still unsigned"
# We pass -ignore:4217 to the MSVC linker to suppress warning 4217 when
# importing variables from a statically linked secp256k1.
# (See the libtool manual, section "Windows DLLs" for background.)
# Unfortunately, libtool tries to be too clever and strips "-Xlinker arg"
# into "arg", so this will be " -Xlinker -ignore:4217" after stripping.
LDFLAGS="-Xlinker -Xlinker -Xlinker -ignore:4217 $LDFLAGS"
SECP_TRY_APPEND_CFLAGS([-W3], $1) # Production quality warning level.
SECP_TRY_APPEND_CFLAGS([-wd4146], $1) # Disable warning C4146 "unary minus operator applied to unsigned type, result still unsigned".
SECP_TRY_APPEND_CFLAGS([-wd4244], $1) # Disable warning C4244 "'conversion' conversion from 'type1' to 'type2', possible loss of data".
SECP_TRY_APPEND_CFLAGS([-wd4267], $1) # Disable warning C4267 "'var' : conversion from 'size_t' to 'type', possible loss of data".
# Eliminate deprecation warnings for the older, less secure functions.
CPPFLAGS="-D_CRT_SECURE_NO_WARNINGS $CPPFLAGS"
fi
])
SECP_TRY_APPEND_DEFAULT_CFLAGS(SECP_CFLAGS)
@@ -185,6 +184,14 @@ AC_ARG_ENABLE(module_schnorrsig,
AS_HELP_STRING([--enable-module-schnorrsig],[enable schnorrsig module [default=yes]]), [],
[SECP_SET_DEFAULT([enable_module_schnorrsig], [yes], [yes])])
AC_ARG_ENABLE(module_musig,
AS_HELP_STRING([--enable-module-musig],[enable MuSig2 module [default=yes]]), [],
[SECP_SET_DEFAULT([enable_module_musig], [yes], [yes])])
AC_ARG_ENABLE(module_ellswift,
AS_HELP_STRING([--enable-module-ellswift],[enable ElligatorSwift module [default=yes]]), [],
[SECP_SET_DEFAULT([enable_module_ellswift], [yes], [yes])])
AC_ARG_ENABLE(external_default_callbacks,
AS_HELP_STRING([--enable-external-default-callbacks],[enable external default callback functions [default=no]]), [],
[SECP_SET_DEFAULT([enable_external_default_callbacks], [no], [no])])
@@ -198,25 +205,24 @@ AC_ARG_ENABLE(external_default_callbacks,
AC_ARG_WITH([test-override-wide-multiply], [] ,[set_widemul=$withval], [set_widemul=auto])
AC_ARG_WITH([asm], [AS_HELP_STRING([--with-asm=x86_64|arm32|no|auto],
[assembly optimizations to use (experimental: arm32) [default=auto]])],[req_asm=$withval], [req_asm=auto])
[assembly to use (experimental: arm32) [default=auto]])],[req_asm=$withval], [req_asm=auto])
AC_ARG_WITH([ecmult-window], [AS_HELP_STRING([--with-ecmult-window=SIZE|auto],
AC_ARG_WITH([ecmult-window], [AS_HELP_STRING([--with-ecmult-window=SIZE],
[window size for ecmult precomputation for verification, specified as integer in range [2..24].]
[Larger values result in possibly better performance at the cost of an exponentially larger precomputed table.]
[The table will store 2^(SIZE-1) * 64 bytes of data but can be larger in memory due to platform-specific padding and alignment.]
[A window size larger than 15 will require you to delete the prebuilt precomputed_ecmult.c file so that it can be rebuilt.]
[For very large window sizes, use "make -j 1" to reduce memory use during compilation.]
["auto" is a reasonable setting for desktop machines (currently 15). [default=auto]]
[The default value is a reasonable setting for desktop machines (currently 15). [default=15]]
)],
[req_ecmult_window=$withval], [req_ecmult_window=auto])
[set_ecmult_window=$withval], [set_ecmult_window=15])
AC_ARG_WITH([ecmult-gen-precision], [AS_HELP_STRING([--with-ecmult-gen-precision=2|4|8|auto],
[Precision bits to tune the precomputed table size for signing.]
[The size of the table is 32kB for 2 bits, 64kB for 4 bits, 512kB for 8 bits of precision.]
[A larger table size usually results in possibly faster signing.]
["auto" is a reasonable setting for desktop machines (currently 4). [default=auto]]
AC_ARG_WITH([ecmult-gen-kb], [AS_HELP_STRING([--with-ecmult-gen-kb=2|22|86],
[The size of the precomputed table for signing in multiples of 1024 bytes (on typical platforms).]
[Larger values result in possibly better signing/keygeneration performance at the cost of a larger table.]
[The default value is a reasonable setting for desktop machines (currently 86). [default=86]]
)],
[req_ecmult_gen_precision=$withval], [req_ecmult_gen_precision=auto])
[set_ecmult_gen_kb=$withval], [set_ecmult_gen_kb=86])
AC_ARG_WITH([valgrind], [AS_HELP_STRING([--with-valgrind=yes|no|auto],
[Build with extra checks for running inside Valgrind [default=auto]]
@@ -245,6 +251,20 @@ if test x"$enable_ctime_tests" = x"auto"; then
enable_ctime_tests=$enable_valgrind
fi
print_msan_notice=no
if test x"$enable_ctime_tests" = x"yes"; then
SECP_MSAN_CHECK
# MSan on Clang >=16 reports uninitialized memory in function parameters and return values, even if
# the uninitialized variable is never actually "used". This is called "eager" checking, and it
# sounds like a good idea for normal use of MSan. However, it yields many false positives in the
# ctime_tests because many return values depend on secret (i.e., "uninitialized") values, and
# we're only interested in detecting branches (which count as "uses") on secret data.
if test x"$msan_enabled" = x"yes"; then
SECP_TRY_APPEND_CFLAGS([-fno-sanitize-memory-param-retval], SECP_CFLAGS)
print_msan_notice=yes
fi
fi
if test x"$enable_coverage" = x"yes"; then
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DCOVERAGE=1"
SECP_CFLAGS="-O0 --coverage $SECP_CFLAGS"
@@ -276,24 +296,24 @@ else
x86_64)
SECP_X86_64_ASM_CHECK
if test x"$has_x86_64_asm" != x"yes"; then
AC_MSG_ERROR([x86_64 assembly optimization requested but not available])
AC_MSG_ERROR([x86_64 assembly requested but not available])
fi
;;
arm32)
SECP_ARM32_ASM_CHECK
if test x"$has_arm32_asm" != x"yes"; then
AC_MSG_ERROR([ARM32 assembly optimization requested but not available])
AC_MSG_ERROR([ARM32 assembly requested but not available])
fi
;;
no)
;;
*)
AC_MSG_ERROR([invalid assembly optimization selection])
AC_MSG_ERROR([invalid assembly selection])
;;
esac
fi
# Select assembly optimization
# Select assembly
enable_external_asm=no
case $set_asm in
@@ -306,7 +326,7 @@ arm32)
no)
;;
*)
AC_MSG_ERROR([invalid assembly optimizations])
AC_MSG_ERROR([invalid assembly selection])
;;
esac
@@ -333,14 +353,7 @@ auto)
;;
esac
# Set ecmult window size
if test x"$req_ecmult_window" = x"auto"; then
set_ecmult_window=15
else
set_ecmult_window=$req_ecmult_window
fi
error_window_size=['window size for ecmult precomputation not an integer in range [2..24] or "auto"']
error_window_size=['window size for ecmult precomputation not an integer in range [2..24]']
case $set_ecmult_window in
''|*[[!0-9]]*)
# no valid integer
@@ -355,19 +368,18 @@ case $set_ecmult_window in
;;
esac
# Set ecmult gen precision
if test x"$req_ecmult_gen_precision" = x"auto"; then
set_ecmult_gen_precision=4
else
set_ecmult_gen_precision=$req_ecmult_gen_precision
fi
case $set_ecmult_gen_precision in
2|4|8)
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DECMULT_GEN_PREC_BITS=$set_ecmult_gen_precision"
case $set_ecmult_gen_kb in
2)
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DCOMB_BLOCKS=2 -DCOMB_TEETH=5"
;;
22)
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DCOMB_BLOCKS=11 -DCOMB_TEETH=6"
;;
86)
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DCOMB_BLOCKS=43 -DCOMB_TEETH=6"
;;
*)
AC_MSG_ERROR(['ecmult gen precision not 2, 4, 8 or "auto"'])
AC_MSG_ERROR(['ecmult gen table size not 2, 22 or 86'])
;;
esac
@@ -384,23 +396,38 @@ SECP_CFLAGS="$SECP_CFLAGS $WERROR_CFLAGS"
### Handle module options
###
if test x"$enable_module_ecdh" = x"yes"; then
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DENABLE_MODULE_ECDH=1"
# Processing must be done in a reverse topological sorting of the dependency graph
# (dependent module first).
if test x"$enable_module_ellswift" = x"yes"; then
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DENABLE_MODULE_ELLSWIFT=1"
fi
if test x"$enable_module_musig" = x"yes"; then
if test x"$enable_module_schnorrsig" = x"no"; then
AC_MSG_ERROR([Module dependency error: You have disabled the schnorrsig module explicitly, but it is required by the musig module.])
fi
enable_module_schnorrsig=yes
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DENABLE_MODULE_MUSIG=1"
fi
if test x"$enable_module_schnorrsig" = x"yes"; then
if test x"$enable_module_extrakeys" = x"no"; then
AC_MSG_ERROR([Module dependency error: You have disabled the extrakeys module explicitly, but it is required by the schnorrsig module.])
fi
enable_module_extrakeys=yes
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DENABLE_MODULE_SCHNORRSIG=1"
fi
if test x"$enable_module_extrakeys" = x"yes"; then
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DENABLE_MODULE_EXTRAKEYS=1"
fi
if test x"$enable_module_recovery" = x"yes"; then
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DENABLE_MODULE_RECOVERY=1"
fi
if test x"$enable_module_schnorrsig" = x"yes"; then
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DENABLE_MODULE_SCHNORRSIG=1"
enable_module_extrakeys=yes
fi
# Test if extrakeys is set after the schnorrsig module to allow the schnorrsig
# module to set enable_module_extrakeys=yes
if test x"$enable_module_extrakeys" = x"yes"; then
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DENABLE_MODULE_EXTRAKEYS=1"
if test x"$enable_module_ecdh" = x"yes"; then
SECP_CONFIG_DEFINES="$SECP_CONFIG_DEFINES -DENABLE_MODULE_ECDH=1"
fi
if test x"$enable_external_default_callbacks" = x"yes"; then
@@ -411,14 +438,9 @@ fi
### Check for --enable-experimental if necessary
###
if test x"$enable_experimental" = x"yes"; then
AC_MSG_NOTICE([******])
AC_MSG_NOTICE([WARNING: experimental build])
AC_MSG_NOTICE([Experimental features do not have stable APIs or properties, and may not be safe for production use.])
AC_MSG_NOTICE([******])
else
if test x"$enable_experimental" = x"no"; then
if test x"$set_asm" = x"arm32"; then
AC_MSG_ERROR([ARM32 assembly optimization is experimental. Use --enable-experimental to allow.])
AC_MSG_ERROR([ARM32 assembly is experimental. Use --enable-experimental to allow.])
fi
fi
@@ -439,6 +461,8 @@ AM_CONDITIONAL([ENABLE_MODULE_ECDH], [test x"$enable_module_ecdh" = x"yes"])
AM_CONDITIONAL([ENABLE_MODULE_RECOVERY], [test x"$enable_module_recovery" = x"yes"])
AM_CONDITIONAL([ENABLE_MODULE_EXTRAKEYS], [test x"$enable_module_extrakeys" = x"yes"])
AM_CONDITIONAL([ENABLE_MODULE_SCHNORRSIG], [test x"$enable_module_schnorrsig" = x"yes"])
AM_CONDITIONAL([ENABLE_MODULE_MUSIG], [test x"$enable_module_musig" = x"yes"])
AM_CONDITIONAL([ENABLE_MODULE_ELLSWIFT], [test x"$enable_module_ellswift" = x"yes"])
AM_CONDITIONAL([USE_EXTERNAL_ASM], [test x"$enable_external_asm" = x"yes"])
AM_CONDITIONAL([USE_ASM_ARM], [test x"$set_asm" = x"arm32"])
AM_CONDITIONAL([BUILD_WINDOWS], [test "$build_windows" = "yes"])
@@ -460,10 +484,12 @@ echo " module ecdh = $enable_module_ecdh"
echo " module recovery = $enable_module_recovery"
echo " module extrakeys = $enable_module_extrakeys"
echo " module schnorrsig = $enable_module_schnorrsig"
echo " module musig = $enable_module_musig"
echo " module ellswift = $enable_module_ellswift"
echo
echo " asm = $set_asm"
echo " ecmult window size = $set_ecmult_window"
echo " ecmult gen prec. bits = $set_ecmult_gen_precision"
echo " ecmult gen table size = $set_ecmult_gen_kb KiB"
# Hide test-only options unless they're used.
if test x"$set_widemul" != xauto; then
echo " wide multiplication = $set_widemul"
@@ -475,3 +501,17 @@ echo " CPPFLAGS = $CPPFLAGS"
echo " SECP_CFLAGS = $SECP_CFLAGS"
echo " CFLAGS = $CFLAGS"
echo " LDFLAGS = $LDFLAGS"
if test x"$print_msan_notice" = x"yes"; then
echo
echo "Note:"
echo " MemorySanitizer detected, tried to add -fno-sanitize-memory-param-retval to SECP_CFLAGS"
echo " to avoid false positives in ctime_tests. Pass --disable-ctime-tests to avoid this."
fi
if test x"$enable_experimental" = x"yes"; then
echo
echo "WARNING: Experimental build"
echo " Experimental features do not have stable APIs or properties, and may not be safe for"
echo " production use."
fi

View File

@@ -67,8 +67,8 @@ extern "C" {
*
* Returns: 1 when the signature could be parsed, 0 otherwise.
* Args: ctx: a secp256k1 context object
* Out: sig: a pointer to a signature object
* In: input: a pointer to the signature to be parsed
* Out: sig: pointer to a signature object
* In: input: pointer to the signature to be parsed
* inputlen: the length of the array pointed to by input
*
* This function will accept any valid DER encoded signature, even if the

external/secp256k1/doc/ellswift.md (vendored, new file, 483 lines)
View File

@@ -0,0 +1,483 @@
# ElligatorSwift for secp256k1 explained
In this document we explain how the `ellswift` module implementation is related to the
construction in the
["SwiftEC: Shalluevan de Woestijne Indifferentiable Function To Elliptic Curves"](https://eprint.iacr.org/2022/759)
paper by Jorge Chávez-Saab, Francisco Rodríguez-Henríquez, and Mehdi Tibouchi.
* [1. Introduction](#1-introduction)
* [2. The decoding function](#2-the-decoding-function)
+ [2.1 Decoding for `secp256k1`](#21-decoding-for-secp256k1)
* [3. The encoding function](#3-the-encoding-function)
+ [3.1 Switching to *v, w* coordinates](#31-switching-to-v-w-coordinates)
+ [3.2 Avoiding computing all inverses](#32-avoiding-computing-all-inverses)
+ [3.3 Finding the inverse](#33-finding-the-inverse)
+ [3.4 Dealing with special cases](#34-dealing-with-special-cases)
+ [3.5 Encoding for `secp256k1`](#35-encoding-for-secp256k1)
* [4. Encoding and decoding full *(x, y)* coordinates](#4-encoding-and-decoding-full-x-y-coordinates)
+ [4.1 Full *(x, y)* coordinates for `secp256k1`](#41-full-x-y-coordinates-for-secp256k1)
## 1. Introduction
The `ellswift` module effectively introduces a new 64-byte public key format, with the property
that (uniformly random) public keys can be encoded as 64-byte arrays which are computationally
indistinguishable from uniform byte arrays. The module provides functions to convert public keys
from and to this format, as well as convenience functions for key generation and ECDH that operate
directly on ellswift-encoded keys.
The encoding consists of the concatenation of two (32-byte big endian) encoded field elements $u$
and $t.$ Together they encode an x-coordinate on the curve $x$, or (see further) a full point $(x, y)$ on
the curve.
**Decoding** consists of decoding the field elements $u$ and $t$ (values above the field size $p$
are taken modulo $p$), and then evaluating $F_u(t)$, which for every $u$ and $t$ results in a valid
x-coordinate on the curve. The functions $F_u$ will be defined in [Section 2](#2-the-decoding-function).
**Encoding** a given $x$ coordinate is conceptually done as follows:
* Loop:
* Pick a uniformly random field element $u.$
* Compute the set $L = F_u^{-1}(x)$ of $t$ values for which $F_u(t) = x$, which may have up to *8* elements.
* With probability $1 - \dfrac{\\#L}{8}$, restart the loop.
* Select a uniformly random $t \in L$ and return $(u, t).$
This is the *ElligatorSwift* algorithm, here given for just x-coordinates. An extension to full
$(x, y)$ points will be given in [Section 4](#4-encoding-and-decoding-full-x-y-coordinates).
The algorithm finds a uniformly random $(u, t)$ among (almost all) those
for which $F_u(t) = x.$ Section 3.2 in the paper proves that the number of such encodings for
almost all x-coordinates on the curve (all but at most 39) is close to two times the field size
(specifically, it lies in the range $2q \pm (22\sqrt{q} + O(1))$, where $q$ is the size of the field).
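A rough consequence of this count, useful for intuition about the cost of the loop (an estimate, not a statement from the paper): averaged over the $q$ choices of $u$, the expected size of $L$ is about $2$, so each iteration succeeds with probability roughly

$$
\frac{\mathbb{E}[\\#L]}{8} \approx \frac{2}{8} = \frac{1}{4},
$$

and the loop is expected to terminate after about $4$ iterations.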
## 2. The decoding function
First some definitions:
* $\mathbb{F}$ is the finite field of size $q$, of characteristic 5 or more, and $q \equiv 1 \mod 3.$
* For `secp256k1`, $q = 2^{256} - 2^{32} - 977$, which satisfies that requirement.
* Let $E$ be the elliptic curve of points $(x, y) \in \mathbb{F}^2$ for which $y^2 = x^3 + ax + b$, with $a$ and $b$
public constants, for which $\Delta_E = -16(4a^3 + 27b^2)$ is a square, and at least one of $(-b \pm \sqrt{-3 \Delta_E} / 36)/2$ is a square.
This implies that the order of $E$ is either odd, or a multiple of *4*.
If $a=0$, this condition is always fulfilled.
* For `secp256k1`, $a=0$ and $b=7.$
* Let the function $g(x) = x^3 + ax + b$, so the $E$ curve equation is also $y^2 = g(x).$
* Let the function $h(x) = 3x^2 + 4a.$
* Define $V$ as the set of solutions $(x_1, x_2, x_3, z)$ to $z^2 = g(x_1)g(x_2)g(x_3).$
* Define $S_u$ as the set of solutions $(X, Y)$ to $X^2 + h(u)Y^2 = -g(u)$ and $Y \neq 0.$
* $P_u$ is a function from $\mathbb{F}$ to $S_u$ that will be defined below.
* $\psi_u$ is a function from $S_u$ to $V$ that will be defined below.
**Note**: In the paper:
* $F_u$ corresponds to $F_{0,u}$ there.
* $P_u(t)$ is called $P$ there.
* All $S_u$ sets together correspond to $S$ there.
* All $\psi_u$ functions together (operating on elements of $S$) correspond to $\psi$ there.
Note that for $V$, the left hand side of the equation $z^2$ is square, and thus the right
hand must also be square. As multiplying non-squares results in a square in $\mathbb{F}$,
out of the three right-hand side factors an even number must be non-squares.
This implies that exactly *1* or exactly *3* out of
$\\{g(x_1), g(x_2), g(x_3)\\}$ must be square, and thus that for any $(x_1,x_2,x_3,z) \in V$,
at least one of $\\{x_1, x_2, x_3\\}$ must be a valid x-coordinate on $E.$ There is one exception
to this, namely when $z=0$, but even then one of the three values is a valid x-coordinate.
**Define** the decoding function $F_u(t)$ as:
* Let $(x_1, x_2, x_3, z) = \psi_u(P_u(t)).$
* Return the first element $x$ of $(x_3, x_2, x_1)$ which is a valid x-coordinate on $E$ (i.e., $g(x)$ is square).
$P_u(t) = (X(u, t), Y(u, t))$, where:
$$
\begin{array}{lcl}
X(u, t) & = & \left\\{\begin{array}{ll}
\dfrac{g(u) - t^2}{2t} & a = 0 \\
\dfrac{g(u) + h(u)(Y_0(u) - X_0(u)t)^2}{X_0(u)(1 + h(u)t^2)} & a \neq 0
\end{array}\right. \\
Y(u, t) & = & \left\\{\begin{array}{ll}
\dfrac{X(u, t) + t}{u \sqrt{-3}} = \dfrac{g(u) + t^2}{2tu\sqrt{-3}} & a = 0 \\
Y_0(u) + t(X(u, t) - X_0(u)) & a \neq 0
\end{array}\right.
\end{array}
$$
$P_u(t)$ is defined:
* For $a=0$, unless:
* $u = 0$ or $t = 0$ (division by zero)
* $g(u) = -t^2$ (would give $Y=0$).
* For $a \neq 0$, unless:
* $X_0(u) = 0$ or $h(u)t^2 = -1$ (division by zero)
* $Y_0(u) (1 - h(u)t^2) = 2X_0(u)t$ (would give $Y=0$).
The functions $X_0(u)$ and $Y_0(u)$ are defined in Appendix A of the paper, and depend on various properties of $E.$
The function $\psi_u$ is the same for all curves: $\psi_u(X, Y) = (x_1, x_2, x_3, z)$, where:
$$
\begin{array}{lcl}
x_1 & = & \dfrac{X}{2Y} - \dfrac{u}{2} && \\
x_2 & = & -\dfrac{X}{2Y} - \dfrac{u}{2} && \\
x_3 & = & u + 4Y^2 && \\
z & = & \dfrac{g(x_3)}{2Y}(u^2 + ux_1 + x_1^2 + a) = \dfrac{-g(u)g(x_3)}{8Y^3}
\end{array}
$$
### 2.1 Decoding for `secp256k1`
Put together and specialized for $a=0$ curves, decoding $(u, t)$ to an x-coordinate is:
**Define** $F_u(t)$ as:
* Let $X = \dfrac{u^3 + b - t^2}{2t}.$
* Let $Y = \dfrac{X + t}{u\sqrt{-3}}.$
* Return the first $x$ in $(u + 4Y^2, \dfrac{-X}{2Y} - \dfrac{u}{2}, \dfrac{X}{2Y} - \dfrac{u}{2})$ for which $g(x)$ is square.
To make sure that every input decodes to a valid x-coordinate, we remap the inputs in case
$P_u$ is not defined (when $u=0$, $t=0$, or $g(u) = -t^2$):
**Define** $F_u(t)$ as:
* Let $u'=u$ if $u \neq 0$; $1$ otherwise (guaranteeing $u' \neq 0$).
* Let $t'=t$ if $t \neq 0$; $1$ otherwise (guaranteeing $t' \neq 0$).
* Let $t''=t'$ if $g(u') \neq -t'^2$; $2t'$ otherwise (guaranteeing $t'' \neq 0$ and $g(u') \neq -t''^2$).
* Let $X = \dfrac{u'^3 + b - t''^2}{2t''}.$
* Let $Y = \dfrac{X + t''}{u'\sqrt{-3}}.$
* Return the first $x$ in $(u' + 4Y^2, \dfrac{-X}{2Y} - \dfrac{u'}{2}, \dfrac{X}{2Y} - \dfrac{u'}{2})$ for which $x^3 + b$ is square.
The choices here are not strictly necessary. Just returning a fixed constant in any of the undefined cases would suffice,
but the approach here is simple enough and gives fairly uniform output even in these cases.
**Note**: in the paper these conditions result in $\infty$ as output, due to the use of projective coordinates there.
We wish to avoid the need for callers to deal with this special case.
This is implemented in `secp256k1_ellswift_xswiftec_frac_var` (which decodes to an x-coordinate represented as a fraction), and
in `secp256k1_ellswift_xswiftec_var` (which outputs the actual x-coordinate).
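To make the algorithm concrete, here is a minimal Python sketch of this remapped x-only decoder. It is an illustration only, not the library code: the function and variable names are the sketch's own, it assumes Python 3.8+ for `pow(x, -1, p)` modular inverses, and it uses straightforward, variable-time field arithmetic.

```python
P = 2**256 - 2**32 - 977   # secp256k1 field size
B = 7                      # curve equation: y^2 = x^3 + 7

def g(x):
    return (pow(x, 3, P) + B) % P

def is_square(a):
    return pow(a % P, (P - 1) // 2, P) in (0, 1)

def sqrt_mod(a):
    """A square root of a mod P, or None if none exists (works because P % 4 == 3)."""
    a %= P
    r = pow(a, (P + 1) // 4, P)
    return r if r * r % P == a else None

SQRT_M3 = sqrt_mod(P - 3)  # sqrt(-3); it exists because P % 3 == 1

def ellswift_decode_x(u, t):
    """x-only ElligatorSwift decoding F_u(t), including the remapping of undefined inputs."""
    u, t = u % P, t % P
    if u == 0:
        u = 1
    if t == 0:
        t = 1
    if (g(u) + t * t) % P == 0:   # g(u) == -t^2: remap t to 2t
        t = 2 * t % P
    X = (g(u) - t * t) * pow(2 * t, -1, P) % P
    Y = (X + t) * pow(u * SQRT_M3, -1, P) % P
    half = pow(2, -1, P)
    for x in ((u + 4 * Y * Y) % P,                      # x3
              (-X * pow(2 * Y, -1, P) - u * half) % P,  # x2
              (X * pow(2 * Y, -1, P) - u * half) % P):  # x1
        if is_square(g(x)):
            return x
```

The `_frac_var` variant returns the x-coordinate as a fraction, which lets callers defer the field inversions that this sketch performs directly.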
## 3. The encoding function
To implement $F_u^{-1}(x)$, the function to find the set of inverses $t$ for which $F_u(t) = x$, we have to reverse the process:
* Find all the $(X, Y) \in S_u$ that could have given rise to $x$, through the $x_1$, $x_2$, or $x_3$ formulas in $\psi_u.$
* Map those $(X, Y)$ solutions to $t$ values using $P_u^{-1}(X, Y).$
* For each of the found $t$ values, verify that $F_u(t) = x.$
* Return the remaining $t$ values.
The function $P_u^{-1}$, which finds $t$ given $(X, Y) \in S_u$, is significantly simpler than $P_u:$
$$
P_u^{-1}(X, Y) = \left\\{\begin{array}{ll}
Yu\sqrt{-3} - X & a = 0 \\
\dfrac{Y-Y_0(u)}{X-X_0(u)} & a \neq 0 \land X \neq X_0(u) \\
\dfrac{-X_0(u)}{h(u)Y_0(u)} & a \neq 0 \land X = X_0(u) \land Y = Y_0(u)
\end{array}\right.
$$
The third step above, verifying that $F_u(t) = x$, is necessary because for the $(X, Y)$ values found through the $x_1$ and $x_2$ expressions,
it is possible that decoding through $\psi_u(X, Y)$ yields a valid $x_3$ on the curve, which would take precedence over the
$x_1$ or $x_2$ decoding. These $(X, Y)$ solutions must be rejected.
Since we know that exactly one or exactly three out of $\\{x_1, x_2, x_3\\}$ are valid x-coordinates for any $t$,
the case where either $x_1$ or $x_2$ is valid and in addition also $x_3$ is valid must mean that all three are valid.
This means that instead of checking whether $x_3$ is on the curve, it is also possible to check whether the other one out of
$x_1$ and $x_2$ is on the curve. This is significantly simpler, as it turns out.
Observe that $\psi_u$ guarantees that $x_1 + x_2 = -u.$ So given either $x = x_1$ or $x = x_2$, the other one of the two can be computed as
$-u - x.$ Thus, when encoding $x$ through the $x_1$ or $x_2$ expressions, one can simply check whether $g(-u-x)$ is a square,
and if so, not include the corresponding $t$ values in the returned set. As this does not need $X$, $Y$, or $t$, this condition can be determined
before those values are computed.
It is not possible that an encoding found through the $x_1$ expression decodes to a different valid x-coordinate using $x_2$ (which would
take precedence), for the same reason: if both $x_1$ and $x_2$ decodings were valid, $x_3$ would be valid as well, and thus take
precedence over both. Because of this, the $g(-u-x)$ being square test for $x_1$ and $x_2$ is the only test necessary to guarantee the found $t$
values round-trip back to the input $x$ correctly. This is the reason for choosing the $(x_3, x_2, x_1)$ precedence order in the decoder;
any order which does not place $x_3$ first requires more complicated round-trip checks in the encoder.
### 3.1 Switching to *v, w* coordinates
Before working out the formulas for all this, we switch to different variables for $S_u.$ Let $v = (X/Y - u)/2$, and
$w = 2Y.$ Or in the other direction, $X = w(u/2 + v)$ and $Y = w/2:$
* $S_u'$ becomes the set of $(v, w)$ for which $w^2 (u^2 + uv + v^2 + a) = -g(u)$ and $w \neq 0.$
* For $a=0$ curves, $P_u^{-1}$ can be stated for $(v,w)$ as $P_u^{'-1}(v, w) = w\left(\frac{\sqrt{-3}-1}{2}u - v\right).$
* $\psi_u$ can be stated for $(v, w)$ as $\psi_u'(v, w) = (x_1, x_2, x_3, z)$, where
$$
\begin{array}{lcl}
x_1 & = & v \\
x_2 & = & -u - v \\
x_3 & = & u + w^2 \\
z & = & \dfrac{g(x_3)}{w}(u^2 + uv + v^2 + a) = \dfrac{-g(u)g(x_3)}{w^3}
\end{array}
$$
We can now write the expressions for finding $(v, w)$ given $x$ explicitly, by solving each of the $\\{x_1, x_2, x_3\\}$
expressions for $v$ or $w$, and using the $S_u'$ equation to find the other variable:
* Assuming $x = x_1$, we find $v = x$ and $w = \pm\sqrt{-g(u)/(u^2 + uv + v^2 + a)}$ (two solutions).
* Assuming $x = x_2$, we find $v = -u-x$ and $w = \pm\sqrt{-g(u)/(u^2 + uv + v^2 + a)}$ (two solutions).
* Assuming $x = x_3$, we find $w = \pm\sqrt{x-u}$ and $v = -u/2 \pm \sqrt{-w^2(4g(u) + w^2h(u))}/(2w^2)$ (four solutions).
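As a quick check of the $S_u'$ equation above: substituting $X = w(u/2 + v)$ and $Y = w/2$ into the $S_u$ equation $X^2 + h(u)Y^2 = -g(u)$, with $h(u) = 3u^2 + 4a$, gives

$$
w^2\left(\tfrac{u}{2}+v\right)^2 + (3u^2+4a)\,\tfrac{w^2}{4} = w^2\left(\tfrac{u^2}{4} + uv + v^2 + \tfrac{3u^2}{4} + a\right) = w^2\,(u^2 + uv + v^2 + a),
$$

so the constraint on $(v, w)$ is exactly $w^2(u^2 + uv + v^2 + a) = -g(u)$, as stated.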
### 3.2 Avoiding computing all inverses
The *ElligatorSwift* algorithm as stated in Section 1 requires the computation of $L = F_u^{-1}(x)$ (the
set of all $t$ such that $(u, t)$ decode to $x$) in full. This is unnecessary.
Observe that the procedure of restarting with probability $(1 - \frac{\\#L}{8})$ and otherwise returning a
uniformly random element from $L$ is actually equivalent to always padding $L$ with $\bot$ values up to length 8,
picking a uniformly random element from that, restarting whenever $\bot$ is picked:
**Define** *ElligatorSwift(x)* as:
* Loop:
* Pick a uniformly random field element $u.$
* Compute the set $L = F_u^{-1}(x).$
* Let $T$ be the 8-element vector consisting of the elements of $L$, plus $8 - \\#L$ times $\\{\bot\\}.$
* Select a uniformly random $t \in T.$
* If $t \neq \bot$, return $(u, t)$; restart loop otherwise.
Now notice that the order of elements in $T$ does not matter, as all we do is pick a uniformly
random element in it, so we do not need to have all $\bot$ values at the end.
As we have 8 distinct formulas for finding $(v, w)$ (taking the variants due to $\pm$ into account),
we can associate every index in $T$ with exactly one of those formulas, making sure that:
* Formulas that yield no solutions (due to division by zero or non-existing square roots) or invalid solutions are made to return $\bot.$
* For the $x_1$ and $x_2$ cases, if $g(-u-x)$ is a square, $\bot$ is returned instead (the round-trip check).
* In case multiple formulas would return the same non- $\bot$ result, all but one of those must be turned into $\bot$ to avoid biasing those.
The last condition above only occurs with negligible probability for cryptographically-sized curves, but is interesting
to take into account as it allows exhaustive testing in small groups. See [Section 3.4](#34-dealing-with-special-cases)
for an analysis of all the negligible cases.
If we define $T = (G_{0,u}(x), G_{1,u}(x), \ldots, G_{7,u}(x))$, with each $G_{i,u}$ matching one of the formulas,
the loop can be simplified to only compute one of the inverses instead of all of them:
**Define** *ElligatorSwift(x)* as:
* Loop:
* Pick a uniformly random field element $u.$
* Pick a uniformly random integer $c$ in $[0,8).$
* Let $t = G_{c,u}(x).$
* If $t \neq \bot$, return $(u, t)$; restart loop otherwise.
This is implemented in `secp256k1_ellswift_xelligatorswift_var`.
### 3.3 Finding the inverse
To implement $G_{c,u}$, we map $c=0$ to the $x_1$ formula, $c=1$ to the $x_2$ formula, and $c=2$ and $c=3$ to the $x_3$ formula.
Those are then repeated as $c=4$ through $c=7$ for the other sign of $w$ (noting that in each formula, $w$ is a square root of some expression).
Ignoring the negligible cases, we get:
**Define** $G_{c,u}(x)$ as:
* If $c \in \\{0, 1, 4, 5\\}$ (for $x_1$ and $x_2$ formulas):
* If $g(-u-x)$ is square, return $\bot$ (as $x_3$ would be valid and take precedence).
* If $c \in \\{0, 4\\}$ (the $x_1$ formula) let $v = x$, otherwise let $v = -u-x$ (the $x_2$ formula)
* Let $s = -g(u)/(u^2 + uv + v^2 + a)$ (using $s = w^2$ in what follows).
* Otherwise, when $c \in \\{2, 3, 6, 7\\}$ (for $x_3$ formulas):
* Let $s = x-u.$
* Let $r = \sqrt{-s(4g(u) + sh(u))}.$
* Let $v = (r/s - u)/2$ if $c \in \\{3, 7\\}$; $(-r/s - u)/2$ otherwise.
* Let $w = \sqrt{s}.$
* Depending on $c:$
* If $c \in \\{0, 1, 2, 3\\}:$ return $P_u^{'-1}(v, w).$
* If $c \in \\{4, 5, 6, 7\\}:$ return $P_u^{'-1}(v, -w).$
Whenever a square root of a non-square is taken, $\bot$ is returned; for both square roots this happens with roughly
50% on random inputs. Similarly, when a division by 0 would occur, $\bot$ is returned as well; this will only happen
with negligible probability. A division by 0 in the first branch in fact cannot occur at all, because $u^2 + uv + v^2 + a = 0$
implies $g(-u-x) = g(x)$ which would mean the $g(-u-x)$ is square condition has triggered
and $\bot$ would have been returned already.
**Note**: In the paper, the $case$ variable corresponds roughly to the $c$ above, but only takes on 4 possible values (1 to 4).
The conditional negation of $w$ at the end is done randomly, which is equivalent, but makes testing harder. We choose to
have the $G_{c,u}$ be deterministic, and capture all choices in $c.$
Now observe that the $c \in \\{1, 5\\}$ and $c \in \\{3, 7\\}$ conditions effectively perform the same $v \rightarrow -u-v$
transformation. Furthermore, that transformation has no effect on $s$ in the first branch
as $u^2 + ux + x^2 + a = u^2 + u(-u-x) + (-u-x)^2 + a.$ Thus we can extract it out and move it down:
**Define** $G_{c,u}(x)$ as:
* If $c \in \\{0, 1, 4, 5\\}:$
* If $g(-u-x)$ is square, return $\bot.$
* Let $s = -g(u)/(u^2 + ux + x^2 + a).$
* Let $v = x.$
* Otherwise, when $c \in \\{2, 3, 6, 7\\}:$
* Let $s = x-u.$
* Let $r = \sqrt{-s(4g(u) + sh(u))}.$
* Let $v = (r/s - u)/2.$
* Let $w = \sqrt{s}.$
* Depending on $c:$
* If $c \in \\{0, 2\\}:$ return $P_u^{'-1}(v, w).$
* If $c \in \\{1, 3\\}:$ return $P_u^{'-1}(-u-v, w).$
* If $c \in \\{4, 6\\}:$ return $P_u^{'-1}(v, -w).$
* If $c \in \\{5, 7\\}:$ return $P_u^{'-1}(-u-v, -w).$
This shows there will always be exactly 0, 4, or 8 $t$ values for a given $(u, x)$ input.
There can be 0, 1, or 2 $(v, w)$ pairs before invoking $P_u^{'-1}$, and each results in 4 distinct $t$ values.
### 3.4 Dealing with special cases
As mentioned before there are a few cases to deal with which only happen in a negligibly small subset of inputs.
For cryptographically sized fields, if only random inputs are going to be considered, it is unnecessary to deal with these. Still, for completeness
we analyse them here. They generally fall into two categories: cases in which the encoder would produce $t$ values that
do not decode back to $x$ (or at least cannot guarantee that they do), and cases in which the encoder might produce the same
$t$ value for multiple $c$ inputs (thereby biasing that encoding):
* In the branch for $x_1$ and $x_2$ (where $c \in \\{0, 1, 4, 5\\}$):
* When $g(u) = 0$, we would have $s=w=Y=0$, which is not on $S_u.$ This is only possible on even-ordered curves.
Excluding this also removes the one condition under which the simplified check for $x_3$ on the curve
fails (namely when $g(x_1)=g(x_2)=0$ but $g(x_3)$ is not square).
This does exclude some valid encodings: when both $g(u)=0$ and $u^2+ux+x^2+a=0$ (also implying $g(x)=0$),
the $S_u'$ equation degenerates to $0 = 0$, and many valid $t$ values may exist. Yet, these cannot be targeted uniformly by the
encoder anyway as there will generally be more than 8.
* When $g(x) = 0$, the same $t$ would be produced as in the $x_3$ branch (where $c \in \\{2, 3, 6, 7\\}$) which we give precedence
as it can deal with $g(u)=0$.
This is again only possible on even-ordered curves.
* In the branch for $x_3$ (where $c \in \\{2, 3, 6, 7\\}$):
* When $s=0$, a division by zero would occur.
* When $v = -u-v$ and $c \in \\{3, 7\\}$, the same $t$ would be returned as in the $c \in \\{2, 6\\}$ cases.
It is equivalent to checking whether $r=0$.
This cannot occur in the $x_1$ or $x_2$ branches, as it would trigger the $g(-u-x)$ is square condition.
A similar concern for $w = -w$ does not exist, as $w=0$ is already impossible in both branches: in the first
it requires $g(u)=0$ which is already outlawed on even-ordered curves and impossible on others; in the second it would trigger division by zero.
* Curve-specific special cases also exist that need to be rejected, because they result in $(u,t)$ which is invalid to the decoder, or because of division by zero in the encoder:
* For $a=0$ curves, when $u=0$ or when $t=0$. The latter can only be reached by the encoder when $g(u)=0$, which requires an even-ordered curve.
* For $a \neq 0$ curves, when $X_0(u)=0$, when $h(u)t^2 = -1$, or when $w(u + 2v) = 2X_0(u)$ while also either $w \neq 2Y_0(u)$ or $h(u)=0$.
**Define** a version of $G_{c,u}(x)$ which deals with all these cases:
* If $a=0$ and $u=0$, return $\bot.$
* If $a \neq 0$ and $X_0(u)=0$, return $\bot.$
* If $c \in \\{0, 1, 4, 5\\}:$
* If $g(u) = 0$ or $g(x) = 0$, return $\bot$ (even curves only).
* If $g(-u-x)$ is square, return $\bot.$
* Let $s = -g(u)/(u^2 + ux + x^2 + a)$ (cannot cause division by zero).
* Let $v = x.$
* Otherwise, when $c \in \\{2, 3, 6, 7\\}:$
* Let $s = x-u.$
* Let $r = \sqrt{-s(4g(u) + sh(u))}$; return $\bot$ if not square.
* If $c \in \\{3, 7\\}$ and $r=0$, return $\bot.$
* If $s = 0$, return $\bot.$
* Let $v = (r/s - u)/2.$
* Let $w = \sqrt{s}$; return $\bot$ if not square.
* If $a \neq 0$ and $w(u+2v) = 2X_0(u)$ and either $w \neq 2Y_0(u)$ or $h(u) = 0$, return $\bot.$
* Depending on $c:$
* If $c \in \\{0, 2\\}$, let $t = P_u^{'-1}(v, w).$
* If $c \in \\{1, 3\\}$, let $t = P_u^{'-1}(-u-v, w).$
* If $c \in \\{4, 6\\}$, let $t = P_u^{'-1}(v, -w).$
* If $c \in \\{5, 7\\}$, let $t = P_u^{'-1}(-u-v, -w).$
* If $a=0$ and $t=0$, return $\bot$ (even curves only).
* If $a \neq 0$ and $h(u)t^2 = -1$, return $\bot.$
* Return $t.$
Given any $u$, using this algorithm over all $x$ and $c$ values, every $t$ value will be reached exactly once,
for an $x$ for which $F_u(t) = x$ holds, except for these cases that will not be reached:
* All cases where $P_u(t)$ is not defined:
* For $a=0$ curves, when $u=0$, $t=0$, or $g(u) = -t^2.$
* For $a \neq 0$ curves, when $h(u)t^2 = -1$, $X_0(u) = 0$, or $Y_0(u) (1 - h(u) t^2) = 2X_0(u)t.$
* When $g(u)=0$, the potentially many $t$ values that decode to an $x$ satisfying $g(x)=0$ using the $x_2$ formula. These were excluded by the $g(u)=0$ condition in the $c \in \\{0, 1, 4, 5\\}$ branch.
These cases form a negligible subset of all $(u, t)$ for cryptographically sized curves.
### 3.5 Encoding for `secp256k1`
Specialized for odd-ordered $a=0$ curves:
**Define** $G_{c,u}(x)$ as:
* If $u=0$, return $\bot.$
* If $c \in \\{0, 1, 4, 5\\}:$
* If $(-u-x)^3 + b$ is square, return $\bot$
* Let $s = -(u^3 + b)/(u^2 + ux + x^2)$ (cannot cause division by 0).
* Let $v = x.$
* Otherwise, when $c \in \\{2, 3, 6, 7\\}:$
* Let $s = x-u.$
* Let $r = \sqrt{-s(4(u^3 + b) + 3su^2)}$; return $\bot$ if not square.
* If $c \in \\{3, 7\\}$ and $r=0$, return $\bot.$
* If $s = 0$, return $\bot.$
* Let $v = (r/s - u)/2.$
* Let $w = \sqrt{s}$; return $\bot$ if not square.
* Depending on $c:$
* If $c \in \\{0, 2\\}:$ return $w(\frac{\sqrt{-3}-1}{2}u - v).$
* If $c \in \\{1, 3\\}:$ return $w(\frac{\sqrt{-3}+1}{2}u + v).$
* If $c \in \\{4, 6\\}:$ return $w(\frac{-\sqrt{-3}+1}{2}u + v).$
* If $c \in \\{5, 7\\}:$ return $w(\frac{-\sqrt{-3}-1}{2}u - v).$
This is implemented in `secp256k1_ellswift_xswiftec_inv_var`.
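A matching Python sketch of this specialized $G_{c,u}(x)$, reusing the helpers (`P`, `g`, `is_square`, `sqrt_mod`, `SQRT_M3`) from the decoding sketch in Section 2.1. Again, this is an illustration with its own names, not the library implementation, and it assumes `x` is a valid x-coordinate on the curve.

```python
def ellswift_xswiftec_inv(x, u, c):
    """Candidate inverse G_{c,u}(x) for the odd-ordered, a=0 case; returns t or None."""
    x, u = x % P, u % P
    if u == 0:
        return None
    if c & 2 == 0:                           # c in {0, 1, 4, 5}: the x1/x2 branch
        if is_square(g((-u - x) % P)):       # x3 would be valid and take precedence
            return None
        s = -g(u) * pow((u * u + u * x + x * x) % P, -1, P) % P
        v = x
    else:                                    # c in {2, 3, 6, 7}: the x3 branch
        s = (x - u) % P
        if s == 0:
            return None
        r = sqrt_mod(-s * (4 * g(u) + 3 * s * u * u) % P)
        if r is None:
            return None
        if c & 1 and r == 0:                 # c in {3, 7} with r = 0 duplicates c in {2, 6}
            return None
        v = (r * pow(s, -1, P) - u) * pow(2, -1, P) % P
    w = sqrt_mod(s)
    if w is None:
        return None
    if c & 1:                                # c in {1, 3, 5, 7}: use -u-v instead of v
        v = (-u - v) % P
    if c & 4:                                # c in {4, 5, 6, 7}: use -w instead of w
        w = -w % P
    half_sqrt_m3_minus_1 = (SQRT_M3 - 1) * pow(2, -1, P) % P
    return w * (half_sqrt_m3_minus_1 * u - v) % P   # P_u'^{-1}(v, w)
```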
And the x-only ElligatorSwift encoding algorithm is still:
**Define** *ElligatorSwift(x)* as:
* Loop:
* Pick a uniformly random field element $u.$
* Pick a uniformly random integer $c$ in $[0,8).$
* Let $t = G_{c,u}(x).$
* If $t \neq \bot$, return $(u, t)$; restart loop otherwise.
Note that this logic does not take the remapped $u=0$, $t=0$, and $g(u) = -t^2$ cases into account; it just avoids them.
While it is not impossible to make the encoder target them, this would increase the maximum number of $t$ values for a given $(u, x)$
combination beyond 8, and thereby slow down the ElligatorSwift loop proportionally, for a negligible gain in uniformity.
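Combining the two sketches gives the x-only encoder loop. This is again illustrative only: `secrets` is used just to draw uniform candidates, and the generator's x-coordinate serves as a convenient test input.

```python
import secrets

def ellswift_encode_x(x):
    """x-only ElligatorSwift encoding: sample (u, c) until G_{c,u}(x) is defined."""
    while True:
        u = secrets.randbelow(P)
        c = secrets.randbelow(8)
        t = ellswift_xswiftec_inv(x, u, c)
        if t is not None:
            return u, t

# Round trip on the x-coordinate of the secp256k1 generator point:
GX = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
u, t = ellswift_encode_x(GX)
assert ellswift_decode_x(u, t) == GX
```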
## 4. Encoding and decoding full *(x, y)* coordinates
So far we have only addressed encoding and decoding x-coordinates, but in some cases an encoding
for full points with $(x, y)$ coordinates is desirable. It is possible to encode this information
in $t$ as well.
Note that for any $(X, Y) \in S_u$, $(\pm X, \pm Y)$ are all on $S_u.$ Moreover, all of these are
mapped to the same x-coordinate. Negating $X$ or negating $Y$ just results in $x_1$ and $x_2$
being swapped, and does not affect $x_3.$ This will not change the outcome x-coordinate as the order
of $x_1$ and $x_2$ only matters if both were to be valid, and in that case $x_3$ would be used instead.
Still, these four $(X, Y)$ combinations all correspond to distinct $t$ values, so we can encode
the sign of the y-coordinate in the sign of $X$ or the sign of $Y.$ They correspond to the
four distinct $P_u^{'-1}$ calls in the definition of $G_{c,u}.$
**Note**: In the paper, the sign of the y coordinate is encoded in a separately-coded bit.
To encode the sign of $y$ in the sign of $Y:$
**Define** *Decode(u, t)* for full $(x, y)$ as:
* Let $(X, Y) = P_u(t).$
* Let $x$ be the first value in $(u + 4Y^2, \frac{-X}{2Y} - \frac{u}{2}, \frac{X}{2Y} - \frac{u}{2})$ for which $g(x)$ is square.
* Let $y = \sqrt{g(x)}.$
* If $sign(y) = sign(Y)$, return $(x, y)$; otherwise return $(x, -y).$
And encoding would be done using a $G_{c,u}(x, y)$ function defined as:
**Define** $G_{c,u}(x, y)$ as:
* If $c \in \\{0, 1\\}:$
* If $g(u) = 0$ or $g(x) = 0$, return $\bot$ (even curves only).
* If $g(-u-x)$ is square, return $\bot.$
* Let $s = -g(u)/(u^2 + ux + x^2 + a)$ (cannot cause division by zero).
* Let $v = x.$
* Otherwise, when $c \in \\{2, 3\\}:$
* Let $s = x-u.$
* Let $r = \sqrt{-s(4g(u) + sh(u))}$; return $\bot$ if not square.
* If $c = 3$ and $r = 0$, return $\bot.$
* Let $v = (r/s - u)/2.$
* Let $w = \sqrt{s}$; return $\bot$ if not square.
* Let $w' = w$ if $sign(w/2) = sign(y)$; $-w$ otherwise.
* Depending on $c:$
* If $c \in \\{0, 2\\}:$ return $P_u^{'-1}(v, w').$
* If $c \in \\{1, 3\\}:$ return $P_u^{'-1}(-u-v, w').$
Note that $c$ now only ranges $[0,4)$, as the sign of $w'$ is decided based on that of $y$, rather than on $c.$
This change makes some valid encodings unreachable: when $y = 0$ and $sign(Y) \neq sign(0)$.
In the above logic, $sign$ can be implemented in several ways, such as parity of the integer representation
of the input field element (for prime-sized fields) or the quadratic residuosity (for fields where
$-1$ is not square). The choice does not matter, as long as it only takes on two possible values, and for $x \neq 0$ it holds that $sign(x) \neq sign(-x)$.
### 4.1 Full *(x, y)* coordinates for `secp256k1`
For $a=0$ curves, there is another option. Note that for those,
the $P_u(t)$ function translates negations of $t$ to negations of (both) $X$ and $Y.$ Thus, we can use $sign(t)$ to
encode the y-coordinate directly. Combined with the earlier remapping to guarantee all inputs land on the curve, we get
as decoder:
**Define** *Decode(u, t)* as:
* Let $u'=u$ if $u \neq 0$; $1$ otherwise.
* Let $t'=t$ if $t \neq 0$; $1$ otherwise.
* Let $t''=t'$ if $u'^3 + b + t'^2 \neq 0$; $2t'$ otherwise.
* Let $X = \dfrac{u'^3 + b - t''^2}{2t''}.$
* Let $Y = \dfrac{X + t''}{u'\sqrt{-3}}.$
* Let $x$ be the first element of $(u' + 4Y^2, \frac{-X}{2Y} - \frac{u'}{2}, \frac{X}{2Y} - \frac{u'}{2})$ for which $g(x)$ is square.
* Let $y = \sqrt{g(x)}.$
* Return $(x, y)$ if $sign(y) = sign(t)$; $(x, -y)$ otherwise.
This is implemented in `secp256k1_ellswift_swiftec_var`. The used $sign(x)$ function is the parity of $x$ when represented as an integer in $[0,q).$
The corresponding encoder would invoke the x-only one, but negating the output $t$ if $sign(t) \neq sign(y).$
This is implemented in `secp256k1_ellswift_elligatorswift_var`.
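Extending the earlier sketch accordingly, the full decoder only adds a square root and a parity fix-up. As before, this is a sketch with its own names, reusing the helpers defined above, and it takes $sign$ to be the parity of the canonical representative in $[0, q)$ as stated.

```python
def ellswift_decode_xy(u, t):
    """Full (x, y) decoding for a = 0: x-only decode, then pick y's sign from t's parity."""
    x = ellswift_decode_x(u, t)
    y = sqrt_mod(g(x))            # exists by construction of the decoder
    if y % 2 != t % P % 2:        # sign(y) != sign(t): negate y
        y = -y % P
    return x, y
```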
Note that this is only intended for encoding points where both the x-coordinate and y-coordinate are unpredictable. When encoding x-only points
where the y-coordinate is implicitly even (or implicitly square, or implicitly in $[0,q/2]$), the encoder in
[Section 3.5](#35-encoding-for-secp256k1) must be used, or a bias is reintroduced that undoes all the benefit of using ElligatorSwift
in the first place.

external/secp256k1/doc/musig.md (vendored, new file, 54 lines)
View File

@@ -0,0 +1,54 @@
Notes on the musig module API
===========================
The following sections contain additional notes on the API of the musig module (`include/secp256k1_musig.h`).
A usage example can be found in `examples/musig.c`.
## API misuse
The musig API is designed with a focus on misuse resistance.
However, due to the interactive nature of the MuSig protocol, there are additional failure modes that are not present in regular (single-party) Schnorr signature creation.
While the results can be catastrophic (e.g. leaking of the secret key), it is unfortunately not possible for the musig implementation to prevent all such failure modes.
Therefore, users of the musig module must take great care to make sure of the following:
1. A unique nonce per signing session is generated in `secp256k1_musig_nonce_gen`.
See the corresponding comment in `include/secp256k1_musig.h` for how to ensure that.
2. The `secp256k1_musig_secnonce` structure is never copied or serialized.
See also the comment on `secp256k1_musig_secnonce` in `include/secp256k1_musig.h`.
3. Opaque data structures are never written to or read from directly.
Instead, only the provided accessor functions are used.
## Key Aggregation and (Taproot) Tweaking
Given a set of public keys, the aggregate public key is computed with `secp256k1_musig_pubkey_agg`.
A plain tweak can be added to the resulting public key with `secp256k1_musig_pubkey_ec_tweak_add` by setting the `tweak32` argument to the hash defined in BIP 32. Similarly, a Taproot tweak can be added with `secp256k1_musig_pubkey_xonly_tweak_add` by setting the `tweak32` argument to the TapTweak hash defined in BIP 341.
Both types of tweaking can be combined and invoked multiple times if the specific application requires it.
## Signing
This is covered by `examples/musig.c`.
Essentially, the protocol proceeds in the following steps:
1. Generate a keypair with `secp256k1_keypair_create` and obtain the public key with `secp256k1_keypair_pub`.
2. Call `secp256k1_musig_pubkey_agg` with the pubkeys of all participants.
3. Optionally add a (Taproot) tweak with `secp256k1_musig_pubkey_xonly_tweak_add` and a plain tweak with `secp256k1_musig_pubkey_ec_tweak_add`.
4. Generate a pair of secret and public nonce with `secp256k1_musig_nonce_gen` and send the public nonce to the other signers.
5. Someone (not necessarily the signer) aggregates the public nonces with `secp256k1_musig_nonce_agg` and sends it to the signers.
6. Process the aggregate nonce with `secp256k1_musig_nonce_process`.
7. Create a partial signature with `secp256k1_musig_partial_sign`.
8. Verify the partial signatures (optional in some scenarios) with `secp256k1_musig_partial_sig_verify`.
9. Someone (not necessarily the signer) obtains all partial signatures and aggregates them into the final Schnorr signature using `secp256k1_musig_partial_sig_agg`.
The aggregate signature can be verified with `secp256k1_schnorrsig_verify`.
Steps 1 through 5 above can occur before or after the signers are aware of the message to be signed.
Whenever possible, it is recommended to generate the nonces only after the message is known.
This provides enhanced defense-in-depth measures, protecting against potential API misuse in certain scenarios.
However, it does require two rounds of communication during the signing process.
The alternative, generating the nonces in a pre-processing step before the message is known, eliminates these additional protective measures but allows for non-interactive signing.
Similarly, the API supports an alternative protocol flow where generating the aggregate key (steps 1 to 3) is allowed to happen after exchanging nonces (steps 4 to 5).
## Verification
A participant who wants to verify the partial signatures, but does not sign itself, may do so using the above instructions, except that steps 1, 4, and 7 are skipped.


@@ -1,4 +1,4 @@
# Release Process
# Release process
This document outlines the process for releasing versions of the form `$MAJOR.$MINOR.$PATCH`.
@@ -12,50 +12,83 @@ It is best if the maintainers are present during the release, so they can help e
This process also assumes that there will be no minor releases for old major releases.
We aim to cut a regular release every 3-4 months, approximately twice as frequent as major Bitcoin Core releases. Every second release should be published one month before the feature freeze of the next major Bitcoin Core release, allowing sufficient time to update the library in Core.
## Sanity checks
Perform these checks when reviewing the release PR (see below):
1. Ensure `make distcheck` doesn't fail.
```shell
./autogen.sh && ./configure --enable-dev-mode && make distcheck
```
2. Check installation with autotools:
```shell
dir=$(mktemp -d)
./autogen.sh && ./configure --prefix=$dir && make clean && make install && ls -RlAh $dir
gcc -o ecdsa examples/ecdsa.c $(PKG_CONFIG_PATH=$dir/lib/pkgconfig pkg-config --cflags --libs libsecp256k1) -Wl,-rpath,"$dir/lib" && ./ecdsa
```
3. Check installation with CMake:
```shell
dir=$(mktemp -d)
build=$(mktemp -d)
cmake -B $build -DCMAKE_INSTALL_PREFIX=$dir && cmake --build $build && cmake --install $build && ls -RlAh $dir
gcc -o ecdsa examples/ecdsa.c -I $dir/include -L $dir/lib*/ -l secp256k1 -Wl,-rpath,"$dir/lib",-rpath,"$dir/lib64" && ./ecdsa
```
4. Use the [`check-abi.sh`](/tools/check-abi.sh) tool to verify that there are no unexpected ABI incompatibilities and that the version number and the release notes accurately reflect all potential ABI changes. To run this tool, the `abi-dumper` and `abi-compliance-checker` packages are required.
```shell
tools/check-abi.sh
```
## Regular release
1. Open a PR to the master branch with a commit (using message `"release: prepare for $MAJOR.$MINOR.$PATCH"`, for example) that
* finalizes the release notes in [CHANGELOG.md](../CHANGELOG.md) (make sure to include an entry for `### ABI Compatibility`),
* sets `_PKG_VERSION_IS_RELEASE` to `true` in `configure.ac`, and
* if this is not a patch release
* updates `_PKG_VERSION_*` and `_LIB_VERSION_*` in `configure.ac` and
* finalizes the release notes in [CHANGELOG.md](../CHANGELOG.md) by
* adding a section for the release (make sure that the version number is a link to a diff between the previous and new version),
* removing the `[Unreleased]` section header,
* ensuring that the release notes are not missing entries (check the `needs-changelog` label on github), and
* including an entry for `### ABI Compatibility` if it doesn't exist,
* sets `_PKG_VERSION_IS_RELEASE` to `true` in `configure.ac`, and,
* if this is not a patch release,
* updates `_PKG_VERSION_*` and `_LIB_VERSION_*` in `configure.ac`, and
* updates `project(libsecp256k1 VERSION ...)` and `${PROJECT_NAME}_LIB_VERSION_*` in `CMakeLists.txt`.
2. After the PR is merged, tag the commit and push it:
2. Perform the [sanity checks](#sanity-checks) on the PR branch.
3. After the PR is merged, tag the commit, and push the tag:
```
RELEASE_COMMIT=<merge commit of step 1>
git tag -s v$MAJOR.$MINOR.$PATCH -m "libsecp256k1 $MAJOR.$MINOR.$PATCH" $RELEASE_COMMIT
git push git@github.com:bitcoin-core/secp256k1.git v$MAJOR.$MINOR.$PATCH
```
3. Open a PR to the master branch with a commit (using message `"release cleanup: bump version after $MAJOR.$MINOR.$PATCH"`, for example) that
* sets `_PKG_VERSION_IS_RELEASE` to `false` and increments `_PKG_VERSION_PATCH` and `_LIB_VERSION_REVISION` in `configure.ac`, and
* increments the `$PATCH` component of `project(libsecp256k1 VERSION ...)` and `${PROJECT_NAME}_LIB_VERSION_REVISION` in `CMakeLists.txt`.
4. Open a PR to the master branch with a commit (using message `"release cleanup: bump version after $MAJOR.$MINOR.$PATCH"`, for example) that
* sets `_PKG_VERSION_IS_RELEASE` to `false` and increments `_PKG_VERSION_PATCH` and `_LIB_VERSION_REVISION` in `configure.ac`,
* increments the `$PATCH` component of `project(libsecp256k1 VERSION ...)` and `${PROJECT_NAME}_LIB_VERSION_REVISION` in `CMakeLists.txt`, and
* adds an `[Unreleased]` section header to the [CHANGELOG.md](../CHANGELOG.md).
If other maintainers are not present to approve the PR, it can be merged without ACKs.
4. Create a new GitHub release with a link to the corresponding entry in [CHANGELOG.md](../CHANGELOG.md).
5. Create a new GitHub release with a link to the corresponding entry in [CHANGELOG.md](../CHANGELOG.md).
6. Send an announcement email to the bitcoin-dev mailing list.
## Maintenance release
Note that bugfixes only need to be backported to releases for which no compatible release without the bug exists.
Note that bug fixes need to be backported only to releases for which no compatible release without the bug exists.
1. If `$PATCH = 1`, create maintenance branch `$MAJOR.$MINOR`:
1. If there's no maintenance branch `$MAJOR.$MINOR`, create one:
```
git checkout -b $MAJOR.$MINOR v$MAJOR.$MINOR.0
git checkout -b $MAJOR.$MINOR v$MAJOR.$MINOR.$((PATCH - 1))
git push git@github.com:bitcoin-core/secp256k1.git $MAJOR.$MINOR
```
2. Open a pull request to the `$MAJOR.$MINOR` branch that
* includes the bugfixes,
* finalizes the release notes,
* includes the bug fixes,
* finalizes the release notes similar to a regular release,
* increments `_PKG_VERSION_PATCH` and `_LIB_VERSION_REVISION` in `configure.ac`
and the `$PATCH` component of `project(libsecp256k1 VERSION ...)` and `${PROJECT_NAME}_LIB_VERSION_REVISION` in `CMakeLists.txt`
(with commit message `"release: bump versions for $MAJOR.$MINOR.$PATCH"`, for example).
3. After the PRs are merged, update the release branch and tag the commit:
3. Perform the [sanity checks](#sanity-checks) on the PR branch.
4. After the PRs are merged, update the release branch, tag the commit, and push the tag:
```
git checkout $MAJOR.$MINOR && git pull
git tag -s v$MAJOR.$MINOR.$PATCH -m "libsecp256k1 $MAJOR.$MINOR.$PATCH"
```
4. Push tag:
```
git push git@github.com:bitcoin-core/secp256k1.git v$MAJOR.$MINOR.$PATCH
```
5. Create a new GitHub release with a link to the corresponding entry in [CHANGELOG.md](../CHANGELOG.md).
6. Open PR to the master branch that includes a commit (with commit message `"release notes: add $MAJOR.$MINOR.$PATCH"`, for example) that adds release notes to [CHANGELOG.md](../CHANGELOG.md).
6. Create a new GitHub release with a link to the corresponding entry in [CHANGELOG.md](../CHANGELOG.md).
7. Send an announcement email to the bitcoin-dev mailing list.
8. Open PR to the master branch that includes a commit (with commit message `"release notes: add $MAJOR.$MINOR.$PATCH"`, for example) that adds release notes to [CHANGELOG.md](../CHANGELOG.md).


@@ -1,27 +1,31 @@
add_library(example INTERFACE)
target_include_directories(example INTERFACE
${PROJECT_SOURCE_DIR}/include
)
target_link_libraries(example INTERFACE
secp256k1
$<$<PLATFORM_ID:Windows>:bcrypt>
)
if(NOT BUILD_SHARED_LIBS AND MSVC)
target_link_options(example INTERFACE /IGNORE:4217)
endif()
function(add_example name)
set(target_name ${name}_example)
add_executable(${target_name} ${name}.c)
target_include_directories(${target_name} PRIVATE
${PROJECT_SOURCE_DIR}/include
)
target_link_libraries(${target_name}
secp256k1
$<$<PLATFORM_ID:Windows>:bcrypt>
)
set(test_name ${name}_example)
add_test(NAME secp256k1_${test_name} COMMAND ${target_name})
endfunction()
add_executable(ecdsa_example ecdsa.c)
target_link_libraries(ecdsa_example example)
add_test(NAME ecdsa_example COMMAND ecdsa_example)
add_example(ecdsa)
if(SECP256K1_ENABLE_MODULE_ECDH)
add_executable(ecdh_example ecdh.c)
target_link_libraries(ecdh_example example)
add_test(NAME ecdh_example COMMAND ecdh_example)
add_example(ecdh)
endif()
if(SECP256K1_ENABLE_MODULE_SCHNORRSIG)
add_executable(schnorr_example schnorr.c)
target_link_libraries(schnorr_example example)
add_test(NAME schnorr_example COMMAND schnorr_example)
add_example(schnorr)
endif()
if(SECP256K1_ENABLE_MODULE_ELLSWIFT)
add_example(ellswift)
endif()
if(SECP256K1_ENABLE_MODULE_MUSIG)
add_example(musig)
endif()


@@ -42,18 +42,16 @@ int main(void) {
assert(return_val);
/*** Key Generation ***/
/* If the secret key is zero or out of range (bigger than secp256k1's
* order), we try to sample a new key. Note that the probability of this
* happening is negligible. */
while (1) {
if (!fill_random(seckey1, sizeof(seckey1)) || !fill_random(seckey2, sizeof(seckey2))) {
printf("Failed to generate randomness\n");
return 1;
}
if (secp256k1_ec_seckey_verify(ctx, seckey1) && secp256k1_ec_seckey_verify(ctx, seckey2)) {
break;
}
if (!fill_random(seckey1, sizeof(seckey1)) || !fill_random(seckey2, sizeof(seckey2))) {
printf("Failed to generate randomness\n");
return 1;
}
/* If the secret key is zero or out of range (greater than secp256k1's
* order), we fail. Note that the probability of this occurring is negligible
* with a properly functioning random number generator. */
if (!secp256k1_ec_seckey_verify(ctx, seckey1) || !secp256k1_ec_seckey_verify(ctx, seckey2)) {
printf("Generated secret key is invalid. This indicates an issue with the random number generator.\n");
return 1;
}
/* Public key creation using a valid context with a verified secret key should never fail */
@@ -108,7 +106,7 @@ int main(void) {
/* It's best practice to try to clear secrets from memory after using them.
* This is done because some bugs can allow an attacker to leak memory, for
* example through "out of bounds" array access (see Heartbleed), Or the OS
* example through "out of bounds" array access (see Heartbleed), or the OS
* swapping them to disk. Hence, we overwrite the secret key buffer with zeros.
*
* Here we are preventing these writes from being optimized out, as any good compiler


@@ -49,18 +49,16 @@ int main(void) {
assert(return_val);
/*** Key Generation ***/
/* If the secret key is zero or out of range (bigger than secp256k1's
* order), we try to sample a new key. Note that the probability of this
* happening is negligible. */
while (1) {
if (!fill_random(seckey, sizeof(seckey))) {
printf("Failed to generate randomness\n");
return 1;
}
if (secp256k1_ec_seckey_verify(ctx, seckey)) {
break;
}
if (!fill_random(seckey, sizeof(seckey))) {
printf("Failed to generate randomness\n");
return 1;
}
/* If the secret key is zero or out of range (greater than secp256k1's
* order), we fail. Note that the probability of this occurring is negligible
* with a properly functioning random number generator. */
if (!secp256k1_ec_seckey_verify(ctx, seckey)) {
printf("Generated secret key is invalid. This indicates an issue with the random number generator.\n");
return 1;
}
/* Public key creation using a valid context with a verified secret key should never fail */
@@ -128,7 +126,7 @@ int main(void) {
/* It's best practice to try to clear secrets from memory after using them.
* This is done because some bugs can allow an attacker to leak memory, for
* example through "out of bounds" array access (see Heartbleed), Or the OS
* example through "out of bounds" array access (see Heartbleed), or the OS
* swapping them to disk. Hence, we overwrite the secret key buffer with zeros.
*
* Here we are preventing these writes from being optimized out, as any good compiler

external/secp256k1/examples/ellswift.c vendored Normal file

@@ -0,0 +1,121 @@
/*************************************************************************
* Written in 2024 by Sebastian Falbesoner *
* To the extent possible under law, the author(s) have dedicated all *
* copyright and related and neighboring rights to the software in this *
* file to the public domain worldwide. This software is distributed *
* without any warranty. For the CC0 Public Domain Dedication, see *
* EXAMPLES_COPYING or https://creativecommons.org/publicdomain/zero/1.0 *
*************************************************************************/
/** This file demonstrates how to use the ElligatorSwift module to perform
* a key exchange according to BIP 324. Additionally, see the documentation
* in include/secp256k1_ellswift.h and doc/ellswift.md.
*/
#include <stdio.h>
#include <assert.h>
#include <string.h>
#include <secp256k1.h>
#include <secp256k1_ellswift.h>
#include "examples_util.h"
int main(void) {
secp256k1_context* ctx;
unsigned char randomize[32];
unsigned char auxrand1[32];
unsigned char auxrand2[32];
unsigned char seckey1[32];
unsigned char seckey2[32];
unsigned char ellswift_pubkey1[64];
unsigned char ellswift_pubkey2[64];
unsigned char shared_secret1[32];
unsigned char shared_secret2[32];
int return_val;
/* Create a secp256k1 context */
ctx = secp256k1_context_create(SECP256K1_CONTEXT_NONE);
if (!fill_random(randomize, sizeof(randomize))) {
printf("Failed to generate randomness\n");
return 1;
}
/* Randomizing the context is recommended to protect against side-channel
* leakage. See `secp256k1_context_randomize` in secp256k1.h for more
* information about it. This should never fail. */
return_val = secp256k1_context_randomize(ctx, randomize);
assert(return_val);
/*** Generate secret keys ***/
if (!fill_random(seckey1, sizeof(seckey1)) || !fill_random(seckey2, sizeof(seckey2))) {
printf("Failed to generate randomness\n");
return 1;
}
/* If the secret key is zero or out of range (greater than secp256k1's
* order), we fail. Note that the probability of this occurring is negligible
* with a properly functioning random number generator. */
if (!secp256k1_ec_seckey_verify(ctx, seckey1) || !secp256k1_ec_seckey_verify(ctx, seckey2)) {
printf("Generated secret key is invalid. This indicates an issue with the random number generator.\n");
return 1;
}
/* Generate ElligatorSwift public keys. This should never fail with valid context and
verified secret keys. Note that providing additional randomness (fourth parameter) is
optional, but recommended. */
if (!fill_random(auxrand1, sizeof(auxrand1)) || !fill_random(auxrand2, sizeof(auxrand2))) {
printf("Failed to generate randomness\n");
return 1;
}
return_val = secp256k1_ellswift_create(ctx, ellswift_pubkey1, seckey1, auxrand1);
assert(return_val);
return_val = secp256k1_ellswift_create(ctx, ellswift_pubkey2, seckey2, auxrand2);
assert(return_val);
/*** Create the shared secret on each side ***/
/* Perform x-only ECDH with seckey1 and ellswift_pubkey2. Should never fail
* with a verified seckey and valid pubkey. Note that both parties pass both
* EllSwift pubkeys in the same order; the pubkey of the calling party is
* determined by the "party" boolean (sixth parameter). */
return_val = secp256k1_ellswift_xdh(ctx, shared_secret1, ellswift_pubkey1, ellswift_pubkey2,
seckey1, 0, secp256k1_ellswift_xdh_hash_function_bip324, NULL);
assert(return_val);
/* Perform x-only ECDH with seckey2 and ellswift_pubkey1. Should never fail
* with a verified seckey and valid pubkey. */
return_val = secp256k1_ellswift_xdh(ctx, shared_secret2, ellswift_pubkey1, ellswift_pubkey2,
seckey2, 1, secp256k1_ellswift_xdh_hash_function_bip324, NULL);
assert(return_val);
/* Both parties should end up with the same shared secret */
return_val = memcmp(shared_secret1, shared_secret2, sizeof(shared_secret1));
assert(return_val == 0);
printf( " Secret Key1: ");
print_hex(seckey1, sizeof(seckey1));
printf( "EllSwift Pubkey1: ");
print_hex(ellswift_pubkey1, sizeof(ellswift_pubkey1));
printf("\n Secret Key2: ");
print_hex(seckey2, sizeof(seckey2));
printf( "EllSwift Pubkey2: ");
print_hex(ellswift_pubkey2, sizeof(ellswift_pubkey2));
printf("\n Shared Secret: ");
print_hex(shared_secret1, sizeof(shared_secret1));
/* This will clear everything from the context and free the memory */
secp256k1_context_destroy(ctx);
/* It's best practice to try to clear secrets from memory after using them.
* This is done because some bugs can allow an attacker to leak memory, for
* example through "out of bounds" array access (see Heartbleed), or the OS
* swapping them to disk. Hence, we overwrite the secret key buffer with zeros.
*
* Here we are preventing these writes from being optimized out, as any good compiler
* will remove any writes that aren't used. */
secure_erase(seckey1, sizeof(seckey1));
secure_erase(seckey2, sizeof(seckey2));
secure_erase(shared_secret1, sizeof(shared_secret1));
secure_erase(shared_secret2, sizeof(shared_secret2));
return 0;
}


@@ -95,7 +95,7 @@ static void secure_erase(void *ptr, size_t len) {
* As best as we can tell, this is sufficient to break any optimisations that
* might try to eliminate "superfluous" memsets.
* This method used in memzero_explicit() the Linux kernel, too. Its advantage is that it is
* pretty efficient, because the compiler can still implement the memset() efficently,
* pretty efficient, because the compiler can still implement the memset() efficiently,
* just not remove it entirely. See "Dead Store Elimination (Still) Considered Harmful" by
* Yang et al. (USENIX Security 2017) for more background.
*/

Some files were not shown because too many files have changed in this diff.