Compare commits


185 Commits
1.3.0 ... 1.5.0

Author SHA1 Message Date
Carl Hua
f00f263852 Set version to 1.5.0 2020-03-30 18:57:35 -04:00
Carl Hua
b54d672710 Set version to 1.5.0-rc3 2020-03-24 14:42:45 -05:00
Mark Travis
5047b89922 Allow non-root users to build packages. 2020-03-24 11:36:00 -07:00
seelabs
78ea75b116 Remove incorrect STAmount assert:
When computing rates for offers, an STAmount's value can be out of
range (before canonicalizing). There was an assert that could incorrectly fire
in some cases. This patch removes that assert.
2020-03-23 18:17:00 -07:00
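A minimal sketch of the hazard described above, using a hypothetical Amount type rather than the rippled STAmount code: a range invariant only holds after canonicalization, so asserting it beforehand can fire on otherwise valid values.

    #include <cassert>
    #include <cstdint>

    // Hypothetical fixed-point amount; the mantissa is only guaranteed to be
    // in [10^15, 10^16) after canonicalize() has run.
    struct Amount
    {
        std::int64_t mantissa;
        int exponent;

        void canonicalize()
        {
            while (mantissa != 0 && mantissa < 1000000000000000LL)
            {
                mantissa *= 10;
                --exponent;
            }
        }
    };

    int main()
    {
        Amount a{123, 0};
        // assert(a.mantissa >= 1000000000000000LL);  // would fire: not canonical yet
        a.canonicalize();
        assert(a.mantissa >= 1000000000000000LL);     // holds once canonicalized
    }
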
Carl Hua
c0964832f5 Fix failing Travis CI Windows VS2019 build
The newest MSVC 19.25.28610.4 does not build rocksdb. During the
Travis CI Windows job, the vs_BuildTools.exe automatically
downloads the newest version of the compiler. This fix forces the
install of MSVC 19.24.28314.0 to build rocksdb.
2020-03-23 15:43:07 -07:00
Manoj doshi
4e3dc0e820 Set version to 1.5.0-rc2 2020-03-10 16:08:42 -07:00
CJ Cobb
2a2ad898b1 Set applied field to true even if engine result is not tesSUCCESS 2020-03-10 16:07:10 -07:00
Manoj doshi
b26ed948c8 Set version to 1.5.0-rc1 2020-03-06 15:12:51 -08:00
Manoj doshi
ba53722425 Set version to 1.5.0-b8 2020-03-06 15:10:49 -08:00
seelabs
4852daf695 Remove entry for fix1781 amendment:
The fix1781 amendment was incorrectly introduced during conflict
resolution and support for it is not included at this time. This commit
removes the definition of the amendment identifier.
2020-03-06 14:59:39 -08:00
Carl Hua
2d1be70ba2 Set version to 1.5.0-b7 2020-03-05 14:12:18 -05:00
seelabs
b126641f13 Clarify the code a comment applies to:
The `seq & 0xff` does not depend on `diff`, so does not belong in the block that
defines `diff`.
2020-03-05 14:10:31 -05:00
Nik Bougalis
6529d3e6f7 Use ephemeral validator public keys in lag ratchet:
A review of the lag ratchet code revealed that we were using
the long-term master public keys of trusted validators, when
we should have been using the ephemeral public keys instead.

As a result, the lag ratchet code would be effectively
inoperable.
2020-03-05 14:10:25 -05:00
Nik Bougalis
16c659b22c Simplify HashPrefix by converting to an enum 2020-03-05 14:10:20 -05:00
Mo Morsi
ec137044a0 Protocol Amendment: Always Require Fully-Canonical Signatures 2020-03-05 14:10:06 -05:00
John Freeman
053b6d9fd3 Include field name when diagnosing field not found:
Closes #3263
2020-03-05 14:08:07 -05:00
Mike Ellery
f76a5a3183 Set version to 1.5.0-b6
closes: #3226
closes: #3249
closes: #3260
closes: #3254
closes: #3258
closes: #3267
2020-02-25 19:22:00 -08:00
Elliot Lee
cfe59ee028 Fix typos in comments 2020-02-25 19:21:55 -08:00
John Freeman
20d4a39b63 Cooperate with CMAKE_MODULE_PATH set by Conan 2020-02-25 19:21:55 -08:00
CJ Cobb
e7ce3909d2 gRPC support for account_tx and tx
- Add support for all transaction types and ledger object types to gRPC
  implementation of tx and account_tx.

- Create common handlers for tx and account_tx.

- Remove mutex and abort() from gRPC server. JobQueue is stopped before
  gRPC server, with all coroutines executed to completion, so no need for
  synchronization.
2020-02-25 19:21:55 -08:00
Mike Ellery
acf4b78892 Add getRippledInfo to packages:
* include info gathering script in rippled linux pkgs
* modify script to output results to single markdown file
* rename info script
2020-02-25 19:21:55 -08:00
Mike Ellery
e9b3c58043 Use native cmake unity support:
* Remove hand-rolled unity sources
* unity unavailable for cmake 3.15 and earlier
* remove deprecated target configure option
2020-02-25 19:21:50 -08:00
seelabs
455105d3dc Limit account_channels to channels owned by the source account 2020-02-24 15:30:58 -08:00
seelabs
2dac108c57 Fix account_channels marker functionality 2020-02-24 15:30:58 -08:00
Nik Bougalis
0c6d380780 Set version to 1.5.0-b5 2020-02-12 11:30:45 -08:00
Edward Hennis
2c71802e38 Propagate validator lists (VLs or UNLs) over the peer network:
* Whenever a node downloads a new VL, send it to all peers that
  haven't already sent or received it. It also saves it to the
  database_dir as a Json text file named "cache." plus the public key of
  the list signer. Any files that exist for public keys provided in
  [validator_list_keys] will be loaded and processed if any download
  from [validator_list_sites] fails or no [validator_list_sites] are
  configured.
* Whenever a node receives a broadcast VL message, it treats it as if
  it had downloaded it on its own, broadcasting to other peers as
  described above.
* Because nodes normally download the VL once every 5 minutes, a single
  node downloading a VL with an updated sequence number could
  potentially propagate across a large part of a well-connected network
  before any other nodes attempt to download, decreasing the amount of
  time that different parts of the network are using different VLs.
* Send all of our current valid VLs to new peers on connection.
  This is probably the "noisiest" part of this change, but will give
  poorly connected or poorly networked nodes the best chance of syncing
  quickly. Nodes which have no http(s) access configured or available
  can get a VL with no extra effort.
* Requests on the peer port to the /vl/<pubkey> endpoint will return
  that VL in the same JSON format as is used to download now, IF the
  node trusts and has a valid instance of that VL.
* Upgrade protocol version to 2.1. VLs will only be sent to 2.1 and
  higher nodes.
* Resolves #2953
2020-02-12 10:19:23 -08:00
Edward Hennis
3753840de5 Handle build directories for specific versions of gcc/clang:
* Example: gcc.Debug will use the default version of gcc installed on the
  system. gcc-9.Debug will use version 9, regardless of the default. This will
  be most useful when the default is older than required or desired.
2020-02-12 10:19:23 -08:00
Edward Hennis
5ff23f8f31 Warn operators about upcoming unknown amendments:
* When an unknown amendment reaches majority, log an error-level
  message, and return a `warnings` array on all successful
  admin-level RPC calls to `server_info` and `server_state` with
  a message describing the problem, and the expected deadline.
* In addition to the `amendment_blocked` flag returned by
  `server_info` and `server_state`, return a warning with a more
  verbose description when the server is amendment blocked.
* Check on every flag ledger to see if the amendment(s) lose majority.
  Logs again if they don't and resumes normal operations if they do.

The intention is to give operators earlier warning that their
instances are in danger of being amendment blocked, which will
hopefully motivate them to update ahead of time.
2020-02-12 10:19:23 -08:00
Mike Ellery
4315913a5d Update RocksDB to 6.5:
* update EP and find package requirements
* minor protobuf/libarchive build fixes
* change travis release builds to nounity to
  ameliorate vm memory exhaustion.

FIXES: #3223, #3232
2020-02-12 10:19:23 -08:00
mbhandary
facb627786 Improve reporting of StateAccounting metrics:
* Metrics are now exported over insight.
* Fixes a minor bug that affected the reporting of gauges
2020-02-12 10:19:23 -08:00
mbhandary
b784988caf Added support for statsD Traffic Counts reporting 2020-02-12 10:19:23 -08:00
Nik Bougalis
172ead8221 Remove unused 'old Beast' code (RIPD-1652) 2020-02-11 19:14:24 -08:00
p2peer
d224d7e404 Switch to Boost.Beast for SSL detection (#3166) 2020-02-11 19:14:24 -08:00
p2peer
7ea78c8517 Remove workaround for waitable timers (#3166) 2020-02-11 19:14:24 -08:00
Howard Hinnant
ce589225dd Add time zone abbreviation to all time point streaming 2020-02-11 19:14:24 -08:00
Howard Hinnant
79896af275 Qualify tolower with std:: and remove obsolete comments
* Fixes RIPD-1759
2020-02-11 19:14:24 -08:00
Scott Schurr
2831ed0538 Add unittests for peer_reservations command line interface 2020-02-11 19:14:24 -08:00
Mo Morsi
e24dd2f954 Fix broken tests (missing api_version parameter) 2020-02-11 19:14:24 -08:00
Mo Morsi
60f0f5224d Accept 'strict' param in certain CLI options
account_info, owner_info, account_currencies
2020-02-10 23:36:31 -08:00
Mo Morsi
3578acaf0b Add new validator_info and manifest rpc methods
Returns local validator details and specified manifest information
respectively. Folded and rebased on latest develop
2020-02-10 23:36:31 -08:00
Carl Hua
2f9edf495e Add issue templates 2020-02-10 23:36:31 -08:00
Manoj doshi
cc4cefaa4b Set version to 1.5.0-b4 2020-01-30 15:56:43 -08:00
seelabs
81326a6d08 Remove old payment code 2020-01-30 15:38:14 -08:00
seelabs
9d3626fec5 Fix bug in qualityUpperBound:
* In and Out parameters were swapped when calculating the rate
* In and out qualities were not calculated correctly; use existing functions
  to get the qualities
* Added tests to check that theoretical quality matches actual computed quality
* Remove in/out parameter from qualityUpperBound
* Rename an overload of qualityUpperBound to adjustQualityWithFees
* Add fix amendment
2020-01-30 13:35:52 -08:00
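An illustrative sketch of the first bullet, in plain arithmetic rather than the rippled helpers: swapping the in and out arguments when computing a rate inverts it.

    #include <iostream>

    // A "quality" here is simply output per unit of input.
    double quality(double out, double in) { return out / in; }

    int main()
    {
        double const stepIn = 200;   // amount a step consumes
        double const stepOut = 100;  // amount a step produces
        std::cout << quality(stepOut, stepIn) << '\n';  // 0.5: the intended out/in
        std::cout << quality(stepIn, stepOut) << '\n';  // 2.0: swapped arguments invert the rate
    }
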
Scott Schurr
ae707b814f Always enable fix1449 dated March 30, 2017 20:00:00 UTC 2020-01-30 13:22:54 -08:00
Scott Schurr
9c580b20a3 Always enable fix1443 dated March 12, 2017 01:00:00 UTC 2020-01-30 13:22:49 -08:00
Scott Schurr
70c21f90b9 Always enable fix1298 dated December 21, 2016 18:00:00 UTC 2020-01-30 13:22:43 -08:00
Scott Schurr
0964379a66 Always enable fix1274 dated September 30, 2016 17:00:00 UTC 2020-01-30 13:22:34 -08:00
Scott Schurr
a176f58a92 Always enable fix1141 dated July 1, 2016 17:00:00 UTC 2020-01-30 13:22:26 -08:00
Scott Schurr
fc0a082700 Remove STAmountSO::soTime and soTime2:
STAmount::soTime and soTime2 were time based "amendment like"
switches to control small changes in behavior for STAmount.
soTime2, which was the most recent, was dated Feb 27, 2016.
That's over 3 years ago.

The main reason to retain these soTimes would be to replay
old transactions.  The likelihood of needing to replay a
transaction from over three years ago is pretty low.  So it
makes sense to remove these soTime values.

In Flow_test the testZeroOutputStep() test is removed.  That
test started to fail when the STAmount::soTimes were removed.
I checked with the original author of the test.  He said
that the code being tested by that unit test has been removed,
so it makes sense to remove the test as well.
2020-01-30 13:22:19 -08:00
Scott Schurr
f7fffee28d Warn on replay of ledgers from before Jan 1 2018 2020-01-30 13:22:08 -08:00
Scott Schurr
51ed7db002 Remove conditionals for featureTrustSetAuth enabled 19Jul2016 2020-01-30 13:20:01 -08:00
Scott Schurr
6e4945c56b Remove conditionals for featureMultiSign enabled 27Jun2016 2020-01-30 13:19:55 -08:00
Scott Schurr
c48be14f4f Remove comments about featureFeeEscalation enabled 19May2016 2020-01-30 13:19:48 -08:00
Jeroen Meulemeester
ecbbabfc3c Reduce log level of xrp/xrp offer payment step
This fixes #3181
2020-01-30 13:18:44 -08:00
Mark Travis
687cdbb78b Set start/stop logging for online_delete to appropriate verbosity level. 2020-01-30 13:18:35 -08:00
Mike Ellery
098086592b Resolve minor build issues:
- allow SOCI EP to build with vcpkg
- fix container tagging in pkg build
- correct libarchive name for win
2020-01-30 13:18:18 -08:00
Carl Hua
90d9ca901d Set version to 1.5.0-b3 2020-01-12 13:02:17 -05:00
Mike Ellery
eb016456a1 Streamline pkg and travis CI:
* use tagged containers for pkg build
* update build images
* continue to build container images in pipeline, but allow
  failure (non-block)
* limit travis macos cache
* add vs2019 windows to travis
* remove xcode 9 travis build
* remove clang5/6 from CI and update min version of Clang required in
  cmake
* break windows CI build into stages to reduce timeouts
* update datelib
* add if condition to travis builds to allow commit message to limit
  builds by platform
2020-01-12 07:26:19 -08:00
Devon White
cd9732b47a Change how fail_hard transactions are handled.
FIXES: #2847

* Transactions that are submitted with the fail_hard flag
  and that result in any TER code besides tesSUCCESS shall
  be neither queued nor held.

[FOLD] Keep tec results out of the open ledger when fail_hard:

* Improve TransactionStatus const correctness, and remove redundant
  `local` check
* Check open ledger tx count in fail_hard tests
* Fix some wrapping
* Remove duplicate test
2020-01-10 12:40:31 -08:00
CJ Cobb
7d867b806d Add gRPC support (#3127):
* add support for AccountInfo, Fee and Submit RPCs

* add partial support for Tx RPC (only supports Payments)
2020-01-10 12:31:24 -08:00
seelabs
761bb5744e Make XRPAmount constructor explicit:
Remove the implicit conversion from int64 to XRPAmount. The motivation for this
was noticing that many calls to `to_string` with an integer parameter type were
calling the wrong `to_string` function. Since the calls were not prefixed with
`std::`, and there is no ADL to call `std::to_string`, this was converting the
int to an `XRPAmount` and calling `to_string(XRPAmount)`.

Since `to_string(XRPAmount)` did the same thing as `to_string(int)` this error
went undetected.
2020-01-08 16:10:25 -08:00
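A minimal reproduction of the overload-resolution pitfall described above, with a simplified XRPAmount and to_string; the names and bodies are illustrative, not the rippled sources.

    #include <cstdint>
    #include <iostream>
    #include <string>

    struct XRPAmount
    {
        std::int64_t drops;
        XRPAmount(std::int64_t d) : drops(d) {}  // implicit conversion: the source of the bug
    };

    std::string to_string(XRPAmount const& a)
    {
        return std::to_string(a.drops);
    }

    int main()
    {
        std::int64_t fee = 10;
        // Without the std:: prefix unqualified lookup cannot reach std::to_string,
        // so the integer silently converts to XRPAmount and the overload above is
        // chosen. Because both produce the same text, nothing looked wrong.
        // Declaring the constructor explicit turns this call into a compile error,
        // forcing the caller to write std::to_string(fee) or to_string(XRPAmount{fee}).
        std::cout << to_string(fee) << '\n';
    }
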
Edward Hennis
e3b5b808c5 Add units to all fee calculations:
* Uses existing XRPAmount with units for drops, and a new TaggedFee for
  fee units (LoadFeeTrack), and fee levels (TxQ).
* Resolves #2451
2020-01-08 18:44:01 -05:00
Edward Hennis
1901b981f3 Convert tuning params to constexpr 2020-01-08 17:58:47 -05:00
Nik Bougalis
c624fc9b17 Set version to 1.5.0-b2 2020-01-02 13:29:22 -08:00
mbhandary
9294a07362 Report the validated and publish ledger ages over insight 2020-01-01 18:12:55 -08:00
mbhandary
9fcc30df89 Improve insight reporting of job queue timing:
Prior to this commit, the queue and execution times for individual jobs
were reported independently and could, potentially, be out of sync. This
change reports both values when either one of them exceeds the reporting
threshold.
2020-01-01 18:12:55 -08:00
Elliot Lee
1f5d9404d0 Update comments in TER.h 2020-01-01 18:12:55 -08:00
seelabs
5096cc1f5f Increase the deadlock detection timeout:
It's possible an overloaded job queue is causing false alarms on the deadlock
detector. Log a fatal message after 90s, declare a logic error after 600s.
2020-01-01 18:12:55 -08:00
Peng Wang
2aa11fa41d Support API versioning 2020-01-01 18:12:55 -08:00
p2peer
79e9085dd1 Augment "submit" command response:
If merged, this commit will report additional information in the
response to the submit command; this will make it easier for developers
to accurately track the status of transaction submission.

Fixes #2851
2020-01-01 18:12:38 -08:00
Mike Ellery
14f0234a26 Allow trailing comments in config file:
Treat all `#` characters in config files as comments (and remove)
*unless* the `#` is immediately preceded by `\`. Write a warning
to log file when trailing comments are found/ignored in the config
to let operators know that the treatment of trailing `#` has changed.

Fixes #3121
2020-01-01 18:12:38 -08:00
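A rough sketch of the stripping rule described above, using a hypothetical helper; rippled's actual config parser differs, and this toy version leaves the escaping backslash in place.

    #include <cstddef>
    #include <iostream>
    #include <string>

    // Drop a trailing '#' comment unless the '#' is escaped as "\#".
    std::string stripTrailingComment(std::string const& line)
    {
        for (std::size_t i = 0; i < line.size(); ++i)
        {
            if (line[i] == '#' && (i == 0 || line[i - 1] != '\\'))
                return line.substr(0, i);
        }
        return line;
    }

    int main()
    {
        std::cout << stripTrailingComment("port_rpc_admin_local   # trailing comment") << '\n';
        std::cout << stripTrailingComment("secret\\#notacomment") << '\n';  // escaped '#' is kept
    }
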
Howard Hinnant
a65c91a676 Backwards compatible workaround for boost 1.72 2019-12-30 20:20:34 -08:00
Nik Bougalis
607328e1a0 Improve the 'network_id' configuration option:
The 'network_id' option allows an administrator to specify to which
network they intend a server to connect. Servers can leverage this
information to optimize routing and prune automatically discovered
cross-network connections.

This commit will, if merged:

- add support for the devnet keyword, which corresponds to network ID #2;
- report the network ID, if one is configured, in server_info
2019-12-30 20:20:34 -08:00
Nik Bougalis
a96dc2ecea Remove unused configuration option 2019-12-30 20:20:33 -08:00
Nik Bougalis
3a77990781 Improve automatic I/O thread tuning algorithm 2019-12-30 20:20:33 -08:00
Nik Bougalis
4bb951d48e Fix node auto-configuration code:
The `node_size` configuration option is used to automatically
configure various parameters (cache sizes, timeouts, etc) for
the server.

A previous commit included changes that caused incorrect values
to be returned which can result in sub-optimal performance that
can manifest as difficulty syncing to the network, or increased
disk I/O and/or memory usage. The problem was introduced with
commit 66fad62e66.

This commit, if merged, fixes the code to ensure that the correct
values are returned and introduces a compile-time check to prevent
this issue from reoccurring.
2019-12-30 20:20:32 -08:00
Nik Bougalis
63503ee8f0 Improve platform detection and reduce includes:
The existing platform detection code was derived from the old Beast
library, which was, itself, derived from JUCE.

This commit removes that code and replaces it with the Boost.Predef
library which defines a consistent set of compiler, architecture,
operating system, library, and other version numbers.

For more on Boost.Predef, please see the Boost documentation. The
documentation for the current version as of this writing is at:
https://www.boost.org/doc/libs/1_71_0/doc/html/predef.html
2019-12-30 20:20:31 -08:00
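For reference, a small standalone example of the Boost.Predef macros this change adopts (macro names as documented by Boost):

    #include <boost/predef.h>
    #include <iostream>

    int main()
    {
    #if BOOST_OS_LINUX
        std::cout << "Linux\n";
    #elif BOOST_OS_MACOS
        std::cout << "macOS\n";
    #elif BOOST_OS_WINDOWS
        std::cout << "Windows\n";
    #endif
    #if BOOST_COMP_GNUC
        std::cout << "built with GCC\n";
    #endif
    }
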
Nik Bougalis
4a1148eb28 Set version to 1.5.0-b1 2019-11-28 09:46:20 -08:00
Nik Bougalis
f6916bfd42 Improve protocol-level handshaking protocol:
This commit restructures the HTTP based protocol negotiation that `rippled`
executes and introduces support for negotiation of compression for peer
links which, if implemented, should result in significant bandwidth savings
for some server roles.

This commit also introduces the new `[network_id]` configuration option
that administrators can use to specify which network the server is part of
and intends to join. This makes it possible for servers from different
networks to drop the link early.

The changeset also improves the log messages generated when negotiation
of a peer link upgrade fails. In the past, no useful information would
be logged, making it more difficult for admins to troubleshoot errors.

This commit also fixes RIPD-237 and RIPD-451
2019-11-28 09:46:17 -08:00
Scott Schurr
3ea525430e Improve Json::Value:
o Increase test coverage.
o Remove unused and/or broken interfaces.
2019-11-27 17:28:15 -08:00
John Northrup
fb0d065723 Move signing of deb to use Ubuntu 18.04 LTS 2019-11-27 17:28:15 -08:00
Devon White
47501b7f99 Provide additional info with txnNotFound errors.
* The `tx` command now supports min_ledger and max_ledger fields.
* If the requested transaction isn't found and these fields are
  provided, the error response indicates whether or not every
  ledger in the provided range was searched.

This fixes #2924
2019-11-27 17:28:15 -08:00
ShangyanLi
11cf27e006 Skip validity check for tx command:
* historical tx retrieval no longer needs to pass current validity check;
* introduced additional RPC error code for database deserialization error.

This fixes issue #2597
2019-11-27 17:28:15 -08:00
Mark Travis
ade1afe1b0 Support multiple proxies in X-Forwarded-For header 2019-11-27 17:28:15 -08:00
Mo Morsi
6cda070fe0 Search for soci before vendoring 2019-11-27 16:58:56 -08:00
Mo Morsi
9bd470f2c7 Search for sqlite3 before vendoring 2019-11-27 16:58:56 -08:00
Mo Morsi
530f1939d6 Search for system snappy before vendoring 2019-11-27 16:58:56 -08:00
Mo Morsi
e4ea3752ac Search for system secp256k1 before vendoring 2019-11-27 16:58:56 -08:00
Mo Morsi
b728bf0d09 Search for system libarchive before vendoring 2019-11-27 16:58:56 -08:00
Mo Morsi
194fb2b86d Use find_package to search for system lz4 before vendoring 2019-11-27 16:58:56 -08:00
Mike Ellery
ce5d901e6e Switch to official date lib repo for date.h 2019-11-27 16:58:56 -08:00
Howard Hinnant
14075630d4 Remove per-process hash seeding for gcc.
* gcc now seeds hardened_hash per instance.

* Other platforms were already doing this.
2019-11-27 16:58:56 -08:00
Miguel Portilla
5c1dd87fab Make class members journal const 2019-11-27 16:58:56 -08:00
Nik Bougalis
06c371544a Set version to 1.4.0 2019-11-26 10:35:54 -08:00
Nik Bougalis
ffa0f048f3 Set version to 1.4.0-rc2 2019-11-25 21:51:54 -08:00
Mike Ellery
22ca0041b8 Use signed artifacts in pkg pipeline 2019-11-25 21:51:12 -08:00
Manoj doshi
232975bfdb Set version to 1.4.0-rc1 2019-11-04 11:45:31 -08:00
Howard Hinnant
4e84868747 Document the minimum GCC version as 7 2019-11-04 11:39:19 -08:00
John Northrup
9e1ccb900e GPG Sign DEB and RPM packages generated by build pipeline (#3144)
* adding package signing steps for rpm and deb

* first spike at GPG signing with CI and containers

* refine ubuntu portion

* get correct gpg package version

* adding CentOS support

* fixing errors in installing gpg on ubuntu

* base64 decode the GPG key

* fixing line continuations

* revised package signing, looking for package artifacts

* add dpkg-sig to ubuntu image

* sign all deb packages

* add passphrase to GPG process

* repeat yourself on dpkg

* sign all the rpm packages too

* install rpm-sign in the CentOS docker image

* loop through rpm files

* no need for PIN on GPG signing
2019-11-03 12:08:40 -06:00
Manoj doshi
98d55b988f Set version to 1.4.0-b8 2019-10-30 12:24:25 -07:00
Mike Ellery
e720a66076 Ensure permissions for initial config 2019-10-30 12:23:58 -07:00
Joseph Busch
7be7343e05 Make the HTTP response size const 2019-10-30 12:23:58 -07:00
seelabs
e26dd7bdfe Reject overlong encodings earlier 2019-10-30 12:23:58 -07:00
seelabs
906b9ae00b Don't use set in AccountObjects test:
Collecting the returned and expected values in sets only works if there are no
duplicates. The implementation is changed to use sorted vectors to fix this case.
2019-10-30 12:23:58 -07:00
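A small self-contained illustration of the pitfall: a std::set collapses duplicates, so an equality check can pass even when duplicate counts differ, while sorted vectors keep every element.

    #include <algorithm>
    #include <cassert>
    #include <set>
    #include <vector>

    int main()
    {
        std::vector<int> expected{1, 2, 2, 3};
        std::vector<int> returned{1, 2, 3, 3};

        // A std::set drops the duplicates, so this comparison wrongly succeeds.
        std::set<int> es(expected.begin(), expected.end());
        std::set<int> rs(returned.begin(), returned.end());
        assert(es == rs);

        // Sorted vectors keep every element, so the mismatch is detected.
        std::sort(expected.begin(), expected.end());
        std::sort(returned.begin(), returned.end());
        assert(expected != returned);
    }
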
Mo Morsi
15c5f9c111 Report consensus phase changes in the server subscription stream 2019-10-30 12:23:57 -07:00
Howard Hinnant
726dd69ab9 Make Env::AppBundle constructor exception safe
When the Env::AppBundle constructor throws an exception
it still needs to run ~AppBundle(), otherwise the JobQueue
isn't properly shut down. Specifically, the JobQueue
can destruct without waiting on outstanding jobs in the
queue.

This change ensures that if Env::AppBundle constructor
throws, Env::AppBundle::~AppBundle() runs.

This fixes the unit test crash exposed by PR #3047.
2019-10-30 12:23:57 -07:00
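A generic sketch of the underlying C++ rule, with hypothetical JobQueueLike/BundleLike types rather than the rippled test harness: a class's destructor never runs for a partially constructed object, so a constructor that can throw after starting a resource must shut it down itself.

    #include <iostream>
    #include <stdexcept>

    struct JobQueueLike
    {
        void start()       { std::cout << "queue started\n"; }
        void stopAndJoin() { std::cout << "queue stopped, jobs drained\n"; }
    };

    struct BundleLike
    {
        JobQueueLike queue;

        BundleLike()
        {
            queue.start();
            try
            {
                throw std::runtime_error("setup failed after the queue was started");
            }
            catch (...)
            {
                // ~BundleLike() never runs for a partially constructed object,
                // so the shutdown has to happen here before re-throwing.
                queue.stopAndJoin();
                throw;
            }
        }

        ~BundleLike() { queue.stopAndJoin(); }
    };

    int main()
    {
        try { BundleLike b; }
        catch (std::exception const& e) { std::cout << e.what() << '\n'; }
    }
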
Manoj doshi
41b2c80dde Set version to 1.4.0-b7 2019-10-18 16:45:06 -07:00
Mike Ellery
cd01502d17 Relax STTx test failure criterion:
FIXES: #3106

Different versions of protobuf produce subtly different
results when given invalid message payloads. This leads to
subtly different behavior when we try to deserialize these
invalid messages. As such, we can't tie success to a
particular exception.
2019-10-18 16:44:16 -07:00
Mike Ellery
b2317f8b41 Omit downloader resolve test when it won't fail
Fixes: #3108
2019-10-18 16:44:16 -07:00
Nik Bougalis
113167acf4 Allow channel_authorize to use Ed25519 keys 2019-10-18 16:44:16 -07:00
Nik Bougalis
a3a9dc26b4 Introduce support for deletable accounts:
The XRP Ledger utilizes an account model. Unlike systems based on a UTXO
model, XRP Ledger accounts are first-class objects. This design choice
allows the XRP Ledger to offer rich functionality, including the ability
to own objects (offers, escrows, checks, signer lists) as well as other
advanced features, such as key rotation and configurable multi-signing
without needing to change a destination address.

The trade-off is that accounts must be stored on ledger. The XRP Ledger
applies reserve requirements, in XRP, to protect the shared global ledger
from growing excessively large as the result of spam or malicious usage.

Prior to this commit, accounts had been permanent objects; once created,
they could never be deleted.

This commit introduces a new amendment "DeletableAccounts" which, if
enabled, will allow account objects to be deleted by executing the new
"AccountDelete" transaction. Any funds remaining in the account will
be transferred to an account specified in the deletion transaction.

The amendment changes the mechanics of account creation; previously
a new account would have an initial sequence number of 1. Accounts
created after the amendment will have an initial sequence number that
is equal to the ledger in which the account was created.

Accounts can only be deleted if they are not associated with any
obligations (like RippleStates, Escrows, or PayChannels) and if the
current ledger sequence number exceeds the account's sequence number
by at least 256 so that, if recreated, the account can be protected
from transaction replay.
2019-10-18 16:44:16 -07:00
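A toy illustration of the sequence-distance rule above; mayDelete and the numbers are hypothetical, meant only to show the 256-ledger gap.

    #include <cstdint>
    #include <iostream>

    // Hypothetical check mirroring the rule described above: deletion is only
    // allowed once the current ledger sequence exceeds the account's sequence
    // by at least 256.
    bool mayDelete(std::uint32_t currentLedgerSeq, std::uint32_t accountSeq)
    {
        return currentLedgerSeq >= accountSeq + 256;
    }

    int main()
    {
        std::cout << mayDelete(5000256, 5000000) << '\n';  // 1: gap of exactly 256
        std::cout << mayDelete(5000100, 5000000) << '\n';  // 0: too recent to delete
    }
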
Joseph Busch
7e7664c29a Add deletion_blockers_only param to account_objects RPC command 2019-10-18 14:18:38 -07:00
Manoj doshi
fccb7e1c70 Set version to 1.4.0-b6 2019-10-15 12:02:12 -07:00
Howard Hinnant
3f45b8c3bd Switch deadlock detector to steady_clock
* Changes to system time should not trigger the deadlock detector.
* Fixes #3101
2019-10-15 12:01:37 -07:00
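A brief example of the distinction: std::chrono::steady_clock is monotonic, so elapsed-time measurements such as a deadlock watchdog are unaffected by wall-clock adjustments.

    #include <chrono>
    #include <iostream>
    #include <thread>

    int main()
    {
        using clock = std::chrono::steady_clock;  // monotonic: unaffected by clock changes

        auto const start = clock::now();
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
        auto const elapsed =
            std::chrono::duration_cast<std::chrono::milliseconds>(clock::now() - start);
        std::cout << "elapsed: " << elapsed.count() << " ms\n";
    }
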
seelabs
ca6d5798e9 Support for boost 1.71:
* replace boost::beast::detail::iequals with boost::iequals
* replace deprecated `buffers` function with `make_printable`
* replace boost::beast::detail::ascii_tolower with lambda
* add missing includes
2019-10-15 12:01:37 -07:00
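For reference, a tiny example of the first replacement: boost::iequals from <boost/algorithm/string/predicate.hpp> performs the same case-insensitive comparison.

    #include <boost/algorithm/string/predicate.hpp>
    #include <iostream>

    int main()
    {
        std::cout << std::boolalpha
                  << boost::iequals("Content-Type", "content-type") << '\n';  // true
    }
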
Mike Ellery
2110b24090 Add omitted unit tests, cleanup old files 2019-10-15 12:01:37 -07:00
Mike Ellery
82484e26f5 Add option to enable -Wextra for gcc/clang. 2019-10-15 12:01:37 -07:00
Scott Schurr
ca47583a3b Small bug fixes in BuildLedger.cpp 2019-10-15 12:01:37 -07:00
Joseph Busch
f4d6b0e1c4 Add metrics for PeerImp to track bandwidth usage 2019-10-15 12:01:37 -07:00
Devon White
9196d9541a Include validator's master public key in validation stream:
The validation stream only reported the ephemeral signing key for validators
which use manifests. This made tracking unnecessarily difficult for clients
processing the data stream.

With this change, the validator's long-term master public key is also
included.

This commit fixes #3005

* Provide proposing validator's master key in the validation stream
  subscription JSON responses.

Implement code review changes.

FIXES: #3005
2019-10-15 12:01:37 -07:00
Manoj doshi
b53fda1e1a Set version to 1.4.0-b5 2019-09-27 12:25:21 -07:00
Mike Ellery
9a5911f94c CI: remove boost from macos/homebrew 2019-09-27 12:24:19 -07:00
Mike Ellery
80acc85e59 Fix startup error with --import 2019-09-27 12:24:19 -07:00
Joseph Busch
8ce1c189f7 Fix VS2019 debug build 2019-09-27 12:24:19 -07:00
Howard Hinnant
7228b2e068 Remove SHAMap V2 2019-09-27 12:24:19 -07:00
Yusuf Sahin HAMZA
33ab0cd7bd Automatically update NodeDB path if changed 2019-09-27 12:24:19 -07:00
seelabs
e33ac1d450 Add PayChan to recipient's owner directory 2019-09-27 11:35:22 -07:00
Rome Reginelli
826cbbc3bf Clarify Linux build instructions for configuring
* Explain that Arch/Manjaro/etc. need `-Dstatic=OFF` during the configure step
* move configuration options closer to that step
* separate sub-headers for configuration and build
2019-09-27 11:35:22 -07:00
Mike Ellery
38f82115f7 Set version to 1.4.0-b4 2019-09-09 18:46:06 -07:00
Mike Ellery
d9bb78c8b8 Cleanup usage of ScopedUnlock 2019-09-09 18:45:56 -07:00
Mo Morsi
905f631b71 Modularize cmake build config 2019-09-09 18:45:53 -07:00
Mike Ellery
9213c49ca1 Honor SSL config settings for ValidatorSites:
FIXES: #2990

* refactor common SSL client setup
* enable SSL in unit-test http server
* add tests for SSLHTTPDownloader
* misc test refactoring
2019-09-09 10:55:31 -07:00
Nik Bougalis
fc7ecd672a Set version to 1.4.0-b3 2019-09-07 12:08:47 -07:00
Mark Travis
a6944be5cf Clarify online delete data error message. 2019-09-07 11:44:00 -07:00
Mark Travis
e5b61c9ac9 Update operating mode upon network disagreement. 2019-09-07 11:44:00 -07:00
Scott Schurr
a9a4e2c8fb Add Destination to Check threading 2019-09-07 11:39:02 -07:00
Miguel Portilla
56eac5c9a1 Fix rand_int assert in shard store 2019-09-07 11:39:02 -07:00
Miguel Portilla
4b1970afa9 Log database connection error 2019-09-07 11:39:02 -07:00
Miguel Portilla
22c9de487a Add shard thread safety 2019-09-07 11:39:02 -07:00
Miguel Portilla
66fad62e66 Implement Shard SQLite support 2019-09-07 11:39:02 -07:00
Mike Ellery
008fc5155a Improve CI and packaging:
* add travis build for min cmake supported
* add travis build for validator keys (uses xrpl_core)
* add travis build for ipv6 (mac only)
* add cmake target for validator keys via FetchContent
* use validator keys target in package build
2019-09-07 11:39:02 -07:00
seelabs
5834fbbc5d Set version to 1.4.0-b2 2019-08-23 11:33:59 -07:00
seelabs
06219f1151 Disable shadow warning:
Different compilers are handling the shadow warning differently. In particular,
some are warning about types being shadowed by variables. Until these can be
resolved the shadow warning is being disabled.

Note: the shadow warning was originally enabled to help with the structured
bindings patch. As that is now complete, it's less important to keep this
warning.
2019-08-23 11:33:59 -07:00
seelabs
c2d2ba9f45 Simplify code using if constexpr:
Also simplify msig construction
2019-08-23 11:33:59 -07:00
seelabs
1eb3753f26 Replace from_string_checked pair return type with optional<Endpoint> 2019-08-23 11:33:59 -07:00
seelabs
0a256247a0 Replace strUnHex pair return type with optional<Blob> 2019-08-23 11:33:59 -07:00
seelabs
7912ee6f7b Use structured bindings in some places:
Most of the new uses either:
* Replace some uses of `tie`
* bind to pairs when iterating through maps
2019-08-23 11:33:59 -07:00
seelabs
9c58f23cf8 Replace unordered_map::emplace with insert_or_assign 2019-08-23 08:47:43 -07:00
seelabs
9245b0b666 Use std::gcd to implement lowestTerms 2019-08-23 08:47:43 -07:00
seelabs
92925a0af6 Use [[fallthrough]] in some switch statements 2019-08-23 08:47:43 -07:00
seelabs
70c2e1b419 remove make_Amounts:
* c++-17's class template type deduction can replace this function
2019-08-23 08:47:43 -07:00
seelabs
5d1728cc96 Use class template argument deduction for locks 2019-08-23 08:47:43 -07:00
seelabs
4076b6d92e Replace for_each_arg trick with fold expressions 2019-08-23 08:47:42 -07:00
seelabs
b9e73b4852 Fix shadowing variables 2019-08-23 08:47:42 -07:00
seelabs
014df67fed Remove unused member variable 2019-08-23 08:47:35 -07:00
Mike Ellery
014ec021bb Build packages with gcc-8 2019-08-23 08:42:14 -07:00
Mike Ellery
7fa9b91d23 Fix jenkins/travis CI:
* remove clang builds from jenkins
* disable windows travis cache
* limit make parallelism
* update linux CI image
2019-08-23 08:41:49 -07:00
Nik Bougalis
98c4a7a2b1 Set version to 1.4.0-b1 2019-08-19 06:58:55 -07:00
Vishwas Patil
c04c00d279 Add "sahyadri.isrdc.in" to list of bootstrap nodes 2019-08-19 06:58:50 -07:00
Elliot Lee
ef139e8b3c Update README.md
- This fixes #3023; thanks @crypto-deus.
- This fixes #3026; thanks @William-Gomez.
2019-08-19 06:58:50 -07:00
Alloy Networks
45c1c38993 Warn if the update script is not executed as root 2019-08-19 06:58:50 -07:00
Mo Morsi
1942fee581 Modernize code and clean up out-of-date or obsolete comments:
- Remove references to nodestore ledger index. This was removed
  in f946d7b447.
2019-08-19 06:58:50 -07:00
seelabs
b3c85e2709 Remove unused TER code from StrandResult 2019-08-19 06:58:50 -07:00
seelabs
561942da23 Use enums for some parameters in payments:
* Use enums for StrandDirection, DebtDirection, and QualityDirection
2019-08-19 06:58:50 -07:00
seelabs
c217baa367 Enable C++17 2019-08-19 06:58:50 -07:00
Howard Hinnant
c699864c85 Fix incorrect snapShot unsharing
This fixes #3020.
2019-08-19 06:58:50 -07:00
Howard Hinnant
4ff0f482c3 Remove unneeded and unused base classes in insight 2019-08-19 06:58:50 -07:00
Scott Schurr
28b942b186 Enhance AccountTx unit test 2019-08-19 06:58:50 -07:00
Edward Hennis
4900e3081d Update appveyor dependencies for boost 1.70 2019-08-19 06:58:50 -07:00
Mike Ellery
145326c00c Add policy check to FindBoost 2019-08-19 06:58:50 -07:00
Mike Ellery
c7d90bfddd Update to current SOCI HEAD 2019-08-16 10:33:08 -07:00
Mike Ellery
bfa84cfca5 Fix io_latency_probe test on CI environments 2019-08-16 10:33:08 -07:00
Mike Ellery
cbc6e500b6 Set minimum versions for gcc/clang 2019-08-16 10:33:08 -07:00
Mike Ellery
13a4fefe34 Travis CI improvements:
FIXES: #2527

* define custom docker image for travis-linux builds based on
  package build image
* add macos builds
* add windows builds (currently allowed to fail)
* improve build and shell scripts as required for the CI envs
* add asio timer latency workaround
* omit several manual tests from TravisCI which cause memory exhaustion
2019-08-16 10:33:08 -07:00
John Freeman
87e9ee5ce9 Add support for reserved peer slots:
This commit allows server operators to reserve slots for specific
peers (identified by the peer's public node identity) and to make
changes to the reservations while the server is operating.

This commit closes #2938
2019-08-05 17:46:24 -07:00
John Freeman
20cc5df5fe Fix GitLab CI
- Update Docker image to Boost 1.70
- Bust dependency cache
- Pass `Boost_NO_BOOST_CMAKE` to CMake
2019-08-05 09:49:43 -07:00
Miguel Portilla
9e117b7b38 Fix shard path detection 2019-08-04 20:48:08 -07:00
Miguel Portilla
a02d914093 Move shard store init to Application 2019-08-04 20:48:08 -07:00
Miguel Portilla
c5a95f1eb5 Remove SQLite Validations table 2019-08-04 20:01:34 -07:00
Manoj doshi
e1adbd7ddd Set version to 1.3.1 2019-07-24 15:22:24 -07:00
Joseph Busch
355a7b04a8 Add a LogicError when a deadlock is detected 2019-07-24 14:08:47 -07:00
Lazaridis
d3ee0df93a Add links to in-repo build-instructions 2019-07-24 14:03:44 -07:00
Mark Travis
7c24f7b170 Improve logging during process startup. 2019-07-24 13:55:38 -07:00
Mike Ellery
a3060516c6 Improve package build pipeline:
- Add docker container tags for "latest_BRANCH"
- Prevent different branches from overwriting deb repo artifacts
- Manual approval always required before pushing to prod
2019-07-24 13:53:09 -07:00
760 changed files with 36343 additions and 32928 deletions


@@ -1,6 +1,5 @@
codecov:
ci:
- ci.ops.ripple.com # add custom jenkins server
- !appveyor
- !travis
- travis

.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file

@@ -0,0 +1,31 @@
---
name: Bug Report
about: Create a report to help us improve rippled
title: "[Title with short description] (Version: [rippled version])"
labels: ''
assignees: ''
---
<!-- Please search existing issues to avoid creating duplicates.-->
## Issue Description
<!--Provide a summary for your issue/bug.-->
## Steps to Reproduce
<!--List in detail the exact steps to reproduce the unexpected behavior of the software.-->
## Expected Result
<!--Explain in detail what behavior you expected to happen.-->
## Actual Result
<!--Explain in detail what behavior actually happened.-->
## Environment
<!--Please describe your environment setup (such as Ubuntu 18.04 with Boost 1.70).-->
<!-- If you are using a formal release, please use the version returned by './rippled --version' as the version number-->
<!-- If you are working off of develop, please add the git hash via 'git rev-parse HEAD'-->
## Supporting Files
<!--If you have supporting files such as a log, feel free to post a link here using Github Gist.-->
<!--Consider adding configuration files with private information removed via Github Gist. -->

.github/ISSUE_TEMPLATE/config.yml vendored Normal file

@@ -0,0 +1,11 @@
blank_issues_enabled: false
contact_links:
- name: XRP Ledger Documentation
url: https://xrpl.org/
about: All things about XRPL
- name: General question for the community
url: https://forum.xpring.io/c/community/
about: Please ask and answer questions here.
- name: Security bug bounty program
url: https://ripple.com/bug-bounty/
about: Please report security-relevant bugs in our software here.


@@ -0,0 +1,21 @@
---
name: Feature Request
about: Suggest a new feature for the rippled project
title: "[Title with short description] (Version: [rippled version])"
labels: Feature Request
assignees: ''
---
<!-- Please search existing issues to avoid creating duplicates.-->
## Summary
<!-- Provide a summary to the feature request-->
## Motivation
<!-- Why do we need this feature?-->
## Solution
<!-- What is the solution?-->
## Paths Not Taken
<!-- What other alternatives have been considered?-->

.gitignore vendored

@@ -25,6 +25,9 @@ build
.nih_c
tags
TAGS
GTAGS
GRTAGS
GPATH
bin/rippled
Debug/*.*
Release/*.*
@@ -94,3 +97,4 @@ Builds/VisualStudio2015/*.sdf
.vs/
CMakeSettings.json
compile_commands.json
.clangd


@@ -18,7 +18,7 @@
tags:
- linux
- c5.2xlarge
image: thejohnfreeman/rippled-build-ubuntu:1bc7230e5b97
image: thejohnfreeman/rippled-build-ubuntu:4b73694e07f0
script:
- bin/ci/build.sh
- bin/ci/test.sh
@@ -41,7 +41,7 @@
COMPILER: gcc
BUILD_TYPE: Debug
cache:
key: ae412fb0-5fc8-4fd5-8e48-e3a60e91ed46
key: 62ada41c-fc9e-4949-9533-736d4d6512b6
policy: pull-push
'build+test Ninja GCC Debug':
@@ -51,7 +51,7 @@
COMPILER: gcc
BUILD_TYPE: Debug
cache:
key: 3a314872-4faa-4712-b8ab-a8d54ac83342
key: 1665d3eb-6233-4eef-9f57-172636899faa
policy: pull-push
'build+test Ninja GCC Debug -Dstatic=OFF':
@@ -62,7 +62,7 @@
BUILD_TYPE: Debug
CMAKE_ARGS: '-Dstatic=OFF'
cache:
key: 3a314872-4faa-4712-b8ab-a8d54ac83342
key: 1665d3eb-6233-4eef-9f57-172636899faa
'build+test Ninja GCC Debug -Dstatic=OFF -DBUILD_SHARED_LIBS=ON':
extends: .job_linux_build_test
@@ -72,7 +72,7 @@
BUILD_TYPE: Debug
CMAKE_ARGS: '-Dstatic=OFF -DBUILD_SHARED_LIBS=ON'
cache:
key: 3a314872-4faa-4712-b8ab-a8d54ac83342
key: 1665d3eb-6233-4eef-9f57-172636899faa
'build+test Ninja GCC Debug -Dunity=OFF':
extends: .job_linux_build_test
@@ -82,7 +82,7 @@
BUILD_TYPE: Debug
CMAKE_ARGS: '-Dunity=OFF'
cache:
key: 3a314872-4faa-4712-b8ab-a8d54ac83342
key: 1665d3eb-6233-4eef-9f57-172636899faa
'build+test Ninja GCC Release -Dassert=ON':
extends: .job_linux_build_test
@@ -92,7 +92,7 @@
BUILD_TYPE: Release
CMAKE_ARGS: '-Dassert=ON'
cache:
key: ac5e7a8a-bfcb-4480-a3cc-07d0336f4408
key: c45ec125-9625-4c19-acf7-4e889d5f90bd
policy: pull-push
'build+test(manual) Ninja GCC Release -Dassert=ON':
@@ -104,7 +104,7 @@
CMAKE_ARGS: '-Dassert=ON'
MANUAL_TEST: 'true'
cache:
key: ac5e7a8a-bfcb-4480-a3cc-07d0336f4408
key: c45ec125-9625-4c19-acf7-4e889d5f90bd
'build+test Make clang Debug':
extends: .job_linux_build_test
@@ -113,7 +113,7 @@
COMPILER: clang
BUILD_TYPE: Debug
cache:
key: 09cec2ce-83f5-4ce9-8a3b-23737edefd4f
key: bf578dc2-5277-4580-8de5-6b9523118b19
policy: pull-push
'build+test Ninja clang Debug':
@@ -123,7 +123,7 @@
COMPILER: clang
BUILD_TYPE: Debug
cache:
key: c6e29541-e539-4d57-86c5-57d923521f35
key: 762514c5-3d4c-4c7c-8da2-2df9d8839cbe
policy: pull-push
'build+test Ninja clang Debug -Dunity=OFF':
@@ -134,7 +134,7 @@
BUILD_TYPE: Debug
CMAKE_ARGS: '-Dunity=OFF'
cache:
key: c6e29541-e539-4d57-86c5-57d923521f35
key: 762514c5-3d4c-4c7c-8da2-2df9d8839cbe
'build+test Ninja clang Debug -Dunity=OFF -Dsan=address':
extends: .job_linux_build_test
@@ -145,7 +145,7 @@
CMAKE_ARGS: '-Dunity=OFF -Dsan=address'
CONCURRENT_TESTS: 1
cache:
key: c6e29541-e539-4d57-86c5-57d923521f35
key: 762514c5-3d4c-4c7c-8da2-2df9d8839cbe
'build+test Ninja clang Debug -Dunity=OFF -Dsan=undefined':
extends: .job_linux_build_test
@@ -155,7 +155,7 @@
BUILD_TYPE: Debug
CMAKE_ARGS: '-Dunity=OFF -Dsan=undefined'
cache:
key: c6e29541-e539-4d57-86c5-57d923521f35
key: 762514c5-3d4c-4c7c-8da2-2df9d8839cbe
'build+test Ninja clang Release -Dassert=ON':
extends: .job_linux_build_test
@@ -165,5 +165,5 @@
BUILD_TYPE: Release
CMAKE_ARGS: '-Dassert=ON'
cache:
key: 2e48f4d8-e0d7-4aa9-b646-499cef6c1a87
key: 7751be37-2358-4f08-b1d0-7e72e0ad266d
policy: pull-push


@@ -1,62 +1,424 @@
sudo: false
language: cpp
dist: xenial
services:
- docker
stages:
- one
- two
- three
- four
- five
env:
global:
# Maintenance note: to move to a new version
# of boost, update both BOOST_ROOT and BOOST_URL.
# Note that for simplicity, BOOST_ROOT's final
# namepart must match the folder name internal
# to boost's .tar.gz.
- LCOV_ROOT=$HOME/lcov
- GDB_ROOT=$HOME/gdb
- BOOST_ROOT=$HOME/boost_1_70_0
- BOOST_URL='http://sourceforge.net/projects/boost/files/boost/1.70.0/boost_1_70_0.tar.gz'
addons:
apt:
sources:
- ubuntu-toolchain-r-test
- llvm-toolchain-xenial-7
packages:
- gcc-7
- g++-7
- gcc-8
- g++-8
- python-software-properties
- protobuf-compiler
- libprotobuf-dev
- libssl-dev
- libstdc++6
- binutils-gold
- cmake
- lcov
- llvm-7
- clang-7
- DOCKER_IMAGE="rippleci/rippled-ci-builder:2020-01-08"
- CMAKE_EXTRA_ARGS="-Dwerr=ON -Dwextra=ON"
- NINJA_BUILD=true
# change this if we get more VM capacity
- MAX_TIME_MIN=80
- CACHE_DIR=${TRAVIS_HOME}/_cache
- NIH_CACHE_ROOT=${CACHE_DIR}/nih_c
- PARALLEL_TESTS=true
# this is NOT used by linux container based builds (which already have boost installed)
- BOOST_URL='https://dl.bintray.com/boostorg/release/1.70.0/source/boost_1_70_0.tar.bz2'
- VCPKG_DIR=${CACHE_DIR}/vcpkg
- USE_CCACHE=true
- CCACHE_BASEDIR=${TRAVIS_HOME}"
- CCACHE_NOHASHDIR=true
- CCACHE_DIR=${CACHE_DIR}/ccache
matrix:
fast_finish: true
allow_failures:
# TODO these need more investigation
#
# there are a number of UBs caught currently that need triage
- name: ubsan, clang-8
# this one often runs out of memory:
- name: manual tests, gcc-8, release
include:
- compiler: gcc
env: GCC_VER=7 BUILD_TYPE=Debug
- compiler: clang
env: GCC_VER=7 BUILD_TYPE=Debug
# debug builds
- &linux
stage: one
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/
compiler: gcc-8
name: gcc-8, debug
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Debug
script:
- sudo chmod -R a+rw ${CACHE_DIR}
- ccache -s
- travis_wait ${MAX_TIME_MIN} bin/ci/ubuntu/build-in-docker.sh
- ccache -s
- <<: *linux
compiler: clang-8
name: clang-8, debug
env:
- MATRIX_EVAL="CC=clang-8 && CXX=clang++-8"
- BUILD_TYPE=Debug
# coverage builds
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_cov/
compiler: gcc-8
name: coverage, gcc-8
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Debug
- CMAKE_ADD="-Dcoverage=ON"
- TARGET=coverage_report
- SKIP_TESTS=true
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_cov/
compiler: clang-8
name: coverage, clang-8
env:
- MATRIX_EVAL="CC=clang-8 && CXX=clang++-8"
- BUILD_TYPE=Debug
- CMAKE_ADD="-Dcoverage=ON"
- TARGET=coverage_report
- SKIP_TESTS=true
# nounity
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_nounity/
compiler: gcc-8
name: non-unity, gcc-8
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Debug
- CMAKE_ADD="-Dunity=OFF"
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_nounity/
compiler: clang-8
name: non-unity, clang-8
env:
- MATRIX_EVAL="CC=clang-8 && CXX=clang++-8"
- BUILD_TYPE=Debug
- CMAKE_ADD="-Dunity=OFF"
# manual tests
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_man/
compiler: gcc-8
name: manual tests, gcc-8, debug
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Debug
- MANUAL_TESTS=true
# manual tests
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_man/
compiler: gcc-8
name: manual tests, gcc-8, release
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Release
- CMAKE_ADD="-Dassert=ON -Dunity=OFF"
- MANUAL_TESTS=true
# release builds
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_release/
compiler: gcc-8
name: gcc-8, release
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Release
- CMAKE_ADD="-Dassert=ON -Dunity=OFF"
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_release/
compiler: clang-8
name: clang-8, release
env:
- MATRIX_EVAL="CC=clang-8 && CXX=clang++-8"
- BUILD_TYPE=Release
- CMAKE_ADD="-Dassert=ON"
# asan
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_san/
compiler: clang-8
name: asan, clang-8
env:
- MATRIX_EVAL="CC=clang-8 && CXX=clang++-8"
- BUILD_TYPE=Release
- CMAKE_ADD="-Dsan=address"
- ASAN_OPTIONS="print_stats=true:atexit=true"
#- LSAN_OPTIONS="verbosity=1:log_threads=1"
- PARALLEL_TESTS=false
# ubsan
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_san/
compiler: clang-8
name: ubsan, clang-8
env:
- MATRIX_EVAL="CC=clang-8 && CXX=clang++-8"
- BUILD_TYPE=Release
- CMAKE_ADD="-Dsan=undefined"
# once we can run clean under ubsan, add halt_on_error=1 to options below
- UBSAN_OPTIONS="print_stacktrace=1:report_error_type=1"
- PARALLEL_TESTS=false
# tsan
# current tsan failure *might* be related to:
# https://github.com/google/sanitizers/issues/1104
# but we can't get it to run, so leave it disabled for now
# - <<: *linux
# if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_linux/ OR commit_message =~ /travis_run_san/
# compiler: clang-8
# name: tsan, clang-8
# env:
# - MATRIX_EVAL="CC=clang-8 && CXX=clang++-8"
# - BUILD_TYPE=Release
# - CMAKE_ADD="-Dsan=thread"
# - TSAN_OPTIONS="history_size=3 external_symbolizer_path=/usr/bin/llvm-symbolizer verbosity=1"
# - PARALLEL_TESTS=false
# dynamic lib builds
- <<: *linux
compiler: gcc-8
name: non-static, gcc-8
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Debug
- CMAKE_ADD="-Dstatic=OFF"
- <<: *linux
compiler: gcc-8
name: non-static + BUILD_SHARED_LIBS, gcc-8
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Debug
- CMAKE_ADD="-Dstatic=OFF -DBUILD_SHARED_LIBS=ON"
# makefile
- <<: *linux
compiler: gcc-8
name: makefile generator, gcc-8
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Debug
- NINJA_BUILD=false
# misc alternative compilers
- <<: *linux
compiler: gcc-7
name: gcc-7
env:
- MATRIX_EVAL="CC=gcc-7 && CXX=g++-7"
- BUILD_TYPE=Debug
- <<: *linux
compiler: gcc-9
name: gcc-9
env:
- MATRIX_EVAL="CC=gcc-9 && CXX=g++-9"
- BUILD_TYPE=Debug
- <<: *linux
compiler: clang-7
name: clang-7
env:
- MATRIX_EVAL="CC=clang-7 && CXX=clang++-7"
- BUILD_TYPE=Debug
- <<: *linux
compiler: clang-9
name: clang-9
env:
- MATRIX_EVAL="CC=clang-9 && CXX=clang++-9"
- BUILD_TYPE=Debug
# verify build with min version of cmake
- <<: *linux
compiler: gcc-8
name: min cmake version
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Debug
- CMAKE_EXE=/opt/local/cmake-3.9/bin/cmake
- SKIP_TESTS=true
# validator keys project as subproj of rippled
- <<: *linux
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_vkeys/
compiler: gcc-8
name: validator-keys
env:
- MATRIX_EVAL="CC=gcc-8 && CXX=g++-8"
- BUILD_TYPE=Debug
- CMAKE_ADD="-Dvalidator_keys=ON"
- TARGET=validator-keys
# macos
- &macos
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_mac/
stage: one
os: osx
osx_image: xcode10.3
name: xcode10, debug
env:
# put NIH in non-cache location since it seems to
# cause failures when homebrew updates
- NIH_CACHE_ROOT=${TRAVIS_BUILD_DIR}/nih_c
- BLD_CONFIG=Debug
- TEST_EXTRA_ARGS=""
- BOOST_ROOT=${CACHE_DIR}/boost_1_70_0
- >-
CMAKE_ADD="
-DBOOST_ROOT=${BOOST_ROOT}/_INSTALLED_
-DBoost_ARCHITECTURE=-x64
-DBoost_NO_SYSTEM_PATHS=ON
-DCMAKE_VERBOSE_MAKEFILE=ON"
addons:
homebrew:
packages:
- protobuf
- grpc
- pkg-config
- bash
- ninja
- cmake
- wget
- zstd
- libarchive
- openssl@1.1
update: true
install:
- export OPENSSL_ROOT=$(brew --prefix openssl@1.1)
- travis_wait ${MAX_TIME_MIN} Builds/containers/shared/install_boost.sh
- brew uninstall --ignore-dependencies boost
script:
- mkdir -p build.macos && cd build.macos
- cmake -G Ninja ${CMAKE_EXTRA_ARGS} -DCMAKE_BUILD_TYPE=${BLD_CONFIG} ..
- travis_wait ${MAX_TIME_MIN} cmake --build . --parallel --verbose
- ./rippled --unittest --quiet --unittest-log --unittest-jobs ${NUM_PROCESSORS} ${TEST_EXTRA_ARGS}
- <<: *macos
name: xcode10, release
before_script:
- export BLD_CONFIG=Release
- export CMAKE_EXTRA_ARGS="${CMAKE_EXTRA_ARGS} -Dassert=ON"
- <<: *macos
name: ipv6 (macos)
before_script:
- export TEST_EXTRA_ARGS="--unittest-ipv6"
- <<: *macos
osx_image: xcode11.2
name: xcode11, debug
# windows
- &windows
if: commit_message !~ /travis_run_/ OR commit_message =~ /travis_run_win/
os: windows
env:
# put NIH in a non-cached location until
# we come up with a way to stabilize that
# cache on windows (minimize incremental changes)
- CACHE_NAME=win_01
- NIH_CACHE_ROOT=${TRAVIS_BUILD_DIR}/nih_c
- VCPKG_DEFAULT_TRIPLET="x64-windows-static"
- MATRIX_EVAL="CC=cl.exe && CXX=cl.exe"
- BOOST_ROOT=${CACHE_DIR}/boost_1_70
- >-
CMAKE_ADD="
-DCMAKE_PREFIX_PATH=${BOOST_ROOT}/_INSTALLED_
-DBOOST_ROOT=${BOOST_ROOT}/_INSTALLED_
-DBoost_ROOT=${BOOST_ROOT}/_INSTALLED_
-DBoost_DIR=${BOOST_ROOT}/_INSTALLED_/lib/cmake/Boost-1.70.0
-DBoost_COMPILER=vc141
-DCMAKE_VERBOSE_MAKEFILE=ON
-DCMAKE_TOOLCHAIN_FILE=${VCPKG_DIR}/scripts/buildsystems/vcpkg.cmake
-DVCPKG_TARGET_TRIPLET=x64-windows-static"
stage: one
name: prereq-ssl
install:
- choco upgrade cmake.install
- choco install ninja visualstudio2017-workload-vctools -y
script:
- df -h
- travis_wait ${MAX_TIME_MIN} bin/sh/install-vcpkg.sh openssl
- <<: *windows
stage: two
name: prereq-grpc
script:
- travis_wait ${MAX_TIME_MIN} bin/sh/install-vcpkg.sh grpc
- <<: *windows
stage: three
name: prereq-libarchive
script:
- travis_wait ${MAX_TIME_MIN} bin/sh/install-vcpkg.sh libarchive[lz4]
# TBD consider rocksdb via vcpkg if/when we can build with the
# vcpkg version
# - travis_wait ${MAX_TIME_MIN} bin/sh/install-vcpkg.sh rocksdb[snappy,lz4,zlib]
- <<: *windows
stage: four
name: prereq-boost
install:
- choco upgrade cmake.install
- choco install ninja visualstudio2017-workload-vctools -y
#Force install 14.24 to fix build issue. TODO - this should be deleted when rocksdb fixes their issue with the new compiler.
- choco install visualstudio2019buildtools visualstudio2019community -y
- choco install visualstudio2019-workload-vctools --package-parameters "--add Microsoft.VisualStudio.Component.VC.14.24.x86.x64" -y
script:
- export BOOST_TOOLSET=msvc-14.1
- travis_wait ${MAX_TIME_MIN} Builds/containers/shared/install_boost.sh
- &windows-bld
<<: *windows
stage: five
name: windows, debug
before_script:
- export BLD_CONFIG=Debug
script:
- df -h
- . ./bin/sh/setup-msvc.sh
- mkdir -p build.ms && cd build.ms
- cmake -G Ninja ${CMAKE_EXTRA_ARGS} -DCMAKE_BUILD_TYPE=${BLD_CONFIG} ..
- travis_wait ${MAX_TIME_MIN} cmake --build . --parallel --verbose
# override num procs to force single unit test job
- export NUM_PROCESSORS=1
- travis_wait ${MAX_TIME_MIN} ./rippled.exe --unittest --quiet --unittest-log --unittest-jobs ${NUM_PROCESSORS}
- <<: *windows-bld
name: windows, release
before_script:
- export BLD_CONFIG=Release
- <<: *windows-bld
name: windows, visual studio, debug
script:
- mkdir -p build.ms && cd build.ms
- cmake -G "Visual Studio 15 2017 Win64" ${CMAKE_EXTRA_ARGS} ..
- export DESTDIR=${PWD}/_installed_
- travis_wait ${MAX_TIME_MIN} cmake --build . --parallel --verbose --config ${BLD_CONFIG} --target install
# override num procs to force single unit test job
- export NUM_PROCESSORS=1
- >-
travis_wait ${MAX_TIME_MIN} "./_installed_/Program Files/rippled/bin/rippled.exe" --unittest --quiet --unittest-log --unittest-jobs ${NUM_PROCESSORS}
- <<: *windows-bld
name: windows, vc2019
install:
- choco upgrade cmake.install
- choco install ninja -y
#Force install 14.24 to fix build issue. TODO - this should be deleted when rocksdb fixes their issue with the new compiler.
- choco install visualstudio2019buildtools visualstudio2019community -y
- choco install visualstudio2019-workload-vctools --package-parameters "--add Microsoft.VisualStudio.Component.VC.14.24.x86.x64" -y
before_script:
- export BLD_CONFIG=Release
# we want to use the boost build from cache, which was built using the
# vs2017 compiler so we need to specify the Boost_COMPILER. BUT, we
# can't use the cmake config files generated by boost b/c they are
# broken for Boost_COMPILER override, so we need to specify both
# Boost_NO_BOOST_CMAKE and a slightly different Boost_COMPILER string
# to make the legacy find module work for us. If the cmake configs are
# fixed in the future, it should be possible to remove these
# workarounds.
- export CMAKE_EXTRA_ARGS="${CMAKE_EXTRA_ARGS} -DBoost_NO_BOOST_CMAKE=ON -DBoost_COMPILER=-vc141"
before_cache:
- if [ $(uname) = "Linux" ] ; then SUDO="sudo"; else SUDO=""; fi
- cd ${TRAVIS_HOME}
- if [ -f cache_ignore.tar ] ; then $SUDO tar xvf cache_ignore.tar; fi
- cd ${TRAVIS_BUILD_DIR}
cache:
timeout: 900
directories:
- $BOOST_ROOT
- .nih_c
- $CACHE_DIR
before_install:
- bin/ci/ubuntu/install-dependencies.sh
script:
- travis_wait 35 bin/ci/ubuntu/build-and-test.sh
- if [ "$(uname)" = "Darwin" ] ; then export NUM_PROCESSORS=$(sysctl -n hw.physicalcpu); else export NUM_PROCESSORS=$(nproc); fi
- echo "NUM PROC is ${NUM_PROCESSORS}"
- if [ "$(uname)" = "Linux" ] ; then docker pull ${DOCKER_IMAGE}; fi
- if [ "${MATRIX_EVAL}" != "" ] ; then eval "${MATRIX_EVAL}"; fi
- if [ "${CMAKE_ADD}" != "" ] ; then export CMAKE_EXTRA_ARGS="${CMAKE_EXTRA_ARGS} ${CMAKE_ADD}"; fi
- bin/ci/ubuntu/travis-cache-start.sh
notifications:
email:
false
irc:
channels:
- "chat.freenode.net#ripple-dev"
email: false


@@ -1,95 +1,4 @@
## "target" parsing..DEPRECATED and will be removed in future
macro(parse_target)
if (target)
# Parse the target
set(remaining ${target})
while (remaining)
# get the component up to the next dot or end
string(REGEX REPLACE "^\\.?([^\\.]+).*$" "\\1" cur_component ${remaining})
string(REGEX REPLACE "^\\.?[^\\.]+(.*$)" "\\1" remaining ${remaining})
if (${cur_component} STREQUAL gcc)
if (DEFINED ENV{GNU_CC})
set(CMAKE_C_COMPILER $ENV{GNU_CC})
elseif ($ENV{CC} MATCHES .*gcc.*)
set(CMAKE_C_COMPILER $ENV{CC})
else()
find_program(CMAKE_C_COMPILER gcc)
endif()
if (DEFINED ENV{GNU_CXX})
set(CMAKE_CXX_COMPILER $ENV{GNU_CXX})
elseif ($ENV{CXX} MATCHES .*g\\+\\+.*)
set(CMAKE_CXX_COMPILER $ENV{CXX})
else()
find_program(CMAKE_CXX_COMPILER g++)
endif()
endif()
if (${cur_component} STREQUAL clang)
if (DEFINED ENV{CLANG_CC})
set(CMAKE_C_COMPILER $ENV{CLANG_CC})
elseif ($ENV{CC} MATCHES .*clang.*)
set(CMAKE_C_COMPILER $ENV{CC})
else()
find_program(CMAKE_C_COMPILER clang)
endif()
if (DEFINED ENV{CLANG_CXX})
set(CMAKE_CXX_COMPILER $ENV{CLANG_CXX})
elseif ($ENV{CXX} MATCHES .*clang.*)
set(CMAKE_CXX_COMPILER $ENV{CXX})
else()
find_program(CMAKE_CXX_COMPILER clang++)
endif()
endif()
if (${cur_component} STREQUAL msvc)
# TBD
endif()
if (${cur_component} STREQUAL unity)
set(unity ON CACHE BOOL "" FORCE)
endif()
if (${cur_component} STREQUAL nounity)
set(unity OFF CACHE BOOL "" FORCE)
endif()
if (${cur_component} STREQUAL debug)
set(release false)
endif()
if (${cur_component} STREQUAL release)
set(release true)
endif()
if (${cur_component} STREQUAL coverage)
set(coverage ON CACHE BOOL "" FORCE)
set(debug true)
endif()
if (${cur_component} STREQUAL profile)
set(profile ON CACHE BOOL "" FORCE)
endif()
endwhile()
endif()
if(CMAKE_C_COMPILER MATCHES "-NOTFOUND$" OR
CMAKE_CXX_COMPILER MATCHES "-NOTFOUND$")
message(FATAL_ERROR "Can not find appropriate compiler for target ${target}")
endif()
if (release)
set(CMAKE_BUILD_TYPE Release)
else()
set(CMAKE_BUILD_TYPE Debug)
endif()
endmacro()
############################################################
macro(group_sources_in source_dir curdir)
file(GLOB children RELATIVE ${source_dir}/${curdir}
${source_dir}/${curdir}/*)
@@ -259,7 +168,7 @@ function (git_hash hash_val)
if (NOT GIT_FOUND)
return ()
endif ()
set (_hash "unknown")
set (_hash "")
set (_format "%H")
if (ARGC GREATER_EQUAL 2)
string (TOLOWER ${ARGV1} _short)
@@ -278,3 +187,21 @@ function (git_hash hash_val)
endif ()
set (${hash_val} "${_hash}" PARENT_SCOPE)
endfunction ()
function (git_branch branch_val)
if (NOT GIT_FOUND)
return ()
endif ()
set (_branch "")
execute_process (COMMAND ${GIT_EXECUTABLE} "rev-parse" "--abbrev-ref" "HEAD"
WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}
RESULT_VARIABLE _git_exit_code
OUTPUT_VARIABLE _temp_branch
OUTPUT_STRIP_TRAILING_WHITESPACE
ERROR_QUIET)
if (_git_exit_code EQUAL 0)
set (_branch ${_temp_branch})
endif ()
set (${branch_val} "${_branch}" PARENT_SCOPE)
endfunction ()


@@ -0,0 +1,62 @@
set (RocksDB_DIR "" CACHE PATH "Root directory of RocksDB distribution")
find_path (RocksDB_INCLUDE_DIR
rocksdb/db.h
PATHS ${RocksDB_DIR})
set (RocksDB_VERSION "")
find_file (RocksDB_VERSION_FILE
rocksdb/version.h
PATHS ${RocksDB_DIR})
if (RocksDB_VERSION_FILE)
file (READ ${RocksDB_VERSION_FILE} _verfile)
if ("${_verfile}" MATCHES "#define[ \\t]+ROCKSDB_MAJOR[ \\t]+([0-9]+)")
string (APPEND RocksDB_VERSION "${CMAKE_MATCH_1}")
else ()
string (APPEND RocksDB_VERSION "0")
endif()
if ("${_verfile}" MATCHES "#define[ \\t]+ROCKSDB_MINOR[ \\t]+([0-9]+)")
string (APPEND RocksDB_VERSION ".${CMAKE_MATCH_1}")
else ()
string (APPEND RocksDB_VERSION ".0")
endif()
if ("${_verfile}" MATCHES "#define[ \\t]+ROCKSDB_PATCH[ \\t]+([0-9]+)")
string (APPEND RocksDB_VERSION ".${CMAKE_MATCH_1}")
else ()
string (APPEND RocksDB_VERSION ".0")
endif()
endif ()
if (RocksDB_USE_STATIC)
list (APPEND RocksDB_NAMES
"${CMAKE_STATIC_LIBRARY_PREFIX}rocksdb${CMAKE_STATIC_LIBRARY_SUFFIX}"
"${CMAKE_STATIC_LIBRARY_PREFIX}rocksdblib${CMAKE_STATIC_LIBRARY_SUFFIX}")
endif ()
list (APPEND RocksDB_NAMES rocksdb)
find_library (RocksDB_LIBRARY NAMES ${RocksDB_NAMES}
PATHS
${RocksDB_DIR}
${RocksDB_DIR}/bin/Release
${RocksDB_DIR}/bin64_vs2013/Release
PATH_SUFFIXES lib lib64)
foreach (_n IN LISTS RocksDB_NAMES)
list (APPEND RocksDB_NAMES_DBG "${_n}_d" "${_n}d")
endforeach ()
find_library (RocksDB_LIBRARY_DEBUG NAMES ${RocksDB_NAMES_DBG}
PATHS
${RocksDB_DIR}
${RocksDB_DIR}/bin/Debug
${RocksDB_DIR}/bin64_vs2013/Debug
PATH_SUFFIXES lib lib64)
include (FindPackageHandleStandardArgs)
find_package_handle_standard_args (RocksDB
REQUIRED_VARS RocksDB_LIBRARY RocksDB_INCLUDE_DIR
VERSION_VAR RocksDB_VERSION)
mark_as_advanced (RocksDB_INCLUDE_DIR RocksDB_LIBRARY)
set (RocksDB_INCLUDE_DIRS ${RocksDB_INCLUDE_DIR})
set (RocksDB_LIBRARIES ${RocksDB_LIBRARY})
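This module only sets result variables (it defines no imported target), so a consumer links the raw paths. The following is a hedged sketch; the module-path location and the rocks_demo target are placeholders, not part of the diff.

list (APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/Builds/CMake")  # assumed location of FindRocksDB.cmake
set (RocksDB_USE_STATIC ON)          # optional: prefer the static library names
find_package (RocksDB REQUIRED)
add_executable (rocks_demo main.cpp)
target_include_directories (rocks_demo PRIVATE ${RocksDB_INCLUDE_DIRS})
target_link_libraries (rocks_demo PRIVATE ${RocksDB_LIBRARIES})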


@@ -12,7 +12,7 @@ if (static OR MSVC)
else ()
set (Boost_USE_STATIC_RUNTIME OFF)
endif ()
-find_dependency (Boost 1.67
+find_dependency (Boost 1.70
COMPONENTS
chrono
context


@@ -0,0 +1,179 @@
#[===================================================================[
setup project-wide compiler settings
#]===================================================================]
#[=========================================================[
TODO some/most of these common settings belong in a
toolchain file, especially the ABI-impacting ones
#]=========================================================]
add_library (common INTERFACE)
add_library (Ripple::common ALIAS common)
# add a single global dependency on this interface lib
link_libraries (Ripple::common)
set_target_properties (common
PROPERTIES INTERFACE_POSITION_INDEPENDENT_CODE ON)
set(CMAKE_CXX_EXTENSIONS OFF)
target_compile_features (common INTERFACE cxx_std_17)
target_compile_definitions (common
INTERFACE
$<$<CONFIG:Debug>:DEBUG _DEBUG>
$<$<AND:$<BOOL:${profile}>,$<NOT:$<BOOL:${assert}>>>:NDEBUG>)
# ^^^^ NOTE: CMAKE release builds already have NDEBUG
# defined, so no need to add it explicitly except for
# this special case of (profile ON) and (assert OFF)
# -- presumably this is because we don't want profile
# builds asserting unless asserts were specifically
# requested
if (MSVC)
# remove existing exception flag since we set it to -EHa
string (REGEX REPLACE "[-/]EH[a-z]+" "" CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}")
foreach (var_
CMAKE_C_FLAGS_DEBUG CMAKE_C_FLAGS_RELEASE
CMAKE_CXX_FLAGS_DEBUG CMAKE_CXX_FLAGS_RELEASE)
# also remove dynamic runtime
string (REGEX REPLACE "[-/]MD[d]*" " " ${var_} "${${var_}}")
# /ZI (Edit & Continue debugging information) is incompatible with Gy-
string (REPLACE "/ZI" "/Zi" ${var_} "${${var_}}")
# omit debug info completely under CI (not needed)
if (is_ci)
string (REPLACE "/Zi" " " ${var_} "${${var_}}")
endif ()
endforeach ()
target_compile_options (common
INTERFACE
-bigobj # Increase object file max size
-fp:precise # Floating point behavior
-Gd # __cdecl calling convention
-Gm- # Minimal rebuild: disabled
-Gy- # Function level linking: disabled
-MP # Multiprocessor compilation
-openmp- # pragma omp: disabled
-errorReport:none # No error reporting to Internet
-nologo # Suppress startup banner
-wd4018 # Disable signed/unsigned comparison warnings
-wd4244 # Disable float to int possible loss of data warnings
-wd4267 # Disable size_t to T possible loss of data warnings
-wd4800 # Disable C4800(int to bool performance)
-wd4503 # Decorated name length exceeded, name was truncated
$<$<COMPILE_LANGUAGE:CXX>:
-EHa
-GR
>
$<$<CONFIG:Release>:-Ox>
$<$<AND:$<COMPILE_LANGUAGE:CXX>,$<CONFIG:Debug>>:
-GS
-Zc:forScope
>
# static runtime
$<$<CONFIG:Debug>:-MTd>
$<$<NOT:$<CONFIG:Debug>>:-MT>
$<$<BOOL:${werr}>:-WX>
)
target_compile_definitions (common
INTERFACE
_WIN32_WINNT=0x6000
_SCL_SECURE_NO_WARNINGS
_CRT_SECURE_NO_WARNINGS
WIN32_CONSOLE
WIN32_LEAN_AND_MEAN
NOMINMAX
# TODO: Resolve these warnings, don't just silence them
_SILENCE_ALL_CXX17_DEPRECATION_WARNINGS
$<$<AND:$<COMPILE_LANGUAGE:CXX>,$<CONFIG:Debug>>:_CRTDBG_MAP_ALLOC>)
target_link_libraries (common
INTERFACE
-errorreport:none
-machine:X64)
else ()
# HACK : because these need to come first, before any warning demotion
string (APPEND CMAKE_CXX_FLAGS " -Wall -Wdeprecated")
if (wextra)
string (APPEND CMAKE_CXX_FLAGS " -Wextra -Wno-unused-parameter")
endif ()
# not MSVC
target_compile_options (common
INTERFACE
$<$<BOOL:${werr}>:-Werror>
$<$<COMPILE_LANGUAGE:CXX>:
-frtti
-Wnon-virtual-dtor
>
-Wno-sign-compare
-Wno-char-subscripts
-Wno-format
-Wno-unused-local-typedefs
$<$<BOOL:${is_gcc}>:
-Wno-unused-but-set-variable
-Wno-deprecated
>
$<$<NOT:$<CONFIG:Debug>>:-fno-strict-aliasing>
# tweak gcc optimization for debug
$<$<AND:$<BOOL:${is_gcc}>,$<CONFIG:Debug>>:-O0>
# Add debug symbols to release config
$<$<CONFIG:Release>:-g>)
target_link_libraries (common
INTERFACE
-rdynamic
# link to static libc/c++ iff:
# * static option set and
# * NOT APPLE (AppleClang does not support static libc/c++) and
# * NOT san (sanitizers typically don't work with static libc/c++)
$<$<AND:$<BOOL:${static}>,$<NOT:$<BOOL:${APPLE}>>,$<NOT:$<BOOL:${san}>>>:-static-libstdc++>)
endif ()
if (use_gold AND is_gcc)
# use gold linker if available
execute_process (
COMMAND ${CMAKE_CXX_COMPILER} -fuse-ld=gold -Wl,--version
ERROR_QUIET OUTPUT_VARIABLE LD_VERSION)
#[=========================================================[
NOTE: the gold linker inserts -rpath as DT_RUNPATH by
default instead of DT_RPATH, so you might have slightly
unexpected runtime ld behavior if you were expecting
DT_RPATH. Specify --disable-new-dtags to gold if you do
not want the default DT_RUNPATH behavior. This rpath
treatment as well as static/dynamic selection means that
gold does not currently have ideal default behavior when
we are using jemalloc. Thus for simplicity we don't use
it when jemalloc is requested. An alternative to
disabling would be to figure out all the settings
required to make gold play nicely with jemalloc.
#]=========================================================]
if (("${LD_VERSION}" MATCHES "GNU gold") AND (NOT jemalloc))
target_link_libraries (common
INTERFACE
-fuse-ld=gold
-Wl,--no-as-needed
#[=========================================================[
see https://bugs.launchpad.net/ubuntu/+source/eglibc/+bug/1253638/comments/5
DT_RUNPATH does not work great for transitive
dependencies (of which boost has a few) - so just
switch to DT_RPATH if doing dynamic linking with gold
#]=========================================================]
$<$<NOT:$<BOOL:${static}>>:-Wl,--disable-new-dtags>)
endif ()
unset (LD_VERSION)
endif ()
if (use_lld)
# use lld linker if available
execute_process (
COMMAND ${CMAKE_CXX_COMPILER} -fuse-ld=lld -Wl,--version
ERROR_QUIET OUTPUT_VARIABLE LD_VERSION)
if ("${LD_VERSION}" MATCHES "LLD")
target_link_libraries (common INTERFACE -fuse-ld=lld)
endif ()
unset (LD_VERSION)
endif()
if (assert)
foreach (var_ CMAKE_C_FLAGS_RELEASE CMAKE_CXX_FLAGS_RELEASE)
STRING (REGEX REPLACE "[-/]DNDEBUG" "" ${var_} "${${var_}}")
endforeach ()
endif ()
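The assert/profile handling above is easiest to see from the configure side. These invocations are illustrative only; the option names are the cache variables this file reads, and the source path is a placeholder.

# Release build that keeps assert() active (the loop above strips -DNDEBUG):
#   cmake -DCMAKE_BUILD_TYPE=Release -Dassert=ON <source-dir>
# profile build without assert gets NDEBUG re-added via the generator expression:
#   cmake -Dprofile=ON <source-dir>
# warnings-as-errors plus extra warnings:
#   cmake -Dwerr=ON -Dwextra=ON <source-dir>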


@@ -0,0 +1,967 @@
#[===================================================================[
xrpl_core
core functionality, useable by some client software perhaps
#]===================================================================]
file (GLOB_RECURSE rb_headers
src/ripple/beast/*.h
src/ripple/beast/*.hpp)
add_library (xrpl_core
${rb_headers}) ## headers added here for benefit of IDEs
if (unity)
set_target_properties(xrpl_core PROPERTIES UNITY_BUILD ON)
endif ()
#[===============================[
beast/legacy FILES:
TODO: review these sources for removal or replacement
#]===============================]
target_sources (xrpl_core PRIVATE
src/ripple/beast/core/CurrentThreadName.cpp
src/ripple/beast/core/SemanticVersion.cpp
src/ripple/beast/hash/impl/xxhash.cpp
src/ripple/beast/insight/impl/Collector.cpp
src/ripple/beast/insight/impl/Groups.cpp
src/ripple/beast/insight/impl/Hook.cpp
src/ripple/beast/insight/impl/Metric.cpp
src/ripple/beast/insight/impl/NullCollector.cpp
src/ripple/beast/insight/impl/StatsDCollector.cpp
src/ripple/beast/net/impl/IPAddressConversion.cpp
src/ripple/beast/net/impl/IPAddressV4.cpp
src/ripple/beast/net/impl/IPAddressV6.cpp
src/ripple/beast/net/impl/IPEndpoint.cpp
src/ripple/beast/utility/src/beast_Journal.cpp
src/ripple/beast/utility/src/beast_PropertyStream.cpp)
#[===============================[
core sources
#]===============================]
target_sources (xrpl_core PRIVATE
#[===============================[
main sources:
subdir: basics (partial)
#]===============================]
src/ripple/basics/impl/base64.cpp
src/ripple/basics/impl/contract.cpp
src/ripple/basics/impl/CountedObject.cpp
src/ripple/basics/impl/FileUtilities.cpp
src/ripple/basics/impl/IOUAmount.cpp
src/ripple/basics/impl/Log.cpp
src/ripple/basics/impl/strHex.cpp
src/ripple/basics/impl/StringUtilities.cpp
#[===============================[
main sources:
subdir: json
#]===============================]
src/ripple/json/impl/JsonPropertyStream.cpp
src/ripple/json/impl/Object.cpp
src/ripple/json/impl/Output.cpp
src/ripple/json/impl/Writer.cpp
src/ripple/json/impl/json_reader.cpp
src/ripple/json/impl/json_value.cpp
src/ripple/json/impl/json_valueiterator.cpp
src/ripple/json/impl/json_writer.cpp
src/ripple/json/impl/to_string.cpp
#[===============================[
main sources:
subdir: protocol
#]===============================]
src/ripple/protocol/impl/AccountID.cpp
src/ripple/protocol/impl/Book.cpp
src/ripple/protocol/impl/BuildInfo.cpp
src/ripple/protocol/impl/ErrorCodes.cpp
src/ripple/protocol/impl/Feature.cpp
src/ripple/protocol/impl/Indexes.cpp
src/ripple/protocol/impl/InnerObjectFormats.cpp
src/ripple/protocol/impl/Issue.cpp
src/ripple/protocol/impl/Keylet.cpp
src/ripple/protocol/impl/LedgerFormats.cpp
src/ripple/protocol/impl/PublicKey.cpp
src/ripple/protocol/impl/Quality.cpp
src/ripple/protocol/impl/Rate2.cpp
src/ripple/protocol/impl/SField.cpp
src/ripple/protocol/impl/SOTemplate.cpp
src/ripple/protocol/impl/STAccount.cpp
src/ripple/protocol/impl/STAmount.cpp
src/ripple/protocol/impl/STArray.cpp
src/ripple/protocol/impl/STBase.cpp
src/ripple/protocol/impl/STBlob.cpp
src/ripple/protocol/impl/STInteger.cpp
src/ripple/protocol/impl/STLedgerEntry.cpp
src/ripple/protocol/impl/STObject.cpp
src/ripple/protocol/impl/STParsedJSON.cpp
src/ripple/protocol/impl/STPathSet.cpp
src/ripple/protocol/impl/STTx.cpp
src/ripple/protocol/impl/STValidation.cpp
src/ripple/protocol/impl/STVar.cpp
src/ripple/protocol/impl/STVector256.cpp
src/ripple/protocol/impl/SecretKey.cpp
src/ripple/protocol/impl/Seed.cpp
src/ripple/protocol/impl/Serializer.cpp
src/ripple/protocol/impl/Sign.cpp
src/ripple/protocol/impl/TER.cpp
src/ripple/protocol/impl/TxFormats.cpp
src/ripple/protocol/impl/UintTypes.cpp
src/ripple/protocol/impl/digest.cpp
src/ripple/protocol/impl/tokens.cpp
#[===============================[
main sources:
subdir: crypto
#]===============================]
src/ripple/crypto/impl/GenerateDeterministicKey.cpp
src/ripple/crypto/impl/RFC1751.cpp
src/ripple/crypto/impl/csprng.cpp
src/ripple/crypto/impl/ec_key.cpp
src/ripple/crypto/impl/openssl.cpp)
add_library (Ripple::xrpl_core ALIAS xrpl_core)
target_include_directories (xrpl_core
PUBLIC
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src/ripple>
# this one is for beast/legacy files:
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src/beast/extras>
$<INSTALL_INTERFACE:include>)
target_compile_options (xrpl_core
PUBLIC
$<$<BOOL:${is_gcc}>:-Wno-maybe-uninitialized>)
target_link_libraries (xrpl_core
PUBLIC
OpenSSL::Crypto
Ripple::boost
Ripple::syslibs
NIH::secp256k1
NIH::ed25519-donna
date::date
Ripple::opts)
#[=================================[
main/core headers installation
#]=================================]
install (
FILES
src/ripple/basics/base64.h
src/ripple/basics/Blob.h
src/ripple/basics/Buffer.h
src/ripple/basics/CountedObject.h
src/ripple/basics/FileUtilities.h
src/ripple/basics/IOUAmount.h
src/ripple/basics/LocalValue.h
src/ripple/basics/Log.h
src/ripple/basics/safe_cast.h
src/ripple/basics/Slice.h
src/ripple/basics/StringUtilities.h
src/ripple/basics/ToString.h
src/ripple/basics/UnorderedContainers.h
src/ripple/basics/XRPAmount.h
src/ripple/basics/algorithm.h
src/ripple/basics/base_uint.h
src/ripple/basics/chrono.h
src/ripple/basics/contract.h
src/ripple/basics/FeeUnits.h
src/ripple/basics/hardened_hash.h
src/ripple/basics/strHex.h
DESTINATION include/ripple/basics)
install (
FILES
src/ripple/crypto/GenerateDeterministicKey.h
src/ripple/crypto/RFC1751.h
src/ripple/crypto/csprng.h
DESTINATION include/ripple/crypto)
install (
FILES
src/ripple/crypto/impl/ec_key.h
src/ripple/crypto/impl/openssl.h
DESTINATION include/ripple/crypto/impl)
install (
FILES
src/ripple/json/JsonPropertyStream.h
src/ripple/json/Object.h
src/ripple/json/Output.h
src/ripple/json/Writer.h
src/ripple/json/json_forwards.h
src/ripple/json/json_reader.h
src/ripple/json/json_value.h
src/ripple/json/json_writer.h
src/ripple/json/to_string.h
DESTINATION include/ripple/json)
install (
FILES
src/ripple/json/impl/json_assert.h
DESTINATION include/ripple/json/impl)
install (
FILES
src/ripple/protocol/AccountID.h
src/ripple/protocol/AmountConversions.h
src/ripple/protocol/Book.h
src/ripple/protocol/BuildInfo.h
src/ripple/protocol/ErrorCodes.h
src/ripple/protocol/Feature.h
src/ripple/protocol/HashPrefix.h
src/ripple/protocol/Indexes.h
src/ripple/protocol/InnerObjectFormats.h
src/ripple/protocol/Issue.h
src/ripple/protocol/KeyType.h
src/ripple/protocol/Keylet.h
src/ripple/protocol/KnownFormats.h
src/ripple/protocol/LedgerFormats.h
src/ripple/protocol/Protocol.h
src/ripple/protocol/PublicKey.h
src/ripple/protocol/Quality.h
src/ripple/protocol/Rate.h
src/ripple/protocol/SField.h
src/ripple/protocol/SOTemplate.h
src/ripple/protocol/STAccount.h
src/ripple/protocol/STAmount.h
src/ripple/protocol/STArray.h
src/ripple/protocol/STBase.h
src/ripple/protocol/STBitString.h
src/ripple/protocol/STBlob.h
src/ripple/protocol/STExchange.h
src/ripple/protocol/STInteger.h
src/ripple/protocol/STLedgerEntry.h
src/ripple/protocol/STObject.h
src/ripple/protocol/STParsedJSON.h
src/ripple/protocol/STPathSet.h
src/ripple/protocol/STTx.h
src/ripple/protocol/STValidation.h
src/ripple/protocol/STVector256.h
src/ripple/protocol/SecretKey.h
src/ripple/protocol/Seed.h
src/ripple/protocol/Serializer.h
src/ripple/protocol/Sign.h
src/ripple/protocol/SystemParameters.h
src/ripple/protocol/TER.h
src/ripple/protocol/TxFlags.h
src/ripple/protocol/TxFormats.h
src/ripple/protocol/UintTypes.h
src/ripple/protocol/digest.h
src/ripple/protocol/jss.h
src/ripple/protocol/tokens.h
DESTINATION include/ripple/protocol)
install (
FILES
src/ripple/protocol/impl/STVar.h
src/ripple/protocol/impl/secp256k1.h
DESTINATION include/ripple/protocol/impl)
#[===================================[
beast/legacy headers installation
#]===================================]
install (
FILES
src/ripple/beast/clock/abstract_clock.h
src/ripple/beast/clock/basic_seconds_clock.h
src/ripple/beast/clock/manual_clock.h
DESTINATION include/ripple/beast/clock)
install (
FILES
src/ripple/beast/core/LexicalCast.h
src/ripple/beast/core/List.h
src/ripple/beast/core/SemanticVersion.h
DESTINATION include/ripple/beast/core)
install (
FILES
src/ripple/beast/crypto/detail/mac_facade.h
src/ripple/beast/crypto/detail/ripemd_context.h
src/ripple/beast/crypto/detail/sha2_context.h
src/ripple/beast/crypto/ripemd.h
src/ripple/beast/crypto/secure_erase.h
src/ripple/beast/crypto/sha2.h
DESTINATION include/ripple/beast/crypto)
install (
FILES
src/ripple/beast/cxx17/type_traits.h
DESTINATION include/ripple/beast/cxx17)
install (
FILES
src/ripple/beast/hash/endian.h
src/ripple/beast/hash/hash_append.h
src/ripple/beast/hash/meta.h
src/ripple/beast/hash/uhash.h
src/ripple/beast/hash/xxhasher.h
DESTINATION include/ripple/beast/hash)
install (
FILES src/ripple/beast/hash/impl/xxhash.h
DESTINATION include/ripple/beast/hash/impl)
install (
FILES
src/ripple/beast/rfc2616.h
src/ripple/beast/type_name.h
src/ripple/beast/unit_test.h
src/ripple/beast/xor_shift_engine.h
DESTINATION include/ripple/beast)
install (
FILES
src/ripple/beast/utility/Journal.h
src/ripple/beast/utility/PropertyStream.h
src/ripple/beast/utility/Zero.h
src/ripple/beast/utility/rngfill.h
DESTINATION include/ripple/beast/utility)
# WARNING!! -- horrible levelization ahead
# (these files should be isolated or moved...but
# unfortunately unit_test.h above creates this dependency)
install (
FILES
src/beast/extras/beast/unit_test/amount.hpp
src/beast/extras/beast/unit_test/dstream.hpp
src/beast/extras/beast/unit_test/global_suites.hpp
src/beast/extras/beast/unit_test/match.hpp
src/beast/extras/beast/unit_test/recorder.hpp
src/beast/extras/beast/unit_test/reporter.hpp
src/beast/extras/beast/unit_test/results.hpp
src/beast/extras/beast/unit_test/runner.hpp
src/beast/extras/beast/unit_test/suite.hpp
src/beast/extras/beast/unit_test/suite_info.hpp
src/beast/extras/beast/unit_test/suite_list.hpp
src/beast/extras/beast/unit_test/thread.hpp
DESTINATION include/beast/unit_test)
install (
FILES
src/beast/extras/beast/unit_test/detail/const_container.hpp
DESTINATION include/beast/unit_test/detail)
#[===================================================================[
rippled executable
#]===================================================================]
#[=========================================================[
this one header is added as source just to keep older
versions of cmake happy. cmake 3.10+ allows
add_executable with no sources
#]=========================================================]
add_executable (rippled src/ripple/app/main/Application.h)
if (unity)
set_target_properties(rippled PROPERTIES UNITY_BUILD ON)
endif ()
target_sources (rippled PRIVATE
#[===============================[
main sources:
subdir: app
#]===============================]
src/ripple/app/consensus/RCLConsensus.cpp
src/ripple/app/consensus/RCLCxPeerPos.cpp
src/ripple/app/consensus/RCLValidations.cpp
src/ripple/app/ledger/AcceptedLedger.cpp
src/ripple/app/ledger/AcceptedLedgerTx.cpp
src/ripple/app/ledger/AccountStateSF.cpp
src/ripple/app/ledger/BookListeners.cpp
src/ripple/app/ledger/ConsensusTransSetSF.cpp
src/ripple/app/ledger/Ledger.cpp
src/ripple/app/ledger/LedgerHistory.cpp
src/ripple/app/ledger/OrderBookDB.cpp
src/ripple/app/ledger/TransactionStateSF.cpp
src/ripple/app/ledger/impl/BuildLedger.cpp
src/ripple/app/ledger/impl/InboundLedger.cpp
src/ripple/app/ledger/impl/InboundLedgers.cpp
src/ripple/app/ledger/impl/InboundTransactions.cpp
src/ripple/app/ledger/impl/LedgerCleaner.cpp
src/ripple/app/ledger/impl/LedgerMaster.cpp
src/ripple/app/ledger/impl/LedgerReplay.cpp
src/ripple/app/ledger/impl/LedgerToJson.cpp
src/ripple/app/ledger/impl/LocalTxs.cpp
src/ripple/app/ledger/impl/OpenLedger.cpp
src/ripple/app/ledger/impl/TransactionAcquire.cpp
src/ripple/app/ledger/impl/TransactionMaster.cpp
src/ripple/app/main/Application.cpp
src/ripple/app/main/BasicApp.cpp
src/ripple/app/main/CollectorManager.cpp
src/ripple/app/main/GRPCServer.cpp
src/ripple/app/main/LoadManager.cpp
src/ripple/app/main/Main.cpp
src/ripple/app/main/NodeIdentity.cpp
src/ripple/app/main/NodeStoreScheduler.cpp
src/ripple/app/misc/CanonicalTXSet.cpp
src/ripple/app/misc/FeeVoteImpl.cpp
src/ripple/app/misc/HashRouter.cpp
src/ripple/app/misc/NetworkOPs.cpp
src/ripple/app/misc/SHAMapStoreImp.cpp
src/ripple/app/misc/impl/AccountTxPaging.cpp
src/ripple/app/misc/impl/AmendmentTable.cpp
src/ripple/app/misc/impl/LoadFeeTrack.cpp
src/ripple/app/misc/impl/Manifest.cpp
src/ripple/app/misc/impl/Transaction.cpp
src/ripple/app/misc/impl/TxQ.cpp
src/ripple/app/misc/impl/ValidatorKeys.cpp
src/ripple/app/misc/impl/ValidatorList.cpp
src/ripple/app/misc/impl/ValidatorSite.cpp
src/ripple/app/paths/AccountCurrencies.cpp
src/ripple/app/paths/Credit.cpp
src/ripple/app/paths/Flow.cpp
src/ripple/app/paths/PathRequest.cpp
src/ripple/app/paths/PathRequests.cpp
src/ripple/app/paths/Pathfinder.cpp
src/ripple/app/paths/RippleCalc.cpp
src/ripple/app/paths/RippleLineCache.cpp
src/ripple/app/paths/RippleState.cpp
src/ripple/app/paths/impl/BookStep.cpp
src/ripple/app/paths/impl/DirectStep.cpp
src/ripple/app/paths/impl/PaySteps.cpp
src/ripple/app/paths/impl/XRPEndpointStep.cpp
src/ripple/app/tx/impl/ApplyContext.cpp
src/ripple/app/tx/impl/BookTip.cpp
src/ripple/app/tx/impl/CancelCheck.cpp
src/ripple/app/tx/impl/CancelOffer.cpp
src/ripple/app/tx/impl/CancelTicket.cpp
src/ripple/app/tx/impl/CashCheck.cpp
src/ripple/app/tx/impl/Change.cpp
src/ripple/app/tx/impl/CreateCheck.cpp
src/ripple/app/tx/impl/CreateOffer.cpp
src/ripple/app/tx/impl/CreateTicket.cpp
src/ripple/app/tx/impl/DeleteAccount.cpp
src/ripple/app/tx/impl/DepositPreauth.cpp
src/ripple/app/tx/impl/Escrow.cpp
src/ripple/app/tx/impl/InvariantCheck.cpp
src/ripple/app/tx/impl/OfferStream.cpp
src/ripple/app/tx/impl/PayChan.cpp
src/ripple/app/tx/impl/Payment.cpp
src/ripple/app/tx/impl/SetAccount.cpp
src/ripple/app/tx/impl/SetRegularKey.cpp
src/ripple/app/tx/impl/SetSignerList.cpp
src/ripple/app/tx/impl/SetTrust.cpp
src/ripple/app/tx/impl/SignerEntries.cpp
src/ripple/app/tx/impl/Taker.cpp
src/ripple/app/tx/impl/Transactor.cpp
src/ripple/app/tx/impl/apply.cpp
src/ripple/app/tx/impl/applySteps.cpp
#[===============================[
main sources:
subdir: basics (partial)
#]===============================]
src/ripple/basics/impl/Archive.cpp
src/ripple/basics/impl/BasicConfig.cpp
src/ripple/basics/impl/PerfLogImp.cpp
src/ripple/basics/impl/ResolverAsio.cpp
src/ripple/basics/impl/Sustain.cpp
src/ripple/basics/impl/UptimeClock.cpp
src/ripple/basics/impl/make_SSLContext.cpp
src/ripple/basics/impl/mulDiv.cpp
#[===============================[
main sources:
subdir: conditions
#]===============================]
src/ripple/conditions/impl/Condition.cpp
src/ripple/conditions/impl/Fulfillment.cpp
src/ripple/conditions/impl/error.cpp
#[===============================[
main sources:
subdir: core
#]===============================]
src/ripple/core/impl/Config.cpp
src/ripple/core/impl/DatabaseCon.cpp
src/ripple/core/impl/Job.cpp
src/ripple/core/impl/JobQueue.cpp
src/ripple/core/impl/LoadEvent.cpp
src/ripple/core/impl/LoadMonitor.cpp
src/ripple/core/impl/SNTPClock.cpp
src/ripple/core/impl/SociDB.cpp
src/ripple/core/impl/Stoppable.cpp
src/ripple/core/impl/TimeKeeper.cpp
src/ripple/core/impl/Workers.cpp
#[===============================[
main sources:
subdir: consensus
#]===============================]
src/ripple/consensus/Consensus.cpp
#[===============================[
main sources:
subdir: ledger
#]===============================]
src/ripple/ledger/impl/ApplyStateTable.cpp
src/ripple/ledger/impl/ApplyView.cpp
src/ripple/ledger/impl/ApplyViewBase.cpp
src/ripple/ledger/impl/ApplyViewImpl.cpp
src/ripple/ledger/impl/BookDirs.cpp
src/ripple/ledger/impl/CachedSLEs.cpp
src/ripple/ledger/impl/CachedView.cpp
src/ripple/ledger/impl/CashDiff.cpp
src/ripple/ledger/impl/Directory.cpp
src/ripple/ledger/impl/OpenView.cpp
src/ripple/ledger/impl/PaymentSandbox.cpp
src/ripple/ledger/impl/RawStateTable.cpp
src/ripple/ledger/impl/ReadView.cpp
src/ripple/ledger/impl/TxMeta.cpp
src/ripple/ledger/impl/View.cpp
#[===============================[
main sources:
subdir: net
#]===============================]
src/ripple/net/impl/HTTPClient.cpp
src/ripple/net/impl/InfoSub.cpp
src/ripple/net/impl/RPCCall.cpp
src/ripple/net/impl/RPCErr.cpp
src/ripple/net/impl/RPCSub.cpp
src/ripple/net/impl/RegisterSSLCerts.cpp
src/ripple/net/impl/SSLHTTPDownloader.cpp
#[===============================[
main sources:
subdir: nodestore
#]===============================]
src/ripple/nodestore/backend/MemoryFactory.cpp
src/ripple/nodestore/backend/NuDBFactory.cpp
src/ripple/nodestore/backend/NullFactory.cpp
src/ripple/nodestore/backend/RocksDBFactory.cpp
src/ripple/nodestore/impl/BatchWriter.cpp
src/ripple/nodestore/impl/Database.cpp
src/ripple/nodestore/impl/DatabaseNodeImp.cpp
src/ripple/nodestore/impl/DatabaseRotatingImp.cpp
src/ripple/nodestore/impl/DatabaseShardImp.cpp
src/ripple/nodestore/impl/DecodedBlob.cpp
src/ripple/nodestore/impl/DummyScheduler.cpp
src/ripple/nodestore/impl/EncodedBlob.cpp
src/ripple/nodestore/impl/ManagerImp.cpp
src/ripple/nodestore/impl/NodeObject.cpp
src/ripple/nodestore/impl/Shard.cpp
#[===============================[
main sources:
subdir: overlay
#]===============================]
src/ripple/overlay/impl/Cluster.cpp
src/ripple/overlay/impl/ConnectAttempt.cpp
src/ripple/overlay/impl/Handshake.cpp
src/ripple/overlay/impl/Message.cpp
src/ripple/overlay/impl/OverlayImpl.cpp
src/ripple/overlay/impl/PeerImp.cpp
src/ripple/overlay/impl/PeerReservationTable.cpp
src/ripple/overlay/impl/PeerSet.cpp
src/ripple/overlay/impl/ProtocolVersion.cpp
src/ripple/overlay/impl/TrafficCount.cpp
#[===============================[
main sources:
subdir: peerfinder
#]===============================]
src/ripple/peerfinder/impl/Bootcache.cpp
src/ripple/peerfinder/impl/Endpoint.cpp
src/ripple/peerfinder/impl/PeerfinderConfig.cpp
src/ripple/peerfinder/impl/PeerfinderManager.cpp
src/ripple/peerfinder/impl/SlotImp.cpp
src/ripple/peerfinder/impl/SourceStrings.cpp
#[===============================[
main sources:
subdir: resource
#]===============================]
src/ripple/resource/impl/Charge.cpp
src/ripple/resource/impl/Consumer.cpp
src/ripple/resource/impl/Fees.cpp
src/ripple/resource/impl/ResourceManager.cpp
#[===============================[
main sources:
subdir: rpc
#]===============================]
src/ripple/rpc/handlers/AccountChannels.cpp
src/ripple/rpc/handlers/AccountCurrenciesHandler.cpp
src/ripple/rpc/handlers/AccountInfo.cpp
src/ripple/rpc/handlers/AccountLines.cpp
src/ripple/rpc/handlers/AccountObjects.cpp
src/ripple/rpc/handlers/AccountOffers.cpp
src/ripple/rpc/handlers/AccountTx.cpp
src/ripple/rpc/handlers/AccountTxOld.cpp
src/ripple/rpc/handlers/AccountTxSwitch.cpp
src/ripple/rpc/handlers/BlackList.cpp
src/ripple/rpc/handlers/BookOffers.cpp
src/ripple/rpc/handlers/CanDelete.cpp
src/ripple/rpc/handlers/Connect.cpp
src/ripple/rpc/handlers/ConsensusInfo.cpp
src/ripple/rpc/handlers/CrawlShards.cpp
src/ripple/rpc/handlers/DepositAuthorized.cpp
src/ripple/rpc/handlers/DownloadShard.cpp
src/ripple/rpc/handlers/Feature1.cpp
src/ripple/rpc/handlers/Fee1.cpp
src/ripple/rpc/handlers/FetchInfo.cpp
src/ripple/rpc/handlers/GatewayBalances.cpp
src/ripple/rpc/handlers/GetCounts.cpp
src/ripple/rpc/handlers/LedgerAccept.cpp
src/ripple/rpc/handlers/LedgerCleanerHandler.cpp
src/ripple/rpc/handlers/LedgerClosed.cpp
src/ripple/rpc/handlers/LedgerCurrent.cpp
src/ripple/rpc/handlers/LedgerData.cpp
src/ripple/rpc/handlers/LedgerEntry.cpp
src/ripple/rpc/handlers/LedgerHandler.cpp
src/ripple/rpc/handlers/LedgerHeader.cpp
src/ripple/rpc/handlers/LedgerRequest.cpp
src/ripple/rpc/handlers/LogLevel.cpp
src/ripple/rpc/handlers/LogRotate.cpp
src/ripple/rpc/handlers/Manifest.cpp
src/ripple/rpc/handlers/NoRippleCheck.cpp
src/ripple/rpc/handlers/OwnerInfo.cpp
src/ripple/rpc/handlers/PathFind.cpp
src/ripple/rpc/handlers/PayChanClaim.cpp
src/ripple/rpc/handlers/Peers.cpp
src/ripple/rpc/handlers/Ping.cpp
src/ripple/rpc/handlers/Print.cpp
src/ripple/rpc/handlers/Random.cpp
src/ripple/rpc/handlers/Reservations.cpp
src/ripple/rpc/handlers/RipplePathFind.cpp
src/ripple/rpc/handlers/ServerInfo.cpp
src/ripple/rpc/handlers/ServerState.cpp
src/ripple/rpc/handlers/SignFor.cpp
src/ripple/rpc/handlers/SignHandler.cpp
src/ripple/rpc/handlers/Stop.cpp
src/ripple/rpc/handlers/Submit.cpp
src/ripple/rpc/handlers/SubmitMultiSigned.cpp
src/ripple/rpc/handlers/Subscribe.cpp
src/ripple/rpc/handlers/TransactionEntry.cpp
src/ripple/rpc/handlers/Tx.cpp
src/ripple/rpc/handlers/TxHistory.cpp
src/ripple/rpc/handlers/UnlList.cpp
src/ripple/rpc/handlers/Unsubscribe.cpp
src/ripple/rpc/handlers/ValidationCreate.cpp
src/ripple/rpc/handlers/ValidatorInfo.cpp
src/ripple/rpc/handlers/ValidatorListSites.cpp
src/ripple/rpc/handlers/Validators.cpp
src/ripple/rpc/handlers/WalletPropose.cpp
src/ripple/rpc/impl/DeliveredAmount.cpp
src/ripple/rpc/impl/Handler.cpp
src/ripple/rpc/impl/GRPCHelpers.cpp
src/ripple/rpc/impl/LegacyPathFind.cpp
src/ripple/rpc/impl/RPCHandler.cpp
src/ripple/rpc/impl/RPCHelpers.cpp
src/ripple/rpc/impl/Role.cpp
src/ripple/rpc/impl/ServerHandlerImp.cpp
src/ripple/rpc/impl/ShardArchiveHandler.cpp
src/ripple/rpc/impl/Status.cpp
src/ripple/rpc/impl/TransactionSign.cpp
#[===============================[
main sources:
subdir: server
#]===============================]
src/ripple/server/impl/JSONRPCUtil.cpp
src/ripple/server/impl/Port.cpp
#[===============================[
main sources:
subdir: shamap
#]===============================]
src/ripple/shamap/impl/SHAMap.cpp
src/ripple/shamap/impl/SHAMapDelta.cpp
src/ripple/shamap/impl/SHAMapItem.cpp
src/ripple/shamap/impl/SHAMapMissingNode.cpp
src/ripple/shamap/impl/SHAMapNodeID.cpp
src/ripple/shamap/impl/SHAMapSync.cpp
src/ripple/shamap/impl/SHAMapTreeNode.cpp
#[===============================[
test sources:
subdir: app
#]===============================]
src/test/app/AccountDelete_test.cpp
src/test/app/AccountTxPaging_test.cpp
src/test/app/AmendmentTable_test.cpp
src/test/app/Check_test.cpp
src/test/app/CrossingLimits_test.cpp
src/test/app/DeliverMin_test.cpp
src/test/app/DepositAuth_test.cpp
src/test/app/Discrepancy_test.cpp
src/test/app/Escrow_test.cpp
src/test/app/FeeVote_test.cpp
src/test/app/Flow_test.cpp
src/test/app/Freeze_test.cpp
src/test/app/HashRouter_test.cpp
src/test/app/LedgerHistory_test.cpp
src/test/app/LedgerLoad_test.cpp
src/test/app/LedgerReplay_test.cpp
src/test/app/LoadFeeTrack_test.cpp
src/test/app/Manifest_test.cpp
src/test/app/MultiSign_test.cpp
src/test/app/OfferStream_test.cpp
src/test/app/Offer_test.cpp
src/test/app/OversizeMeta_test.cpp
src/test/app/Path_test.cpp
src/test/app/PayChan_test.cpp
src/test/app/PayStrand_test.cpp
src/test/app/PseudoTx_test.cpp
src/test/app/RCLCensorshipDetector_test.cpp
src/test/app/RCLValidations_test.cpp
src/test/app/Regression_test.cpp
src/test/app/SHAMapStore_test.cpp
src/test/app/SetAuth_test.cpp
src/test/app/SetRegularKey_test.cpp
src/test/app/SetTrust_test.cpp
src/test/app/Taker_test.cpp
src/test/app/TheoreticalQuality_test.cpp
src/test/app/Ticket_test.cpp
src/test/app/Transaction_ordering_test.cpp
src/test/app/TrustAndBalance_test.cpp
src/test/app/TxQ_test.cpp
src/test/app/ValidatorKeys_test.cpp
src/test/app/ValidatorList_test.cpp
src/test/app/ValidatorSite_test.cpp
src/test/app/tx/apply_test.cpp
#[===============================[
test sources:
subdir: basics
#]===============================]
src/test/basics/Buffer_test.cpp
src/test/basics/DetectCrash_test.cpp
src/test/basics/FileUtilities_test.cpp
src/test/basics/IOUAmount_test.cpp
src/test/basics/KeyCache_test.cpp
src/test/basics/PerfLog_test.cpp
src/test/basics/RangeSet_test.cpp
src/test/basics/Slice_test.cpp
src/test/basics/StringUtilities_test.cpp
src/test/basics/TaggedCache_test.cpp
src/test/basics/XRPAmount_test.cpp
src/test/basics/base64_test.cpp
src/test/basics/base_uint_test.cpp
src/test/basics/contract_test.cpp
src/test/basics/FeeUnits_test.cpp
src/test/basics/hardened_hash_test.cpp
src/test/basics/mulDiv_test.cpp
src/test/basics/qalloc_test.cpp
src/test/basics/tagged_integer_test.cpp
#[===============================[
test sources:
subdir: beast
#]===============================]
src/test/beast/IPEndpoint_test.cpp
src/test/beast/LexicalCast_test.cpp
src/test/beast/SemanticVersion_test.cpp
src/test/beast/aged_associative_container_test.cpp
src/test/beast/beast_CurrentThreadName_test.cpp
src/test/beast/beast_Journal_test.cpp
src/test/beast/beast_PropertyStream_test.cpp
src/test/beast/beast_Zero_test.cpp
src/test/beast/beast_abstract_clock_test.cpp
src/test/beast/beast_basic_seconds_clock_test.cpp
src/test/beast/beast_io_latency_probe_test.cpp
src/test/beast/define_print.cpp
#[===============================[
test sources:
subdir: conditions
#]===============================]
src/test/conditions/PreimageSha256_test.cpp
#[===============================[
test sources:
subdir: consensus
#]===============================]
src/test/consensus/ByzantineFailureSim_test.cpp
src/test/consensus/Consensus_test.cpp
src/test/consensus/DistributedValidatorsSim_test.cpp
src/test/consensus/LedgerTiming_test.cpp
src/test/consensus/LedgerTrie_test.cpp
src/test/consensus/ScaleFreeSim_test.cpp
src/test/consensus/Validations_test.cpp
#[===============================[
test sources:
subdir: core
#]===============================]
src/test/core/ClosureCounter_test.cpp
src/test/core/Config_test.cpp
src/test/core/Coroutine_test.cpp
src/test/core/CryptoPRNG_test.cpp
src/test/core/JobQueue_test.cpp
src/test/core/SociDB_test.cpp
src/test/core/Stoppable_test.cpp
src/test/core/Workers_test.cpp
#[===============================[
test sources:
subdir: crypto
#]===============================]
src/test/crypto/Openssl_test.cpp
#[===============================[
test sources:
subdir: csf
#]===============================]
src/test/csf/BasicNetwork_test.cpp
src/test/csf/Digraph_test.cpp
src/test/csf/Histogram_test.cpp
src/test/csf/Scheduler_test.cpp
src/test/csf/impl/Sim.cpp
src/test/csf/impl/ledgers.cpp
#[===============================[
test sources:
subdir: json
#]===============================]
src/test/json/Object_test.cpp
src/test/json/Output_test.cpp
src/test/json/Writer_test.cpp
src/test/json/json_value_test.cpp
#[===============================[
test sources:
subdir: jtx
#]===============================]
src/test/jtx/Env_test.cpp
src/test/jtx/WSClient_test.cpp
src/test/jtx/impl/Account.cpp
src/test/jtx/impl/Env.cpp
src/test/jtx/impl/JSONRPCClient.cpp
src/test/jtx/impl/ManualTimeKeeper.cpp
src/test/jtx/impl/WSClient.cpp
src/test/jtx/impl/acctdelete.cpp
src/test/jtx/impl/account_txn_id.cpp
src/test/jtx/impl/amount.cpp
src/test/jtx/impl/balance.cpp
src/test/jtx/impl/check.cpp
src/test/jtx/impl/delivermin.cpp
src/test/jtx/impl/deposit.cpp
src/test/jtx/impl/envconfig.cpp
src/test/jtx/impl/fee.cpp
src/test/jtx/impl/flags.cpp
src/test/jtx/impl/invoice_id.cpp
src/test/jtx/impl/jtx_json.cpp
src/test/jtx/impl/last_ledger_sequence.cpp
src/test/jtx/impl/memo.cpp
src/test/jtx/impl/multisign.cpp
src/test/jtx/impl/offer.cpp
src/test/jtx/impl/owners.cpp
src/test/jtx/impl/paths.cpp
src/test/jtx/impl/pay.cpp
src/test/jtx/impl/quality2.cpp
src/test/jtx/impl/rate.cpp
src/test/jtx/impl/regkey.cpp
src/test/jtx/impl/sendmax.cpp
src/test/jtx/impl/seq.cpp
src/test/jtx/impl/sig.cpp
src/test/jtx/impl/tag.cpp
src/test/jtx/impl/ticket.cpp
src/test/jtx/impl/trust.cpp
src/test/jtx/impl/txflags.cpp
src/test/jtx/impl/utility.cpp
#[===============================[
test sources:
subdir: ledger
#]===============================]
src/test/ledger/BookDirs_test.cpp
src/test/ledger/CashDiff_test.cpp
src/test/ledger/Directory_test.cpp
src/test/ledger/Invariants_test.cpp
src/test/ledger/PaymentSandbox_test.cpp
src/test/ledger/PendingSaves_test.cpp
src/test/ledger/SkipList_test.cpp
src/test/ledger/View_test.cpp
#[===============================[
test sources:
subdir: net
#]===============================]
src/test/net/SSLHTTPDownloader_test.cpp
#[===============================[
test sources:
subdir: nodestore
#]===============================]
src/test/nodestore/Backend_test.cpp
src/test/nodestore/Basics_test.cpp
src/test/nodestore/Database_test.cpp
src/test/nodestore/Timing_test.cpp
src/test/nodestore/import_test.cpp
src/test/nodestore/varint_test.cpp
#[===============================[
test sources:
subdir: overlay
#]===============================]
src/test/overlay/ProtocolVersion_test.cpp
src/test/overlay/cluster_test.cpp
src/test/overlay/short_read_test.cpp
#[===============================[
test sources:
subdir: peerfinder
#]===============================]
src/test/peerfinder/Livecache_test.cpp
src/test/peerfinder/PeerFinder_test.cpp
#[===============================[
test sources:
subdir: protocol
#]===============================]
src/test/protocol/InnerObjectFormats_test.cpp
src/test/protocol/Issue_test.cpp
src/test/protocol/PublicKey_test.cpp
src/test/protocol/Quality_test.cpp
src/test/protocol/STAccount_test.cpp
src/test/protocol/STAmount_test.cpp
src/test/protocol/STObject_test.cpp
src/test/protocol/STTx_test.cpp
src/test/protocol/STValidation_test.cpp
src/test/protocol/SecretKey_test.cpp
src/test/protocol/Seed_test.cpp
src/test/protocol/TER_test.cpp
src/test/protocol/digest_test.cpp
src/test/protocol/types_test.cpp
#[===============================[
test sources:
subdir: resource
#]===============================]
src/test/resource/Logic_test.cpp
#[===============================[
test sources:
subdir: rpc
#]===============================]
src/test/rpc/AccountCurrencies_test.cpp
src/test/rpc/AccountInfo_test.cpp
src/test/rpc/AccountLinesRPC_test.cpp
src/test/rpc/AccountObjects_test.cpp
src/test/rpc/AccountOffers_test.cpp
src/test/rpc/AccountSet_test.cpp
src/test/rpc/AccountTx_test.cpp
src/test/rpc/AmendmentBlocked_test.cpp
src/test/rpc/Book_test.cpp
src/test/rpc/DepositAuthorized_test.cpp
src/test/rpc/DeliveredAmount_test.cpp
src/test/rpc/Feature_test.cpp
src/test/rpc/Fee_test.cpp
src/test/rpc/GatewayBalances_test.cpp
src/test/rpc/GetCounts_test.cpp
src/test/rpc/JSONRPC_test.cpp
src/test/rpc/KeyGeneration_test.cpp
src/test/rpc/LedgerClosed_test.cpp
src/test/rpc/LedgerData_test.cpp
src/test/rpc/LedgerRPC_test.cpp
src/test/rpc/LedgerRequestRPC_test.cpp
src/test/rpc/ManifestRPC_test.cpp
src/test/rpc/NoRippleCheck_test.cpp
src/test/rpc/NoRipple_test.cpp
src/test/rpc/OwnerInfo_test.cpp
src/test/rpc/Peers_test.cpp
src/test/rpc/Roles_test.cpp
src/test/rpc/RPCCall_test.cpp
src/test/rpc/RPCOverload_test.cpp
src/test/rpc/RobustTransaction_test.cpp
src/test/rpc/ServerInfo_test.cpp
src/test/rpc/Status_test.cpp
src/test/rpc/Submit_test.cpp
src/test/rpc/Subscribe_test.cpp
src/test/rpc/Transaction_test.cpp
src/test/rpc/TransactionEntry_test.cpp
src/test/rpc/TransactionHistory_test.cpp
src/test/rpc/Tx_test.cpp
src/test/rpc/ValidatorInfo_test.cpp
src/test/rpc/ValidatorRPC_test.cpp
src/test/rpc/Version_test.cpp
#[===============================[
test sources:
subdir: server
#]===============================]
src/test/server/ServerStatus_test.cpp
src/test/server/Server_test.cpp
#[===============================[
test sources:
subdir: shamap
#]===============================]
src/test/shamap/FetchPack_test.cpp
src/test/shamap/SHAMapSync_test.cpp
src/test/shamap/SHAMap_test.cpp
#[===============================[
test sources:
subdir: unit_test
#]===============================]
src/test/unit_test/multi_runner.cpp)
target_link_libraries (rippled
Ripple::boost
Ripple::opts
Ripple::libs
Ripple::xrpl_core
)
exclude_if_included (rippled)
# define a macro for tests that might need to
# be excluded or run differently in the CI environment
if (is_ci)
target_compile_definitions(rippled PRIVATE RIPPLED_RUNNING_IN_CI)
endif ()
if (CMAKE_VERSION VERSION_GREATER_EQUAL 3.16)
# any files that don't play well with unity should be added here
set_source_files_properties(
# this one seems to produce conflicts in beast teardown template methods:
src/test/rpc/ValidatorRPC_test.cpp
PROPERTIES SKIP_UNITY_BUILD_INCLUSION TRUE)
endif ()


@@ -0,0 +1,98 @@
#[===================================================================[
coverage report target
#]===================================================================]
if (coverage)
if (is_clang)
if (APPLE)
execute_process (COMMAND xcrun -f llvm-profdata
OUTPUT_VARIABLE LLVM_PROFDATA
OUTPUT_STRIP_TRAILING_WHITESPACE)
else ()
find_program (LLVM_PROFDATA llvm-profdata)
endif ()
if (NOT LLVM_PROFDATA)
message (WARNING "unable to find llvm-profdata - skipping coverage_report target")
endif ()
if (APPLE)
execute_process (COMMAND xcrun -f llvm-cov
OUTPUT_VARIABLE LLVM_COV
OUTPUT_STRIP_TRAILING_WHITESPACE)
else ()
find_program (LLVM_COV llvm-cov)
endif ()
if (NOT LLVM_COV)
message (WARNING "unable to find llvm-cov - skipping coverage_report target")
endif ()
set (extract_pattern "")
if (coverage_core_only)
set (extract_pattern "${CMAKE_SOURCE_DIR}/src/ripple/")
endif ()
if (LLVM_COV AND LLVM_PROFDATA)
add_custom_target (coverage_report
USES_TERMINAL
COMMAND ${CMAKE_COMMAND} -E echo "Generating coverage - results will be in ${CMAKE_BINARY_DIR}/coverage/index.html."
COMMAND ${CMAKE_COMMAND} -E echo "Running rippled tests."
COMMAND rippled --unittest$<$<BOOL:${coverage_test}>:=${coverage_test}> --quiet --unittest-log
COMMAND ${LLVM_PROFDATA}
merge -sparse default.profraw -o rip.profdata
COMMAND ${CMAKE_COMMAND} -E echo "Summary of coverage:"
COMMAND ${LLVM_COV}
report -instr-profile=rip.profdata
$<TARGET_FILE:rippled> ${extract_pattern}
# generate html report
COMMAND ${LLVM_COV}
show -format=html -output-dir=${CMAKE_BINARY_DIR}/coverage
-instr-profile=rip.profdata
$<TARGET_FILE:rippled> ${extract_pattern}
BYPRODUCTS coverage/index.html)
endif ()
elseif (is_gcc)
find_program (LCOV lcov)
if (NOT LCOV)
message (WARNING "unable to find lcov - skipping coverage_report target")
endif ()
find_program (GENHTML genhtml)
if (NOT GENHTML)
message (WARNING "unable to find genhtml - skipping coverage_report target")
endif ()
set (extract_pattern "*")
if (coverage_core_only)
set (extract_pattern "*/src/ripple/*")
endif ()
if (LCOV AND GENHTML)
add_custom_target (coverage_report
USES_TERMINAL
COMMAND ${CMAKE_COMMAND} -E echo "Generating coverage- results will be in ${CMAKE_BINARY_DIR}/coverage/index.html."
# create baseline info file
COMMAND ${LCOV}
--no-external -d "${CMAKE_SOURCE_DIR}" -c -d . -i -o baseline.info
| grep -v "ignoring data for external file"
# run tests
COMMAND ${CMAKE_COMMAND} -E echo "Running rippled tests for coverage report."
COMMAND rippled --unittest$<$<BOOL:${coverage_test}>:=${coverage_test}> --quiet --unittest-log
# Create test coverage data file
COMMAND ${LCOV}
--no-external -d "${CMAKE_SOURCE_DIR}" -c -d . -o tests.info
| grep -v "ignoring data for external file"
# Combine baseline and test coverage data
COMMAND ${LCOV}
-a baseline.info -a tests.info -o lcov-all.info
# extract our files
COMMAND ${LCOV}
-e lcov-all.info "${extract_pattern}" -o lcov.info
COMMAND ${CMAKE_COMMAND} -E echo "Summary of coverage:"
COMMAND ${LCOV} --summary lcov.info
# generate HTML report
COMMAND ${GENHTML}
-o ${CMAKE_BINARY_DIR}/coverage lcov.info
BYPRODUCTS coverage/index.html)
endif ()
endif ()
endif ()
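A hedged recipe for driving the coverage_report target defined above; the cache variables are the ones this file reads, the test-suite value is only an example, and a single-config generator is assumed.

#   cmake -DCMAKE_BUILD_TYPE=Debug -Dcoverage=ON <source-dir>
#   cmake -Dcoverage_core_only=ON .          # optional: limit the report to src/ripple
#   cmake -Dcoverage_test=SomeSuite .        # optional: forwarded as --unittest=SomeSuite
#   cmake --build . --target coverage_report
#   open <build-dir>/coverage/index.html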


@@ -0,0 +1,31 @@
#[===================================================================[
docs target (optional)
#]===================================================================]
find_package (Doxygen)
if (TARGET Doxygen::doxygen)
set (doc_srcs docs/source.dox)
file (GLOB_RECURSE other_docs docs/*.md)
list (APPEND doc_srcs "${other_docs}")
# read the source config and make a modified one
# that points the output files to our build directory
file (READ "${CMAKE_CURRENT_SOURCE_DIR}/docs/source.dox" dox_content)
string (REGEX REPLACE "[\t ]*OUTPUT_DIRECTORY[\t ]*=(.*)"
"OUTPUT_DIRECTORY=${CMAKE_BINARY_DIR}\n\\1"
new_config "${dox_content}")
file (WRITE "${CMAKE_BINARY_DIR}/source.dox" "${new_config}")
add_custom_target (docs
COMMAND "${DOXYGEN_EXECUTABLE}" "${CMAKE_BINARY_DIR}/source.dox"
BYPRODUCTS "${CMAKE_BINARY_DIR}/html_doc/index.html"
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}/docs"
SOURCES "${doc_srcs}")
if (is_multiconfig)
set_property (
SOURCE ${doc_srcs}
APPEND
PROPERTY HEADER_FILE_ONLY
true)
endif ()
else ()
message (STATUS "doxygen executable not found -- skipping docs target")
endif ()
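When doxygen is found, the target is driven like any other; a brief, assumed invocation:

#   cmake --build . --target docs
#   open <build-dir>/html_doc/index.html     # per the BYPRODUCTS above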


@@ -0,0 +1,62 @@
#[===================================================================[
install stuff
#]===================================================================]
install (
TARGETS
ed25519-donna
common
opts
ripple_syslibs
ripple_boost
xrpl_core
EXPORT RippleExports
LIBRARY DESTINATION lib
ARCHIVE DESTINATION lib
RUNTIME DESTINATION bin
INCLUDES DESTINATION include)
if(${INSTALL_SECP256K1})
install (
TARGETS
secp256k1
EXPORT RippleExports
LIBRARY DESTINATION lib
ARCHIVE DESTINATION lib
RUNTIME DESTINATION bin
INCLUDES DESTINATION include)
endif()
install (EXPORT RippleExports
FILE RippleTargets.cmake
NAMESPACE Ripple::
DESTINATION lib/cmake/ripple)
include (CMakePackageConfigHelpers)
write_basic_package_version_file (
RippleConfigVersion.cmake
VERSION ${rippled_version}
COMPATIBILITY SameMajorVersion)
if (is_root_project)
install (TARGETS rippled RUNTIME DESTINATION bin)
set_target_properties(rippled PROPERTIES INSTALL_RPATH_USE_LINK_PATH ON)
install (
FILES
${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/RippleConfig.cmake
${CMAKE_CURRENT_BINARY_DIR}/RippleConfigVersion.cmake
DESTINATION lib/cmake/ripple)
# sample configs should not overwrite existing files
# install if-not-exists workaround as suggested by
# https://cmake.org/Bug/view.php?id=12646
install(CODE "
macro (copy_if_not_exists SRC DEST NEWNAME)
if (NOT EXISTS \"\$ENV{DESTDIR}\${CMAKE_INSTALL_PREFIX}/\${DEST}/\${NEWNAME}\")
file (INSTALL FILE_PERMISSIONS OWNER_READ OWNER_WRITE DESTINATION \"\${CMAKE_INSTALL_PREFIX}/\${DEST}\" FILES \"\${SRC}\" RENAME \"\${NEWNAME}\")
else ()
message (\"-- Skipping : \$ENV{DESTDIR}\${CMAKE_INSTALL_PREFIX}/\${DEST}/\${NEWNAME}\")
endif ()
endmacro()
copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/rippled-example.cfg\" etc rippled.cfg)
copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/validators-example.txt\" etc validators.txt)
")
endif ()
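Downstream projects can consume the installed exports roughly as sketched below; the consumer project and target names are placeholders, and CMAKE_PREFIX_PATH (or Ripple_DIR) must point at the install prefix so lib/cmake/ripple/RippleConfig.cmake is found.

cmake_minimum_required (VERSION 3.9)
project (xrpl_consumer CXX)
find_package (Ripple REQUIRED)   # also pulls in Boost via find_dependency
add_executable (xrpl_consumer main.cpp)
target_link_libraries (xrpl_consumer PRIVATE Ripple::xrpl_core)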


@@ -0,0 +1,108 @@
#[===================================================================[
rippled compile options/settings via an interface library
#]===================================================================]
add_library (opts INTERFACE)
add_library (Ripple::opts ALIAS opts)
target_compile_definitions (opts
INTERFACE
BOOST_ASIO_DISABLE_HANDLER_TYPE_REQUIREMENTS
$<$<BOOL:${boost_show_deprecated}>:
BOOST_ASIO_NO_DEPRECATED
BOOST_FILESYSTEM_NO_DEPRECATED
>
$<$<NOT:$<BOOL:${boost_show_deprecated}>>:
BOOST_COROUTINES_NO_DEPRECATION_WARNING
BOOST_BEAST_ALLOW_DEPRECATED
BOOST_FILESYSTEM_DEPRECATED
>
$<$<BOOL:${beast_hashers}>:
USE_BEAST_HASHER
>
$<$<BOOL:${beast_no_unit_test_inline}>:BEAST_NO_UNIT_TEST_INLINE=1>
$<$<BOOL:${beast_disable_autolink}>:BEAST_DONT_AUTOLINK_TO_WIN32_LIBRARIES=1>
$<$<BOOL:${single_io_service_thread}>:RIPPLE_SINGLE_IO_SERVICE_THREAD=1>
# doesn't currently compile ? :
$<$<BOOL:${verify_nodeobject_keys}>:RIPPLE_VERIFY_NODEOBJECT_KEYS=1>)
target_compile_options (opts
INTERFACE
$<$<AND:$<BOOL:${is_gcc}>,$<COMPILE_LANGUAGE:CXX>>:-Wsuggest-override>
$<$<BOOL:${perf}>:-fno-omit-frame-pointer>
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-fprofile-arcs -ftest-coverage>
$<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-fprofile-instr-generate -fcoverage-mapping>
$<$<BOOL:${profile}>:-pg>
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${profile}>>:-p>)
target_link_libraries (opts
INTERFACE
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-fprofile-arcs -ftest-coverage>
$<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-fprofile-instr-generate -fcoverage-mapping>
$<$<BOOL:${profile}>:-pg>
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${profile}>>:-p>)
if (jemalloc)
if (static)
set(JEMALLOC_USE_STATIC ON CACHE BOOL "" FORCE)
endif ()
find_package (jemalloc REQUIRED)
target_compile_definitions (opts INTERFACE PROFILE_JEMALLOC)
target_include_directories (opts SYSTEM INTERFACE ${JEMALLOC_INCLUDE_DIRS})
target_link_libraries (opts INTERFACE ${JEMALLOC_LIBRARIES})
get_filename_component (JEMALLOC_LIB_PATH ${JEMALLOC_LIBRARIES} DIRECTORY)
## TODO see if we can use the BUILD_RPATH target property (is it transitive?)
set (CMAKE_BUILD_RPATH ${CMAKE_BUILD_RPATH} ${JEMALLOC_LIB_PATH})
endif ()
if (san)
target_compile_options (opts
INTERFACE
# sanitizers recommend minimum of -O1 for reasonable performance
$<$<CONFIG:Debug>:-O1>
${SAN_FLAG}
-fno-omit-frame-pointer)
target_compile_definitions (opts
INTERFACE
$<$<STREQUAL:${san},address>:SANITIZER=ASAN>
$<$<STREQUAL:${san},thread>:SANITIZER=TSAN>
$<$<STREQUAL:${san},memory>:SANITIZER=MSAN>
$<$<STREQUAL:${san},undefined>:SANITIZER=UBSAN>)
target_link_libraries (opts INTERFACE ${SAN_FLAG} ${SAN_LIB})
endif ()
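For context, san is a single value matched against the four strings above; SAN_FLAG and SAN_LIB are derived elsewhere, so the configure line below is illustrative only.

#   cmake -DCMAKE_BUILD_TYPE=Debug -Dsan=address <source-dir>
#   (thread, memory and undefined are the other recognized values)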
#[===================================================================[
rippled transitive library deps via an interface library
#]===================================================================]
add_library (ripple_syslibs INTERFACE)
add_library (Ripple::syslibs ALIAS ripple_syslibs)
target_link_libraries (ripple_syslibs
INTERFACE
$<$<BOOL:${MSVC}>:
legacy_stdio_definitions.lib
Shlwapi
kernel32
user32
gdi32
winspool
comdlg32
advapi32
shell32
ole32
oleaut32
uuid
odbc32
odbccp32
crypt32
>
$<$<NOT:$<BOOL:${MSVC}>>:dl>
$<$<NOT:$<OR:$<BOOL:${MSVC}>,$<BOOL:${APPLE}>>>:rt>)
if (NOT MSVC)
set (THREADS_PREFER_PTHREAD_FLAG ON)
find_package (Threads)
target_link_libraries (ripple_syslibs INTERFACE Threads::Threads)
endif ()
add_library (ripple_libs INTERFACE)
add_library (Ripple::libs ALIAS ripple_libs)
target_link_libraries (ripple_libs INTERFACE Ripple::syslibs)


@@ -0,0 +1,39 @@
#[===================================================================[
multiconfig misc
#]===================================================================]
if (is_multiconfig)
# This code finds all source files in the src subdirectory for inclusion
# in the IDE file tree as non-compiled sources. Since this file list will
# have some overlap with files we have already added to our targets to
# be compiled, we explicitly remove any of these target source files from
# this list.
file (GLOB_RECURSE all_sources RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}
CONFIGURE_DEPENDS
src/*.* Builds/*.md docs/*.md src/*.md Builds/*.cmake)
file(GLOB md_files RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} CONFIGURE_DEPENDS
*.md)
LIST(APPEND all_sources ${md_files})
foreach (_target secp256k1 ed25519-donna pbufs xrpl_core rippled)
get_target_property (_type ${_target} TYPE)
if(_type STREQUAL "INTERFACE_LIBRARY")
continue()
endif()
get_target_property (_src ${_target} SOURCES)
list (REMOVE_ITEM all_sources ${_src})
endforeach ()
target_sources (rippled PRIVATE ${all_sources})
set_property (
SOURCE ${all_sources}
APPEND
PROPERTY HEADER_FILE_ONLY true)
if (MSVC)
set_property(
DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
PROPERTY VS_STARTUP_PROJECT rippled)
endif ()
group_sources(src)
group_sources(docs)
group_sources(Builds)
endif ()


@@ -0,0 +1,33 @@
#[===================================================================[
NIH prefix path: this is where we will download
and build any ExternalProjects, and they will hopefully
survive across build directory deletion (manual cleans)
#]===================================================================]
string (REGEX REPLACE "[ \\/%]+" "_" gen_for_path ${CMAKE_GENERATOR})
string (TOLOWER ${gen_for_path} gen_for_path)
# HACK: trying to shorten paths for windows CI (which hits 260 MAXPATH easily)
# @see: https://issues.jenkins-ci.org/browse/JENKINS-38706?focusedCommentId=339847
string (REPLACE "visual_studio" "vs" gen_for_path ${gen_for_path})
if (NOT DEFINED NIH_CACHE_ROOT)
if (DEFINED ENV{NIH_CACHE_ROOT})
set (NIH_CACHE_ROOT $ENV{NIH_CACHE_ROOT})
else ()
set (NIH_CACHE_ROOT "${CMAKE_SOURCE_DIR}/.nih_c")
endif ()
endif ()
set (nih_cache_path
"${NIH_CACHE_ROOT}/${gen_for_path}/${CMAKE_CXX_COMPILER_ID}_${CMAKE_CXX_COMPILER_VERSION}")
if (NOT is_multiconfig)
set (nih_cache_path "${nih_cache_path}/${CMAKE_BUILD_TYPE}")
endif ()
file(TO_CMAKE_PATH "${nih_cache_path}" nih_cache_path)
message (STATUS "NIH-EP cache path: ${nih_cache_path}")
## two convenience variables:
set (ep_lib_prefix ${CMAKE_STATIC_LIBRARY_PREFIX})
set (ep_lib_suffix ${CMAKE_STATIC_LIBRARY_SUFFIX})
# this is a setting for FetchContent and needs to be
# a cache variable
# https://cmake.org/cmake/help/latest/module/FetchContent.html#populating-the-content
set (FETCHCONTENT_BASE_DIR ${nih_cache_path} CACHE STRING "" FORCE)
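The resulting layout, and the two override hooks read above, can be summarized with an assumed example (the generator and compiler strings are illustrative).

#   default:  <source-dir>/.nih_c/unix_makefiles/GNU_9.3.0/Debug
#   override: NIH_CACHE_ROOT=/ssd/nih cmake <source-dir>
#         or: cmake -DNIH_CACHE_ROOT=/ssd/nih <source-dir>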


@@ -0,0 +1,190 @@
#[===================================================================[
package/container targets - (optional)
#]===================================================================]
if (is_root_project)
if (NOT DOCKER)
find_program (DOCKER docker)
endif ()
if (DOCKER)
# if no container label is provided, use current git hash
git_hash (commit_hash)
if (NOT container_label)
set (container_label ${commit_hash})
endif ()
message (STATUS "using [${container_label}] as build container tag...")
file (MAKE_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/packages)
file (MAKE_DIRECTORY ${NIH_CACHE_ROOT}/pkgbuild)
if (is_linux)
execute_process (COMMAND id -u
OUTPUT_VARIABLE DOCKER_USER_ID
OUTPUT_STRIP_TRAILING_WHITESPACE)
message (STATUS "docker local user id: ${DOCKER_USER_ID}")
execute_process (COMMAND id -g
OUTPUT_VARIABLE DOCKER_GROUP_ID
OUTPUT_STRIP_TRAILING_WHITESPACE)
message (STATUS "docker local group id: ${DOCKER_GROUP_ID}")
endif ()
if (DOCKER_USER_ID AND DOCKER_GROUP_ID)
set(map_user TRUE)
endif ()
#[===================================================================[
rpm
#]===================================================================]
add_custom_target (rpm_container
docker build
--pull
--build-arg GIT_COMMIT=${commit_hash}
-t rippled-rpm-builder:${container_label}
$<$<BOOL:${rpm_cache_from}>:--cache-from=${rpm_cache_from}>
-f centos-builder/Dockerfile .
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/Builds/containers
VERBATIM
USES_TERMINAL
COMMAND_EXPAND_LISTS
SOURCES
Builds/containers/centos-builder/Dockerfile
Builds/containers/centos-builder/centos_setup.sh
Builds/containers/centos-builder/extras.sh
Builds/containers/shared/build_deps.sh
Builds/containers/shared/rippled.service
Builds/containers/shared/update_sources.sh
Builds/containers/shared/update-rippled.sh
Builds/containers/packaging/rpm/rippled.spec
Builds/containers/packaging/rpm/build_rpm.sh
bin/getRippledInfo
)
exclude_from_default (rpm_container)
add_custom_target (rpm
docker run
-e NIH_CACHE_ROOT=/opt/rippled_bld/pkg/.nih_c
-v ${NIH_CACHE_ROOT}/pkgbuild:/opt/rippled_bld/pkg/.nih_c
-v ${CMAKE_SOURCE_DIR}:/opt/rippled_bld/pkg/rippled
-v ${CMAKE_CURRENT_BINARY_DIR}/packages:/opt/rippled_bld/pkg/out
"$<$<BOOL:${map_user}>:--volume=/etc/passwd:/etc/passwd;--volume=/etc/group:/etc/group;--user=${DOCKER_USER_ID}:${DOCKER_GROUP_ID}>"
-t rippled-rpm-builder:${container_label}
/bin/bash -c "cp -fpu rippled/Builds/containers/packaging/rpm/build_rpm.sh . && ./build_rpm.sh"
VERBATIM
USES_TERMINAL
COMMAND_EXPAND_LISTS
SOURCES
Builds/containers/packaging/rpm/rippled.spec
)
exclude_from_default (rpm)
if (NOT have_package_container)
add_dependencies(rpm rpm_container)
endif ()
#[===================================================================[
dpkg
#]===================================================================]
# currently use ubuntu 16.04 as a base because it has one of
# the older libc versions among ubuntu and debian releases.
# we could change this in the future and build with some other
# deb-based system.
add_custom_target (dpkg_container
docker build
--pull
--build-arg DIST_TAG=16.04
--build-arg GIT_COMMIT=${commit_hash}
-t rippled-dpkg-builder:${container_label}
$<$<BOOL:${dpkg_cache_from}>:--cache-from=${dpkg_cache_from}>
-f ubuntu-builder/Dockerfile .
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/Builds/containers
VERBATIM
USES_TERMINAL
COMMAND_EXPAND_LISTS
SOURCES
Builds/containers/ubuntu-builder/Dockerfile
Builds/containers/ubuntu-builder/ubuntu_setup.sh
Builds/containers/shared/build_deps.sh
Builds/containers/shared/rippled.service
Builds/containers/shared/update_sources.sh
Builds/containers/shared/update-rippled.sh
Builds/containers/packaging/dpkg/build_dpkg.sh
Builds/containers/packaging/dpkg/debian/README.Debian
Builds/containers/packaging/dpkg/debian/conffiles
Builds/containers/packaging/dpkg/debian/control
Builds/containers/packaging/dpkg/debian/copyright
Builds/containers/packaging/dpkg/debian/dirs
Builds/containers/packaging/dpkg/debian/docs
Builds/containers/packaging/dpkg/debian/rippled-dev.install
Builds/containers/packaging/dpkg/debian/rippled.install
Builds/containers/packaging/dpkg/debian/rippled.links
Builds/containers/packaging/dpkg/debian/rippled.postinst
Builds/containers/packaging/dpkg/debian/rippled.postrm
Builds/containers/packaging/dpkg/debian/rippled.preinst
Builds/containers/packaging/dpkg/debian/rippled.prerm
Builds/containers/packaging/dpkg/debian/rules
bin/getRippledInfo
)
exclude_from_default (dpkg_container)
add_custom_target (dpkg
docker run
-e NIH_CACHE_ROOT=/opt/rippled_bld/pkg/.nih_c
-v ${NIH_CACHE_ROOT}/pkgbuild:/opt/rippled_bld/pkg/.nih_c
-v ${CMAKE_SOURCE_DIR}:/opt/rippled_bld/pkg/rippled
-v ${CMAKE_CURRENT_BINARY_DIR}/packages:/opt/rippled_bld/pkg/out
"$<$<BOOL:${map_user}>:--volume=/etc/passwd:/etc/passwd;--volume=/etc/group:/etc/group;--user=${DOCKER_USER_ID}:${DOCKER_GROUP_ID}>"
-t rippled-dpkg-builder:${container_label}
/bin/bash -c "cp -fpu rippled/Builds/containers/packaging/dpkg/build_dpkg.sh . && ./build_dpkg.sh"
VERBATIM
USES_TERMINAL
COMMAND_EXPAND_LISTS
SOURCES
Builds/containers/packaging/dpkg/debian/control
)
exclude_from_default (dpkg)
if (NOT have_package_container)
add_dependencies(dpkg dpkg_container)
endif ()
#[===================================================================[
ci container
#]===================================================================]
# we reuse the same ubuntu Dockerfile for our travis-ci image,
# but with a newer distro (18.04 vs 16.04).
#
# the following steps assume the github package registry, but it's possible to
# adapt them for docker hub or other registries.
#
# steps for publishing a new CI image when you make changes:
#
# mkdir bld.ci && cd bld.ci && cmake -Dpackages_only=ON -Dcontainer_label=CI_LATEST <path to rippled source>
# cmake --build . --target ci_container --verbose
# docker tag rippled-ci-builder:CI_LATEST <HUB REPO PATH>/rippled-ci-builder:YYYY-MM-DD
# (NOTE: change YYYY-MM-DD to match current date, or use a different
# tag/version scheme if you prefer)
# docker push <HUB REPO PATH>/rippled-ci-builder:YYYY-MM-DD
# (NOTE: <HUB REPO PATH> is probably your user or org name if using
# docker hub, or it might be something like
# docker.pkg.github.com/ripple/rippled if using the github pkg
# registry. for any registry, you will need to be logged-in via
# docker and have push access.)
#
# ...then change the DOCKER_IMAGE line in .travis.yml :
# - DOCKER_IMAGE="<HUB REPO PATH>/rippled-ci-builder:YYYY-MM-DD"
add_custom_target (ci_container
docker build
--pull
--build-arg DIST_TAG=18.04
--build-arg GIT_COMMIT=${commit_hash}
--build-arg CI_USE=true
-t rippled-ci-builder:${container_label}
$<$<BOOL:${ci_cache_from}>:--cache-from=${ci_cache_from}>
-f ubuntu-builder/Dockerfile .
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/Builds/containers
VERBATIM
USES_TERMINAL
COMMAND_EXPAND_LISTS
SOURCES
Builds/containers/ubuntu-builder/Dockerfile
Builds/containers/ubuntu-builder/ubuntu_setup.sh
Builds/containers/shared/build_deps.sh
)
exclude_from_default (ci_container)
else ()
message (STATUS "docker NOT found -- won't be able to build containers for packaging")
endif ()
endif ()

View File

@@ -0,0 +1,88 @@
#[===================================================================[
convenience variables and sanity checks
#]===================================================================]
if (NOT ep_procs)
ProcessorCount(ep_procs)
if (ep_procs GREATER 1)
# never use more than half of cores for EP builds
math (EXPR ep_procs "${ep_procs} / 2")
message (STATUS "Using ${ep_procs} cores for ExternalProject builds.")
endif ()
endif ()
get_property (is_multiconfig GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
if (is_multiconfig STREQUAL "NOTFOUND")
if (${CMAKE_GENERATOR} STREQUAL "Xcode" OR ${CMAKE_GENERATOR} MATCHES "^Visual Studio")
set (is_multiconfig TRUE)
endif ()
endif ()
set (CMAKE_CONFIGURATION_TYPES "Debug;Release" CACHE STRING "" FORCE)
if (NOT is_multiconfig)
if (NOT CMAKE_BUILD_TYPE)
message (STATUS "Build type not specified - defaulting to Release")
set (CMAKE_BUILD_TYPE Release CACHE STRING "build type" FORCE)
elseif (NOT (CMAKE_BUILD_TYPE STREQUAL Debug OR CMAKE_BUILD_TYPE STREQUAL Release))
# for simplicity, these are the only two config types we care about. Limiting
# the build types especially simplifies dealing with external project builds
message (FATAL_ERROR " *** Only Debug or Release build types are currently supported ***")
endif ()
endif ()
get_directory_property(has_parent PARENT_DIRECTORY)
if (has_parent)
set (is_root_project OFF)
else ()
set (is_root_project ON)
endif ()
if ("${CMAKE_CXX_COMPILER_ID}" MATCHES ".*Clang") # both Clang and AppleClang
set (is_clang TRUE)
if ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "Clang" AND
CMAKE_CXX_COMPILER_VERSION VERSION_LESS 7.0)
message (FATAL_ERROR "This project requires clang 7 or later")
endif ()
# TODO min AppleClang version check ?
elseif ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "GNU")
set (is_gcc TRUE)
if (CMAKE_CXX_COMPILER_VERSION VERSION_LESS 7.0)
message (FATAL_ERROR "This project requires GCC 7 or later")
endif ()
endif ()
if (CMAKE_GENERATOR STREQUAL "Xcode")
set (is_xcode TRUE)
endif ()
if (CMAKE_SYSTEM_NAME STREQUAL "Linux")
set (is_linux TRUE)
else ()
set (is_linux FALSE)
endif ()
if ("$ENV{CI}" STREQUAL "true" OR "$ENV{CONTINUOUS_INTEGRATION}" STREQUAL "true")
set (is_ci TRUE)
else ()
set (is_ci FALSE)
endif ()
# check for in-source build and fail
if ("${CMAKE_CURRENT_SOURCE_DIR}" STREQUAL "${CMAKE_BINARY_DIR}")
message (FATAL_ERROR "Builds (in-source) are not allowed in "
"${CMAKE_CURRENT_SOURCE_DIR}. Please remove CMakeCache.txt and the CMakeFiles "
"directory from ${CMAKE_CURRENT_SOURCE_DIR} and try building in a separate directory.")
endif ()
if ("${CMAKE_GENERATOR}" MATCHES "Visual Studio" AND
NOT ("${CMAKE_GENERATOR}" MATCHES .*Win64.*))
message (FATAL_ERROR
"Visual Studio 32-bit build is not supported. Use -G\"${CMAKE_GENERATOR} Win64\"")
endif ()
if (NOT CMAKE_SIZEOF_VOID_P EQUAL 8)
message (FATAL_ERROR "Rippled requires a 64 bit target architecture.\n"
"The most likely cause of this warning is trying to build rippled with a 32-bit OS.")
endif ()
if (APPLE AND NOT HOMEBREW)
find_program (HOMEBREW brew)
endif ()

View File

@@ -0,0 +1,129 @@
#[===================================================================[
declare user options/settings
#]===================================================================]
option (assert "Enables asserts, even in release builds" OFF)
option (unity "Creates a build using UNITY support in cmake. This is the default" ON)
if (unity)
if (CMAKE_VERSION VERSION_LESS 3.16)
message (WARNING "unity option only supported for with cmake 3.16+ (please upgrade)")
set (unity OFF CACHE BOOL "unity only available for cmake 3.16+" FORCE)
else ()
if (NOT is_ci)
set (CMAKE_UNITY_BUILD_BATCH_SIZE 15 CACHE STRING "")
endif ()
endif ()
endif ()
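For reference, CMake 3.16+ also allows the same unity machinery to be tuned per target rather than globally; an illustrative override (target names are hypothetical, not part of this build) would be:

# hypothetical: disable unity for one target, tune the batch size for another
set_target_properties (some_unit_tests PROPERTIES UNITY_BUILD OFF)
set_target_properties (some_library PROPERTIES UNITY_BUILD ON UNITY_BUILD_BATCH_SIZE 15)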
if (is_gcc OR is_clang)
option (coverage "Generates coverage info." OFF)
option (profile "Add profiling flags" OFF)
set (coverage_test "" CACHE STRING
"On gcc & clang, the specific unit test(s) to run for coverage. Default is all tests.")
if (coverage_test AND NOT coverage)
set (coverage ON CACHE BOOL "gcc/clang only" FORCE)
endif ()
option (coverage_core_only
"Include only src/ripple files when generating coverage report. \
Set to OFF to include all sources in coverage report."
ON)
option (wextra "compile with extra gcc/clang warnings enabled" ON)
else ()
set (profile OFF CACHE BOOL "gcc/clang only" FORCE)
set (coverage OFF CACHE BOOL "gcc/clang only" FORCE)
set (wextra OFF CACHE BOOL "gcc/clang only" FORCE)
endif ()
if (is_linux)
option (BUILD_SHARED_LIBS "build shared ripple libraries" OFF)
option (static "link protobuf, openssl, libc++, and boost statically" ON)
option (perf "Enables flags that assist with perf recording" OFF)
option (use_gold "enables detection of gold (binutils) linker" ON)
else ()
# we are not ready to allow shared-libs on windows because it would require
# export declarations. On macos it's more feasible, but static openssl
# produces odd linker errors, thus we disable shared lib builds for now.
set (BUILD_SHARED_LIBS OFF CACHE BOOL "build shared ripple libraries - OFF for win/macos" FORCE)
set (static ON CACHE BOOL "static link, linux only. ON for WIN/macos" FORCE)
set (perf OFF CACHE BOOL "perf flags, linux only" FORCE)
set (use_gold OFF CACHE BOOL "gold linker, linux only" FORCE)
endif ()
if (is_clang)
option (use_lld "enables detection of lld linker" ON)
else ()
set (use_lld OFF CACHE BOOL "try lld linker, clang only" FORCE)
endif ()
option (jemalloc "Enables jemalloc for heap profiling" OFF)
option (werr "treat warnings as errors" OFF)
option (local_protobuf
"Force a local build of protobuf instead of looking for an installed version." OFF)
option (local_grpc
"Force a local build of gRPC instead of looking for an installed version." OFF)
# this one is a string and therefore can't be an option
set (san "" CACHE STRING "On gcc & clang, add sanitizer instrumentation")
set_property (CACHE san PROPERTY STRINGS ";undefined;memory;address;thread")
if (san)
string (TOLOWER ${san} san)
set (SAN_FLAG "-fsanitize=${san}")
set (SAN_LIB "")
if (is_gcc)
if (san STREQUAL "address")
set (SAN_LIB "asan")
elseif (san STREQUAL "thread")
set (SAN_LIB "tsan")
elseif (san STREQUAL "memory")
set (SAN_LIB "msan")
elseif (san STREQUAL "undefined")
set (SAN_LIB "ubsan")
endif ()
endif ()
set (_saved_CRL ${CMAKE_REQUIRED_LIBRARIES})
set (CMAKE_REQUIRED_LIBRARIES "${SAN_FLAG};${SAN_LIB}")
check_cxx_compiler_flag (${SAN_FLAG} COMPILER_SUPPORTS_SAN)
set (CMAKE_REQUIRED_LIBRARIES ${_saved_CRL})
if (NOT COMPILER_SUPPORTS_SAN)
message (FATAL_ERROR "${san} sanitizer does not seem to be supported by your compiler")
endif ()
endif ()
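SAN_FLAG and SAN_LIB are only validated above; the actual application of the sanitizer to compile and link lines happens elsewhere in the build. A rough sketch of that step, assuming the shared interface target opts and typical companion flags, is:

# sketch only: attach sanitizer instrumentation to the interface options target
target_compile_options (opts INTERFACE ${SAN_FLAG} -fno-omit-frame-pointer)
target_link_libraries (opts INTERFACE ${SAN_FLAG} ${SAN_LIB})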
set (container_label "" CACHE STRING "tag to use for package building containers")
option (packages_only
"ONLY generate package building targets. This is special use-case and almost \
certainly not what you want. Use with caution as you won't be able to build \
any compiled targets locally." OFF)
option (have_package_container
"Sometimes you already have the tagged container you want to use for package \
building and you don't want docker to rebuild it. This flag will detach the \
dependency of the package build from the container build. It's an advanced \
use case and most likely you should not be touching this flag." OFF)
# the remaining options are obscure and rarely used
option (beast_no_unit_test_inline
"Prevents unit test definitions from being inserted into global table"
OFF)
# NOTE - THIS OPTION CURRENTLY DOES NOT COMPILE :
# TODO: fix or remove
option (verify_nodeobject_keys
"This verifies that the hash of node objects matches the payload. \
This check is expensive - use with caution."
OFF)
option (single_io_service_thread
"Restricts the number of threads calling io_service::run to one. \
This can be useful when debugging."
OFF)
option (boost_show_deprecated
"Allow boost to fail on deprecated usage. Only useful if you're trying\
to find deprecated calls."
OFF)
option (beast_hashers
"Use local implementations for sha/ripemd hashes (experimental, not recommended)"
OFF)
if (WIN32)
option (beast_disable_autolink "Disables autolinking of system libraries on WIN32" OFF)
else ()
set (beast_disable_autolink OFF CACHE BOOL "WIN32 only" FORCE)
endif ()
if (coverage)
message (STATUS "coverage build requested - forcing Debug build")
set (CMAKE_BUILD_TYPE Debug CACHE STRING "build type" FORCE)
endif ()
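Note that coverage here only forces a Debug build; the instrumentation flags themselves are added elsewhere. On gcc/clang this is typically equivalent to something like the following (a sketch, not necessarily the exact flags this project uses):

target_compile_options (opts INTERFACE -fprofile-arcs -ftest-coverage)
target_link_libraries (opts INTERFACE -fprofile-arcs -ftest-coverage)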

View File

@@ -0,0 +1,24 @@
option (validator_keys "Enables building of validator-keys-tool as a separate target (imported via FetchContent)" OFF)
if (validator_keys AND CMAKE_VERSION VERSION_GREATER_EQUAL 3.11)
git_branch (current_branch)
# default to tracking VK develop branch unless we are on master/release
if (NOT (current_branch STREQUAL "master" OR current_branch STREQUAL "release"))
set (current_branch "develop")
endif ()
message (STATUS "tracking ValidatorKeys branch: ${current_branch}")
FetchContent_Declare (
validator_keys_src
GIT_REPOSITORY https://github.com/ripple/validator-keys-tool.git
GIT_TAG "${current_branch}"
)
FetchContent_GetProperties (validator_keys_src)
if (NOT validator_keys_src_POPULATED)
message (STATUS "Pausing to download ValidatorKeys...")
FetchContent_Populate (validator_keys_src)
endif ()
add_subdirectory (${validator_keys_src_SOURCE_DIR} ${CMAKE_BINARY_DIR}/validator-keys)
endif ()

View File

@@ -0,0 +1,15 @@
#[===================================================================[
read version from source
#]===================================================================]
file (STRINGS src/ripple/protocol/impl/BuildInfo.cpp BUILD_INFO)
foreach (line_ ${BUILD_INFO})
if (line_ MATCHES "versionString[ ]*=[ ]*\"(.+)\"")
set (rippled_version ${CMAKE_MATCH_1})
endif ()
endforeach ()
if (rippled_version)
message (STATUS "rippled version: ${rippled_version}")
else ()
message (FATAL_ERROR "unable to determine rippled version")
endif ()
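The regex above pulls the quoted value out of the versionString definition in BuildInfo.cpp. A tiny self-contained illustration of the same match (the exact form of the source line is an assumption):

# the BuildInfo.cpp line is assumed to look roughly like:
#   char const* const versionString = "1.5.0";
set (example_line "char const* const versionString = \"1.5.0\";")
if (example_line MATCHES "versionString[ ]*=[ ]*\"(.+)\"")
  message (STATUS "captured version: ${CMAKE_MATCH_1}")  # -> captured version: 1.5.0
endif ()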

View File

@@ -0,0 +1,99 @@
#[===================================================================[
NIH dep: boost
#]===================================================================]
if ((NOT DEFINED BOOST_ROOT) AND (DEFINED ENV{BOOST_ROOT}))
set (BOOST_ROOT $ENV{BOOST_ROOT})
endif ()
file (TO_CMAKE_PATH "${BOOST_ROOT}" BOOST_ROOT)
if (WIN32 OR CYGWIN)
# Workaround for MSVC having two boost versions - x86 and x64 on same PC in stage folders
if (DEFINED BOOST_ROOT)
if (IS_DIRECTORY ${BOOST_ROOT}/stage64/lib)
set (BOOST_LIBRARYDIR ${BOOST_ROOT}/stage64/lib)
else ()
set (BOOST_LIBRARYDIR ${BOOST_ROOT}/stage/lib)
endif ()
endif ()
endif ()
message (STATUS "BOOST_ROOT: ${BOOST_ROOT}")
message (STATUS "BOOST_LIBRARYDIR: ${BOOST_LIBRARYDIR}")
# uncomment the following as needed to debug FindBoost issues:
#set (Boost_DEBUG ON)
#[=========================================================[
boost dynamic libraries don't trivially support @rpath
linking (cmake's default) right now, so we force
static linking on macos, and on linux when requested via the static flag
#]=========================================================]
if (static)
set (Boost_USE_STATIC_LIBS ON)
endif ()
set (Boost_USE_MULTITHREADED ON)
if (static AND NOT APPLE)
set (Boost_USE_STATIC_RUNTIME ON)
else ()
set (Boost_USE_STATIC_RUNTIME OFF)
endif ()
# TBD:
# Boost_USE_DEBUG_RUNTIME: When ON, uses Boost libraries linked against the debug C++ runtime (MSVC 'g' tag); defaults to ON.
find_package (Boost 1.70 REQUIRED
COMPONENTS
chrono
context
coroutine
date_time
filesystem
program_options
regex
serialization
system
thread)
add_library (ripple_boost INTERFACE)
add_library (Ripple::boost ALIAS ripple_boost)
if (is_xcode)
target_include_directories (ripple_boost BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
target_compile_options (ripple_boost INTERFACE --system-header-prefix="boost/")
else ()
target_include_directories (ripple_boost SYSTEM BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
endif()
target_link_libraries (ripple_boost
INTERFACE
Boost::boost
Boost::chrono
Boost::coroutine
Boost::date_time
Boost::filesystem
Boost::program_options
Boost::regex
Boost::serialization
Boost::system
Boost::thread)
if (Boost_COMPILER)
target_link_libraries (ripple_boost INTERFACE Boost::disable_autolinking)
endif ()
if (san AND is_clang)
# TODO: gcc does not support -fsanitize-blacklist...can we do something else
# for gcc ?
if (NOT Boost_INCLUDE_DIRS AND TARGET Boost::headers)
get_target_property (Boost_INCLUDE_DIRS Boost::headers INTERFACE_INCLUDE_DIRECTORIES)
endif ()
message(STATUS "Adding [${Boost_INCLUDE_DIRS}] to sanitizer blacklist")
file (WRITE ${CMAKE_CURRENT_BINARY_DIR}/san_bl.txt "src:${Boost_INCLUDE_DIRS}/*")
target_compile_options (opts
INTERFACE
# ignore boost headers for sanitizing
-fsanitize-blacklist=${CMAKE_CURRENT_BINARY_DIR}/san_bl.txt)
endif ()
# workaround for xcode 10.2 and boost < 1.69
# once we require Boost 1.69 or higher, this can be removed
# see: https://github.com/boostorg/asio/commit/43874d5
if (CMAKE_CXX_COMPILER_ID STREQUAL "AppleClang" AND
CMAKE_CXX_COMPILER_VERSION VERSION_GREATER_EQUAL 10.0.1.10010043 AND
Boost_VERSION LESS 106900)
target_compile_definitions (opts INTERFACE BOOST_ASIO_HAS_STD_STRING_VIEW)
endif ()
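The ripple_boost interface target exists so consumers pick up the include directories, link libraries, and the workarounds above through a single link statement; a hypothetical consumer (target name not part of this build) would just do:

target_link_libraries (my_component PUBLIC Ripple::boost)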

View File

@@ -0,0 +1,28 @@
#[===================================================================[
NIH dep: ed25519-donna
#]===================================================================]
add_library (ed25519-donna STATIC
src/ed25519-donna/ed25519.c)
target_include_directories (ed25519-donna
PUBLIC
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
$<INSTALL_INTERFACE:include>
PRIVATE
${CMAKE_CURRENT_SOURCE_DIR}/src/ed25519-donna)
#[=========================================================[
NOTE for macos:
https://github.com/floodyberry/ed25519-donna/issues/29
our source for ed25519-donna-portable.h has been
patched to workaround this.
#]=========================================================]
target_link_libraries (ed25519-donna PUBLIC OpenSSL::SSL)
add_library (NIH::ed25519-donna ALIAS ed25519-donna)
target_link_libraries (ripple_libs INTERFACE NIH::ed25519-donna)
#[===========================[
headers installation
#]===========================]
install (
FILES
src/ed25519-donna/ed25519.h
DESTINATION include/ed25519-donna)

View File

@@ -0,0 +1,22 @@
find_package (PkgConfig REQUIRED)
pkg_search_module (libarchive_PC QUIET libarchive>=3.3.3)
if(static)
set(LIBARCHIVE_LIB libarchive.a)
else()
set(LIBARCHIVE_LIB archive)
endif()
find_library (archive
NAMES ${LIBARCHIVE_LIB}
HINTS
${libarchive_PC_LIBDIR}
${libarchive_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (LIBARCHIVE_INCLUDE_DIR
NAMES archive.h
HINTS
${libarchive_PC_INCLUDEDIR}
${libarchive_PC_INCLUDE_DIRS}
NO_DEFAULT_PATH)

View File

@@ -0,0 +1,24 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (lz4_PC QUIET liblz4>=1.8)
endif ()
if(static)
set(LZ4_LIB liblz4.a)
else()
set(LZ4_LIB lz4.so)
endif()
find_library (lz4
NAMES ${LZ4_LIB}
HINTS
${lz4_PC_LIBDIR}
${lz4_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (LZ4_INCLUDE_DIR
NAMES lz4.h
HINTS
${lz4_PC_INCLUDEDIR}
${lz4_PC_INCLUDE_DIRS}
NO_DEFAULT_PATH)

View File

@@ -0,0 +1,24 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (secp256k1_PC QUIET libsecp256k1)
endif ()
if(static)
set(SECP256K1_LIB libsecp256k1.a)
else()
set(SECP256K1_LIB secp256k1)
endif()
find_library(secp256k1
NAMES ${SECP256K1_LIB}
HINTS
${secp256k1_PC_LIBDIR}
${secp256k1_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (SECP256K1_INCLUDE_DIR
NAMES secp256k1.h
HINTS
${secp256k1_PC_INCLUDEDIR}
${secp256k1_PC_INCLUDE_DIRS}
NO_DEFAULT_PATH)

View File

@@ -0,0 +1,24 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (snappy_PC QUIET snappy>=1.1.7)
endif ()
if(static)
set(SNAPPY_LIB libsnappy.a)
else()
set(SNAPPY_LIB libsnappy.so)
endif()
find_library (snappy
NAMES ${SNAPPY_LIB}
HINTS
${snappy_PC_LIBDIR}
${snappy_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (SNAPPY_INCLUDE_DIR
NAMES snappy.h
HINTS
${snappy_PC_INCLUDEDIR}
${snappy_PC_INCLUDE_DIRS}
NO_DEFAULT_PATH)

View File

@@ -0,0 +1,17 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
# TBD - currently no soci pkgconfig
#pkg_search_module (soci_PC QUIET libsoci_core>=3.2)
endif ()
if(static)
set(SOCI_LIB libsoci_core.a)
else()
set(SOCI_LIB libsoci_core.so)
endif()
find_library (soci
NAMES ${SOCI_LIB})
find_path (SOCI_INCLUDE_DIR
NAMES soci/soci.h)

View File

@@ -0,0 +1,24 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (sqlite_PC QUIET sqlite3>=3.26.0)
endif ()
if(static)
set(SQLITE_LIB libsqlite3.a)
else()
set(SQLITE_LIB sqlite3.so)
endif()
find_library (sqlite3
NAMES ${SQLITE_LIB}
HINTS
${sqlite_PC_LIBDIR}
${sqlite_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (SQLITE_INCLUDE_DIR
NAMES sqlite3.h
HINTS
${sqlite_PC_INCLUDEDIR}
${sqlite_PC_INCLUDE_DIRS}
NO_DEFAULT_PATH)

View File

@@ -0,0 +1,163 @@
#[===================================================================[
NIH dep: libarchive
#]===================================================================]
option (local_libarchive "use local build of libarchive." OFF)
add_library (archive_lib UNKNOWN IMPORTED GLOBAL)
if (NOT local_libarchive)
if (NOT WIN32)
find_package(libarchive_pc REQUIRED)
endif ()
if (archive)
message (STATUS "Found libarchive using pkg-config. Using ${archive}.")
set_target_properties (archive_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${archive}
IMPORTED_LOCATION_RELEASE
${archive}
INTERFACE_INCLUDE_DIRECTORIES
${LIBARCHIVE_INCLUDE_DIR})
# pkg-config can return extra info for static lib linking
# this is probably needed/useful generally, but we apply it
# only to APPLE for now (mostly for homebrew)
if (APPLE AND static AND libarchive_PC_STATIC_LIBRARIES)
message(STATUS "NOTE: libarchive static libs: ${libarchive_PC_STATIC_LIBRARIES}")
# also, APPLE seems to need iconv...maybe linux does too (TBD)
target_link_libraries (archive_lib
INTERFACE iconv ${libarchive_PC_STATIC_LIBRARIES})
endif ()
else ()
## now try searching using the minimal find module that cmake provides
find_package(LibArchive 3.3.3 QUIET)
if (LibArchive_FOUND)
if (static)
# find module doesn't find static libs currently, so we re-search
get_filename_component(_loc ${LibArchive_LIBRARY} DIRECTORY)
find_library(_la_static
NAMES libarchive.a archive_static.lib archive.lib
PATHS ${_loc})
if (_la_static)
set (_la_lib ${_la_static})
else ()
message (WARNING "unable to find libarchive static lib - switching to local build")
set (local_libarchive ON CACHE BOOL "" FORCE)
endif ()
else ()
set (_la_lib ${LibArchive_LIBRARY})
endif ()
if (NOT local_libarchive)
message (STATUS "Found libarchive using module/config. Using ${_la_lib}.")
set_target_properties (archive_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${_la_lib}
IMPORTED_LOCATION_RELEASE
${_la_lib}
INTERFACE_INCLUDE_DIRECTORIES
${LibArchive_INCLUDE_DIRS})
endif ()
else ()
set (local_libarchive ON CACHE BOOL "" FORCE)
endif ()
endif ()
endif()
if (local_libarchive)
set (lib_post "")
if (MSVC)
set (lib_post "_static")
endif ()
ExternalProject_Add (libarchive
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/libarchive/libarchive.git
GIT_TAG v3.3.3
CMAKE_ARGS
# passing the compiler seems to be needed for windows CI, sadly
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DENABLE_LZ4=ON
-ULZ4_*
-DLZ4_INCLUDE_DIR=$<JOIN:$<TARGET_PROPERTY:lz4_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
# because we are building a static lib, this lz4 library doesn't
# actually matter since you can't generally link static libs to other static
# libs. The include files are needed, but the library itself is not (until
# we link our application, at which point we use the lz4 we built above).
# nonetheless, we need to provide a library to libarchive else it will
# NOT include lz4 support when configuring
-DLZ4_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_RELEASE>>
-DENABLE_WERROR=OFF
-DENABLE_TAR=OFF
-DENABLE_TAR_SHARED=OFF
-DENABLE_INSTALL=ON
-DENABLE_NETTLE=OFF
-DENABLE_OPENSSL=OFF
-DENABLE_LZO=OFF
-DENABLE_LZMA=OFF
-DENABLE_ZLIB=OFF
-DENABLE_BZip2=OFF
-DENABLE_LIBXML2=OFF
-DENABLE_EXPAT=OFF
-DENABLE_PCREPOSIX=OFF
-DENABLE_LibGCC=OFF
-DENABLE_CNG=OFF
-DENABLE_CPIO=OFF
-DENABLE_CPIO_SHARED=OFF
-DENABLE_CAT=OFF
-DENABLE_CAT_SHARED=OFF
-DENABLE_XATTR=OFF
-DENABLE_ACL=OFF
-DENABLE_ICONV=OFF
-DENABLE_TEST=OFF
-DENABLE_COVERAGE=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LIST_SEPARATOR ::
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--target archive_static
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/libarchive/$<CONFIG>/${ep_lib_prefix}archive${lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/libarchive
>
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS lz4_lib
BUILD_BYPRODUCTS
<BINARY_DIR>/libarchive/${ep_lib_prefix}archive${lib_post}${ep_lib_suffix}
<BINARY_DIR>/libarchive/${ep_lib_prefix}archive${lib_post}_d${ep_lib_suffix}
)
ExternalProject_Get_Property (libarchive BINARY_DIR)
ExternalProject_Get_Property (libarchive SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (libarchive)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/libarchive)
set_target_properties (archive_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/libarchive/${ep_lib_prefix}archive${lib_post}_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/libarchive/${ep_lib_prefix}archive${lib_post}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/libarchive
INTERFACE_COMPILE_DEFINITIONS
LIBARCHIVE_STATIC)
endif()
add_dependencies (archive_lib libarchive)
target_link_libraries (archive_lib INTERFACE lz4_lib)
target_link_libraries (ripple_libs INTERFACE archive_lib)
exclude_if_included (libarchive)
exclude_if_included (archive_lib)

View File

@@ -0,0 +1,79 @@
#[===================================================================[
NIH dep: lz4
#]===================================================================]
add_library (lz4_lib STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(lz4)
endif()
if(lz4)
set_target_properties (lz4_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${lz4}
IMPORTED_LOCATION_RELEASE
${lz4}
INTERFACE_INCLUDE_DIRECTORIES
${LZ4_INCLUDE_DIR})
else()
ExternalProject_Add (lz4
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/lz4/lz4.git
GIT_TAG v1.8.2
SOURCE_SUBDIR contrib/cmake_unofficial
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DBUILD_STATIC_LIBS=ON
-DBUILD_SHARED_LIBS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--target lz4_static
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}lz4$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
TEST_COMMAND ""
INSTALL_COMMAND ""
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}lz4${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}lz4_d${ep_lib_suffix}
)
ExternalProject_Get_Property (lz4 BINARY_DIR)
ExternalProject_Get_Property (lz4 SOURCE_DIR)
file (MAKE_DIRECTORY ${SOURCE_DIR}/lz4)
set_target_properties (lz4_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}lz4_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}lz4${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/lib)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (lz4)
endif ()
endif()
add_dependencies (lz4_lib lz4)
target_link_libraries (ripple_libs INTERFACE lz4_lib)
exclude_if_included (lz4)
exclude_if_included (lz4_lib)

View File

@@ -0,0 +1,47 @@
#[===================================================================[
NIH dep: nudb
NuDB is header-only, thus is an INTERFACE lib in CMake.
TODO: move the library definition into NuDB repo and add
proper targets and export/install
#]===================================================================]
if (is_root_project) # NuDB not needed in the case of xrpl_core inclusion build
add_library (nudb INTERFACE)
if (CMAKE_VERSION VERSION_GREATER_EQUAL 3.11)
FetchContent_Declare(
nudb_src
GIT_REPOSITORY https://github.com/CPPAlliance/NuDB.git
GIT_TAG 2.0.1
)
FetchContent_GetProperties(nudb_src)
if(NOT nudb_src_POPULATED)
message (STATUS "Pausing to download NuDB...")
FetchContent_Populate(nudb_src)
endif()
else ()
ExternalProject_Add (nudb_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/CPPAlliance/NuDB.git
GIT_TAG 2.0.1
CONFIGURE_COMMAND ""
BUILD_COMMAND ""
TEST_COMMAND ""
INSTALL_COMMAND ""
)
ExternalProject_Get_Property (nudb_src SOURCE_DIR)
set (nudb_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${nudb_src_SOURCE_DIR}/include)
add_dependencies (nudb nudb_src)
endif ()
file(TO_CMAKE_PATH "${nudb_src_SOURCE_DIR}" nudb_src_SOURCE_DIR)
# specify as system includes so as to avoid warnings
target_include_directories (nudb SYSTEM INTERFACE ${nudb_src_SOURCE_DIR}/include)
target_link_libraries (nudb
INTERFACE
Boost::thread
Boost::system)
add_library (NIH::nudb ALIAS nudb)
target_link_libraries (ripple_libs INTERFACE NIH::nudb)
endif ()

View File

@@ -0,0 +1,48 @@
#[===================================================================[
NIH dep: openssl
#]===================================================================]
#[===============================================[
OPENSSL_ROOT_DIR is the only variable that
FindOpenSSL honors for locating, so convert any
OPENSSL_ROOT vars to this
#]===============================================]
if (NOT DEFINED OPENSSL_ROOT_DIR)
if (DEFINED ENV{OPENSSL_ROOT})
set (OPENSSL_ROOT_DIR $ENV{OPENSSL_ROOT})
elseif (HOMEBREW)
execute_process (COMMAND ${HOMEBREW} --prefix openssl
OUTPUT_VARIABLE OPENSSL_ROOT_DIR
OUTPUT_STRIP_TRAILING_WHITESPACE)
endif ()
file (TO_CMAKE_PATH "${OPENSSL_ROOT_DIR}" OPENSSL_ROOT_DIR)
endif ()
if (static)
set (OPENSSL_USE_STATIC_LIBS ON)
endif ()
set (OPENSSL_MSVC_STATIC_RT ON)
find_package (OpenSSL 1.0.2 REQUIRED)
target_link_libraries (ripple_libs
INTERFACE
OpenSSL::SSL
OpenSSL::Crypto)
# disable SSLv2...this can also be done when building/configuring OpenSSL
set_target_properties(OpenSSL::SSL PROPERTIES
INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2)
#[=========================================================[
https://gitlab.kitware.com/cmake/cmake/issues/16885
depending on how openssl is built, it might depend
on zlib. In fact, the openssl find package should
figure this out for us, but it does not currently...
so let's add zlib ourselves to the lib list
TODO: investigate linking to static zlib for static
build option
#]=========================================================]
find_package (ZLIB)
set (has_zlib FALSE)
if (TARGET ZLIB::ZLIB)
set_target_properties(OpenSSL::Crypto PROPERTIES
INTERFACE_LINK_LIBRARIES ZLIB::ZLIB)
set (has_zlib TRUE)
endif ()

View File

@@ -0,0 +1,143 @@
#[===================================================================[
import protobuf (lib and compiler) and create a lib
from our proto message definitions. If the system protobuf
is not found, fall back on an ExternalProject (EP) build to download and
build a version from the official source.
#]===================================================================]
if (static)
set (Protobuf_USE_STATIC_LIBS ON)
endif ()
find_package (Protobuf 3.8)
if (local_protobuf OR NOT Protobuf_FOUND)
include (GNUInstallDirs)
message (STATUS "using local protobuf build.")
if (WIN32)
# protobuf prepends lib even on windows
set (pbuf_lib_pre "lib")
else ()
set (pbuf_lib_pre ${ep_lib_prefix})
endif ()
# for the external project build of protobuf, we currently ignore the
# static option and always build static libs here. This is consistent
# with our other EP builds. Dynamic libs in an EP would add complexity
# because we'd need to get them into the runtime path, and probably
# install them.
ExternalProject_Add (protobuf_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/protocolbuffers/protobuf.git
GIT_TAG v3.8.0
SOURCE_SUBDIR cmake
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
-DCMAKE_INSTALL_PREFIX=<BINARY_DIR>/_installed_
-Dprotobuf_BUILD_TESTS=OFF
-Dprotobuf_BUILD_EXAMPLES=OFF
-Dprotobuf_BUILD_PROTOC_BINARIES=ON
-Dprotobuf_MSVC_STATIC_RUNTIME=ON
-DBUILD_SHARED_LIBS=OFF
-Dprotobuf_BUILD_SHARED_LIBS=OFF
-DCMAKE_DEBUG_POSTFIX=_d
-Dprotobuf_DEBUG_POSTFIX=_d
-Dprotobuf_WITH_ZLIB=$<IF:$<BOOL:${has_zlib}>,ON,OFF>
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E env --unset=DESTDIR ${CMAKE_COMMAND} --build . --config $<CONFIG> --target install
BUILD_BYPRODUCTS
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf${ep_lib_suffix}
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf_d${ep_lib_suffix}
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc${ep_lib_suffix}
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc_d${ep_lib_suffix}
<BINARY_DIR>/_installed_/bin/protoc${CMAKE_EXECUTABLE_SUFFIX}
)
ExternalProject_Get_Property (protobuf_src BINARY_DIR)
ExternalProject_Get_Property (protobuf_src SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (protobuf_src)
endif ()
exclude_if_included (protobuf_src)
if (NOT TARGET protobuf::libprotobuf)
add_library (protobuf::libprotobuf STATIC IMPORTED GLOBAL)
endif ()
file (MAKE_DIRECTORY ${BINARY_DIR}/_installed_/include)
set_target_properties (protobuf::libprotobuf PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (protobuf::libprotobuf protobuf_src)
exclude_if_included (protobuf::libprotobuf)
if (NOT TARGET protobuf::libprotoc)
add_library (protobuf::libprotoc STATIC IMPORTED GLOBAL)
endif ()
set_target_properties (protobuf::libprotoc PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (protobuf::libprotoc protobuf_src)
exclude_if_included (protobuf::libprotoc)
if (NOT TARGET protobuf::protoc)
add_executable (protobuf::protoc IMPORTED)
exclude_if_included (protobuf::protoc)
endif ()
set_target_properties (protobuf::protoc PROPERTIES
IMPORTED_LOCATION "${BINARY_DIR}/_installed_/bin/protoc${CMAKE_EXECUTABLE_SUFFIX}")
add_dependencies (protobuf::protoc protobuf_src)
else ()
if (NOT TARGET protobuf::protoc)
if (EXISTS "${Protobuf_PROTOC_EXECUTABLE}")
add_executable (protobuf::protoc IMPORTED)
set_target_properties (protobuf::protoc PROPERTIES
IMPORTED_LOCATION "${Protobuf_PROTOC_EXECUTABLE}")
else ()
message (FATAL_ERROR "Protobuf import failed")
endif ()
endif ()
endif ()
file (MAKE_DIRECTORY ${CMAKE_BINARY_DIR}/proto_gen)
set (save_CBD ${CMAKE_CURRENT_BINARY_DIR})
set (CMAKE_CURRENT_BINARY_DIR ${CMAKE_BINARY_DIR}/proto_gen)
protobuf_generate_cpp (
PROTO_SRCS
PROTO_HDRS
src/ripple/proto/ripple.proto)
set (CMAKE_CURRENT_BINARY_DIR ${save_CBD})
add_library (pbufs STATIC ${PROTO_SRCS} ${PROTO_HDRS})
target_include_directories (pbufs PRIVATE src)
target_include_directories (pbufs
SYSTEM PUBLIC ${CMAKE_BINARY_DIR}/proto_gen)
target_link_libraries (pbufs protobuf::libprotobuf)
target_compile_options (pbufs
PUBLIC
$<$<BOOL:${is_xcode}>:
--system-header-prefix="google/protobuf"
-Wno-deprecated-dynamic-exception-spec
>)
add_library (Ripple::pbufs ALIAS pbufs)
target_link_libraries (ripple_libs INTERFACE Ripple::pbufs)
exclude_if_included (pbufs)
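One detail worth noting: protobuf_generate_cpp emits its outputs relative to CMAKE_CURRENT_BINARY_DIR, which is why that variable is temporarily pointed at proto_gen and then restored. The generated sources are wrapped in the static pbufs library and exposed via the Ripple::pbufs alias, so a hypothetical consumer outside of ripple_libs (target name illustrative) would link it as:

target_link_libraries (my_rpc_handler PRIVATE Ripple::pbufs)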

View File

@@ -0,0 +1,173 @@
#[===================================================================[
NIH dep: rocksdb
#]===================================================================]
add_library (rocksdb_lib UNKNOWN IMPORTED GLOBAL)
set_target_properties (rocksdb_lib
PROPERTIES INTERFACE_COMPILE_DEFINITIONS RIPPLE_ROCKSDB_AVAILABLE=1)
option (local_rocksdb "use local build of rocksdb." OFF)
if (NOT local_rocksdb)
find_package (RocksDB 6.5 QUIET CONFIG)
if (TARGET RocksDB::rocksdb)
message (STATUS "Found RocksDB using config.")
get_target_property (_rockslib_l RocksDB::rocksdb IMPORTED_LOCATION_DEBUG)
if (_rockslib_l)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_DEBUG ${_rockslib_l})
endif ()
get_target_property (_rockslib_l RocksDB::rocksdb IMPORTED_LOCATION_RELEASE)
if (_rockslib_l)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_RELEASE ${_rockslib_l})
endif ()
get_target_property (_rockslib_l RocksDB::rocksdb IMPORTED_LOCATION)
if (_rockslib_l)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION ${_rockslib_l})
endif ()
get_target_property (_rockslib_i RocksDB::rocksdb INTERFACE_INCLUDE_DIRECTORIES)
if (_rockslib_i)
set_target_properties (rocksdb_lib PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${_rockslib_i})
endif ()
target_link_libraries (ripple_libs INTERFACE RocksDB::rocksdb)
else ()
# using a find module with rocksdb is difficult because
# you have no idea how it was configured (transitive dependencies).
# the code below will generally find rocksdb using the module, but
# will then result in linker errors for static linkage since the
# transitive dependencies are unknown. force local build here for now, but leave the code as
# a placeholder for future investigation.
if (static)
set (local_rocksdb ON CACHE BOOL "" FORCE)
# TBD if there is some way to extract transitive deps..then:
#set (RocksDB_USE_STATIC ON)
else ()
find_package (RocksDB 6.5 MODULE)
if (ROCKSDB_FOUND)
if (RocksDB_LIBRARY_DEBUG)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_DEBUG ${RocksDB_LIBRARY_DEBUG})
endif ()
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_RELEASE ${RocksDB_LIBRARIES})
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION ${RocksDB_LIBRARIES})
set_target_properties (rocksdb_lib PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${RocksDB_INCLUDE_DIRS})
else ()
set (local_rocksdb ON CACHE BOOL "" FORCE)
endif ()
endif ()
endif ()
endif ()
if (local_rocksdb)
message (STATUS "Using local build of RocksDB.")
ExternalProject_Add (rocksdb
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/facebook/rocksdb.git
GIT_TAG v6.5.3
PATCH_COMMAND
# only used by windows build
${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/Builds/CMake/rocks_thirdparty.inc
<SOURCE_DIR>/thirdparty.inc
COMMAND
# fixup their build version file to keep the values
# from changing on every build
${CMAKE_COMMAND} -E copy_if_different
${CMAKE_SOURCE_DIR}/Builds/CMake/rocksdb_build_version.cc.in
<SOURCE_DIR>/util/build_version.cc.in
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DBUILD_SHARED_LIBS=OFF
-DCMAKE_POSITION_INDEPENDENT_CODE=ON
-DWITH_JEMALLOC=$<IF:$<BOOL:${jemalloc}>,ON,OFF>
-DWITH_SNAPPY=ON
-DWITH_LZ4=ON
-DWITH_ZLIB=OFF
-DUSE_RTTI=ON
-DWITH_ZSTD=OFF
-DWITH_GFLAGS=OFF
-DWITH_BZ2=OFF
-ULZ4_*
-Ulz4_*
-Dlz4_INCLUDE_DIRS=$<JOIN:$<TARGET_PROPERTY:lz4_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
-Dlz4_LIBRARIES=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_RELEASE>>
-Dlz4_FOUND=ON
-USNAPPY_*
-Usnappy_*
-Dsnappy_INCLUDE_DIRS=$<JOIN:$<TARGET_PROPERTY:snappy_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
-Dsnappy_LIBRARIES=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_RELEASE>>
-Dsnappy_FOUND=ON
-DWITH_MD_LIBRARY=OFF
-DWITH_RUNTIME_DEBUG=$<IF:$<CONFIG:Debug>,ON,OFF>
-DFAIL_ON_WARNINGS=OFF
-DWITH_ASAN=OFF
-DWITH_TSAN=OFF
-DWITH_UBSAN=OFF
-DWITH_NUMA=OFF
-DWITH_TBB=OFF
-DWITH_WINDOWS_UTF8_FILENAMES=OFF
-DWITH_XPRESS=OFF
-DPORTABLE=ON
-DFORCE_SSE42=OFF
-DDISABLE_STALL_NOTIF=OFF
-DOPTDBG=ON
-DROCKSDB_LITE=OFF
-DWITH_FALLOCATE=ON
-DWITH_LIBRADOS=OFF
-DWITH_JNI=OFF
-DROCKSDB_INSTALL_ON_WINDOWS=OFF
-DWITH_TESTS=OFF
-DWITH_TOOLS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -MP /DNDEBUG"
>
$<$<NOT:$<BOOL:${MSVC}>>:
"-DCMAKE_CXX_FLAGS=-DNDEBUG"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}rocksdb$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
LIST_SEPARATOR ::
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS snappy_lib lz4_lib
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}rocksdb${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}rocksdb_d${ep_lib_suffix}
)
ExternalProject_Get_Property (rocksdb BINARY_DIR)
ExternalProject_Get_Property (rocksdb SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (rocksdb)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/include)
set_target_properties (rocksdb_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}rocksdb_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}rocksdb${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies (rocksdb_lib rocksdb)
exclude_if_included (rocksdb)
endif ()
target_link_libraries (rocksdb_lib
INTERFACE
snappy_lib
lz4_lib
$<$<BOOL:${MSVC}>:rpcrt4>)
exclude_if_included (rocksdb_lib)
target_link_libraries (ripple_libs INTERFACE rocksdb_lib)

View File

@@ -0,0 +1,58 @@
#[===================================================================[
NIH dep: secp256k1
#]===================================================================]
add_library (secp256k1_lib STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(secp256k1)
endif()
if(secp256k1)
set_target_properties (secp256k1_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${secp256k1}
IMPORTED_LOCATION_RELEASE
${secp256k1}
INTERFACE_INCLUDE_DIRECTORIES
${SECP256K1_INCLUDE_DIR})
add_library (secp256k1 ALIAS secp256k1_lib)
add_library (NIH::secp256k1 ALIAS secp256k1_lib)
else()
set(INSTALL_SECP256K1 true)
add_library (secp256k1 STATIC
src/secp256k1/src/secp256k1.c)
target_compile_definitions (secp256k1
PRIVATE
USE_NUM_NONE
USE_FIELD_10X26
USE_FIELD_INV_BUILTIN
USE_SCALAR_8X32
USE_SCALAR_INV_BUILTIN)
target_include_directories (secp256k1
PUBLIC
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
$<INSTALL_INTERFACE:include>
PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src/secp256k1)
target_compile_options (secp256k1
PRIVATE
$<$<BOOL:${MSVC}>:-wd4319>
$<$<NOT:$<BOOL:${MSVC}>>:
-Wno-deprecated-declarations
-Wno-unused-function
>
$<$<BOOL:${is_gcc}>:-Wno-nonnull-compare>)
target_link_libraries (ripple_libs INTERFACE NIH::secp256k1)
#[===========================[
headers installation
#]===========================]
install (
FILES
src/secp256k1/include/secp256k1.h
DESTINATION include/secp256k1/include)
add_library (NIH::secp256k1 ALIAS secp256k1)
endif()

View File

@@ -0,0 +1,77 @@
#[===================================================================[
NIH dep: snappy
#]===================================================================]
add_library (snappy_lib STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(snappy)
endif()
if(snappy)
set_target_properties (snappy_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${snappy}
IMPORTED_LOCATION_RELEASE
${snappy}
INTERFACE_INCLUDE_DIRECTORIES
${SNAPPY_INCLUDE_DIR})
else()
ExternalProject_Add (snappy
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/google/snappy.git
GIT_TAG 1.1.7
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DBUILD_SHARED_LIBS=OFF
-DCMAKE_POSITION_INDEPENDENT_CODE=ON
-DSNAPPY_BUILD_TESTS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
"-DCMAKE_CXX_FLAGS_DEBUG=-MTd"
"-DCMAKE_CXX_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}snappy$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E copy_if_different <BINARY_DIR>/config.h <BINARY_DIR>/snappy-stubs-public.h <SOURCE_DIR>
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}snappy${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}snappy_d${ep_lib_suffix}
)
ExternalProject_Get_Property (snappy BINARY_DIR)
ExternalProject_Get_Property (snappy SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (snappy)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/snappy)
set_target_properties (snappy_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}snappy_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}snappy${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR})
endif()
add_dependencies (snappy_lib snappy)
target_link_libraries (ripple_libs INTERFACE snappy_lib)
exclude_if_included (snappy)
exclude_if_included (snappy_lib)

View File

@@ -0,0 +1,164 @@
#[===================================================================[
NIH dep: soci
#]===================================================================]
foreach (_comp core empty sqlite3)
add_library ("soci_${_comp}" STATIC IMPORTED GLOBAL)
endforeach ()
if (NOT WIN32)
find_package(soci)
endif()
if (soci)
foreach (_comp core empty sqlite3)
set_target_properties ("soci_${_comp}" PROPERTIES
IMPORTED_LOCATION_DEBUG
${soci}
IMPORTED_LOCATION_RELEASE
${soci}
INTERFACE_INCLUDE_DIRECTORIES
${SOCI_INCLUDE_DIR})
endforeach ()
else()
set (soci_lib_pre ${ep_lib_prefix})
set (soci_lib_post "")
if (WIN32)
# for some reason soci on windows still prepends lib (non-standard)
set (soci_lib_pre lib)
# this version in the name might change if/when we change versions of soci
set (soci_lib_post "_4_0")
endif ()
get_target_property (_boost_incs Boost::date_time INTERFACE_INCLUDE_DIRECTORIES)
get_target_property (_boost_dt Boost::date_time IMPORTED_LOCATION)
if (NOT _boost_dt)
get_target_property (_boost_dt Boost::date_time IMPORTED_LOCATION_RELEASE)
endif ()
if (NOT _boost_dt)
get_target_property (_boost_dt Boost::date_time IMPORTED_LOCATION_DEBUG)
endif ()
ExternalProject_Add (soci
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/SOCI/soci.git
GIT_TAG 04e1870294918d20761736743bb6136314c42dd5
# We had an issue with soci integer range checking for boost::optional
# and needed to remove the exception that SOCI throws in this case.
# This is *probably* a bug in SOCI, but it has never been investigated further
# nor reported to the maintainers.
# This cmake script comments out the lines in question.
# This patch process is likely fragile and should be reviewed carefully
# whenever we update the GIT_TAG above.
PATCH_COMMAND
${CMAKE_COMMAND} -P ${CMAKE_SOURCE_DIR}/Builds/CMake/soci_patch.cmake
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${CMAKE_TOOLCHAIN_FILE}>:-DCMAKE_TOOLCHAIN_FILE=${CMAKE_TOOLCHAIN_FILE}>
$<$<BOOL:${VCPKG_TARGET_TRIPLET}>:-DVCPKG_TARGET_TRIPLET=${VCPKG_TARGET_TRIPLET}>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
-DCMAKE_PREFIX_PATH=${CMAKE_BINARY_DIR}/sqlite3
-DCMAKE_MODULE_PATH=${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake
-DCMAKE_INCLUDE_PATH=$<JOIN:$<TARGET_PROPERTY:sqlite,INTERFACE_INCLUDE_DIRECTORIES>,::>
-DCMAKE_LIBRARY_PATH=${sqlite_BINARY_DIR}
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DSOCI_CXX_C11=ON
-DSOCI_STATIC=ON
-DSOCI_LIBDIR=lib
-DSOCI_SHARED=OFF
-DSOCI_TESTS=OFF
# hacks to workaround the fact that soci doesn't currently use
# boost imported targets in its cmake. If they switch to
# proper imported targets, this next line can be removed
# (as well as the get_property above that sets _boost_incs)
-DBoost_INCLUDE_DIRS=$<JOIN:${_boost_incs},::>
-DBoost_INCLUDE_DIR=$<JOIN:${_boost_incs},::>
-DBOOST_ROOT=${BOOST_ROOT}
-DWITH_BOOST=ON
-DBoost_FOUND=ON
-DBoost_NO_BOOST_CMAKE=ON
-DBoost_DATE_TIME_FOUND=ON
-DSOCI_HAVE_BOOST=ON
-DSOCI_HAVE_BOOST_DATE_TIME=ON
-DBoost_DATE_TIME_LIBRARY=${_boost_dt}
-DSOCI_DB2=OFF
-DSOCI_FIREBIRD=OFF
-DSOCI_MYSQL=OFF
-DSOCI_ODBC=OFF
-DSOCI_ORACLE=OFF
-DSOCI_POSTGRESQL=OFF
-DSOCI_SQLITE3=ON
-DSQLITE3_INCLUDE_DIR=$<JOIN:$<TARGET_PROPERTY:sqlite,INTERFACE_INCLUDE_DIRECTORIES>,::>
-DSQLITE3_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:sqlite,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:sqlite,IMPORTED_LOCATION_RELEASE>>
$<$<BOOL:${APPLE}>:-DCMAKE_FIND_FRAMEWORK=LAST>
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
"-DCMAKE_CXX_FLAGS_DEBUG=-MTd"
"-DCMAKE_CXX_FLAGS_RELEASE=-MT"
>
$<$<NOT:$<BOOL:${MSVC}>>:
"-DCMAKE_CXX_FLAGS=-Wno-deprecated-declarations"
>
# SEE: https://github.com/SOCI/soci/issues/640
$<$<AND:$<BOOL:${is_gcc}>,$<VERSION_GREATER_EQUAL:${CMAKE_CXX_COMPILER_VERSION},8>>:
"-DCMAKE_CXX_FLAGS=-Wno-deprecated-declarations -Wno-error=format-overflow -Wno-format-overflow -Wno-error=format-truncation"
>
LIST_SEPARATOR ::
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/lib/$<CONFIG>/${soci_lib_pre}soci_core${soci_lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/lib/$<CONFIG>/${soci_lib_pre}soci_empty${soci_lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/lib/$<CONFIG>/${soci_lib_pre}soci_sqlite3${soci_lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/lib
>
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS sqlite
BUILD_BYPRODUCTS
<BINARY_DIR>/lib/${soci_lib_pre}soci_core${soci_lib_post}${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_core${soci_lib_post}_d${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_empty${soci_lib_post}${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_empty${soci_lib_post}_d${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_sqlite3${soci_lib_post}${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_sqlite3${soci_lib_post}_d${ep_lib_suffix}
)
ExternalProject_Get_Property (soci BINARY_DIR)
ExternalProject_Get_Property (soci SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (soci)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/include)
file (MAKE_DIRECTORY ${BINARY_DIR}/include)
foreach (_comp core empty sqlite3)
set_target_properties ("soci_${_comp}" PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/lib/${soci_lib_pre}soci_${_comp}${soci_lib_post}_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/lib/${soci_lib_pre}soci_${_comp}${soci_lib_post}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
"${SOURCE_DIR}/include;${BINARY_DIR}/include")
add_dependencies ("soci_${_comp}" soci) # something has to depend on the ExternalProject to trigger it
target_link_libraries (ripple_libs INTERFACE "soci_${_comp}")
if (NOT _comp STREQUAL "core")
target_link_libraries ("soci_${_comp}" INTERFACE soci_core)
endif ()
endforeach ()
endif()
foreach (_comp core empty sqlite3)
exclude_if_included ("soci_${_comp}")
endforeach ()
exclude_if_included (soci)

View File

@@ -0,0 +1,87 @@
#[===================================================================[
NIH dep: sqlite
#]===================================================================]
add_library (sqlite STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(sqlite)
endif()
if(sqlite3)
set_target_properties (sqlite PROPERTIES
IMPORTED_LOCATION_DEBUG
${sqlite3}
IMPORTED_LOCATION_RELEASE
${sqlite3}
INTERFACE_INCLUDE_DIRECTORIES
${SQLITE_INCLUDE_DIR})
else()
ExternalProject_Add (sqlite3
PREFIX ${nih_cache_path}
# sqlite doesn't use git, but it provides versioned tarballs
URL https://www.sqlite.org/2018/sqlite-amalgamation-3260000.zip
# ^^^ version is apparent in the URL: 3260000 => 3.26.0
URL_HASH SHA256=de5dcab133aa339a4cf9e97c40aa6062570086d6085d8f9ad7bc6ddf8a52096e
# we wrote a very simple CMake file to build sqlite
# and copy it into the source tree here so that we can build with
# CMake. sqlite doesn't generally provide a build system
# for the single amalgamation source file.
PATCH_COMMAND
${CMAKE_COMMAND} -E copy_if_different
${CMAKE_SOURCE_DIR}/Builds/CMake/CMake_sqlite3.txt
<SOURCE_DIR>/CMakeLists.txt
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}sqlite3$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
TEST_COMMAND ""
INSTALL_COMMAND ""
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}sqlite3${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}sqlite3_d${ep_lib_suffix}
)
ExternalProject_Get_Property (sqlite3 BINARY_DIR)
ExternalProject_Get_Property (sqlite3 SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (sqlite3)
endif ()
set_target_properties (sqlite PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}sqlite3_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}sqlite3${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR})
add_dependencies (sqlite sqlite3)
exclude_if_included (sqlite3)
endif()
target_link_libraries (sqlite INTERFACE $<$<NOT:$<BOOL:${MSVC}>>:dl>)
target_link_libraries (ripple_libs INTERFACE sqlite)
exclude_if_included (sqlite)
set(sqlite_BINARY_DIR ${BINARY_DIR})
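The PATCH_COMMAND above drops a project-provided CMakeLists.txt into the amalgamation source tree so the single-file distribution can be built with CMake. That file is not part of this diff; a minimal sketch of what such a wrapper might contain (the real Builds/CMake/CMake_sqlite3.txt may differ) is:

cmake_minimum_required (VERSION 3.9)
project (sqlite3 C)
add_library (sqlite3 STATIC sqlite3.c)
# hypothetical defaults; the real file may choose different compile options
target_compile_definitions (sqlite3 PRIVATE SQLITE_THREADSAFE=1)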

View File

@@ -0,0 +1,49 @@
#[===================================================================[
NIH dep: date
the main library is header-only, thus is an INTERFACE lib in CMake.
NOTE: this has been accepted into c++20 so can likely be replaced
when we update to that standard
#]===================================================================]
find_package (date QUIET)
if (NOT TARGET date::date)
if (CMAKE_VERSION VERSION_GREATER_EQUAL 3.14)
FetchContent_Declare(
hh_date_src
GIT_REPOSITORY https://github.com/HowardHinnant/date.git
GIT_TAG fc4cf092f9674f2670fb9177edcdee870399b829
)
FetchContent_MakeAvailable(hh_date_src)
else ()
ExternalProject_Add (hh_date_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/HowardHinnant/date.git
GIT_TAG fc4cf092f9674f2670fb9177edcdee870399b829
CONFIGURE_COMMAND ""
BUILD_COMMAND ""
TEST_COMMAND ""
INSTALL_COMMAND ""
)
ExternalProject_Get_Property (hh_date_src SOURCE_DIR)
set (hh_date_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${hh_date_src_SOURCE_DIR}/include)
add_library (date_interface INTERFACE)
add_library (date::date ALIAS date_interface)
add_dependencies (date_interface hh_date_src)
file (TO_CMAKE_PATH "${hh_date_src_SOURCE_DIR}" hh_date_src_SOURCE_DIR)
target_include_directories (date_interface
SYSTEM INTERFACE
$<BUILD_INTERFACE:${hh_date_src_SOURCE_DIR}/include>
$<INSTALL_INTERFACE:include>)
install (
FILES
${hh_date_src_SOURCE_DIR}/include/date/date.h
DESTINATION include/date)
install (TARGETS date_interface
EXPORT RippleExports
INCLUDES DESTINATION include)
endif ()
endif ()

View File

@@ -0,0 +1,360 @@
# currently linking to unsecure versions...if we switch, we'll
# need to add ssl as a link dependency to the grpc targets
option (use_secure_grpc "use TLS version of grpc libs." OFF)
if (use_secure_grpc)
set (grpc_suffix "")
else ()
set (grpc_suffix "_unsecure")
endif ()
find_package (gRPC 1.23 CONFIG QUIET)
if (TARGET gRPC::gpr AND NOT local_grpc)
get_target_property (_grpc_l gRPC::gpr IMPORTED_LOCATION_DEBUG)
if (NOT _grpc_l)
get_target_property (_grpc_l gRPC::gpr IMPORTED_LOCATION_RELEASE)
endif ()
if (NOT _grpc_l)
get_target_property (_grpc_l gRPC::gpr IMPORTED_LOCATION)
endif ()
message (STATUS "Found cmake config for gRPC. Using ${_grpc_l}.")
else ()
find_package (PkgConfig QUIET)
if (PKG_CONFIG_FOUND)
pkg_check_modules (grpc QUIET "grpc${grpc_suffix}>=1.25" "grpc++${grpc_suffix}" gpr)
endif ()
if (grpc_FOUND)
message (STATUS "Found gRPC using pkg-config. Using ${grpc_gpr_PREFIX}.")
endif ()
add_executable (gRPC::grpc_cpp_plugin IMPORTED)
exclude_if_included (gRPC::grpc_cpp_plugin)
if (grpc_FOUND AND NOT local_grpc)
# use installed grpc (via pkg-config)
macro (add_imported_grpc libname_)
if (static)
set (_search "${CMAKE_STATIC_LIBRARY_PREFIX}${libname_}${CMAKE_STATIC_LIBRARY_SUFFIX}")
else ()
set (_search "${CMAKE_SHARED_LIBRARY_PREFIX}${libname_}${CMAKE_SHARED_LIBRARY_SUFFIX}")
endif()
find_library(_found_${libname_}
NAMES ${_search}
HINTS ${grpc_LIBRARY_DIRS})
if (_found_${libname_})
message (STATUS "importing ${libname_} as ${_found_${libname_}}")
else ()
message (FATAL_ERROR "using pkg-config for grpc, can't find ${_search}")
endif ()
add_library ("gRPC::${libname_}" STATIC IMPORTED GLOBAL)
set_target_properties ("gRPC::${libname_}" PROPERTIES IMPORTED_LOCATION ${_found_${libname_}})
if (grpc_INCLUDE_DIRS)
set_target_properties ("gRPC::${libname_}" PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${grpc_INCLUDE_DIRS})
endif ()
target_link_libraries (ripple_libs INTERFACE "gRPC::${libname_}")
exclude_if_included ("gRPC::${libname_}")
endmacro ()
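Invocations of this macro would look like the following (illustrative only; the actual call sites are not shown in this excerpt and simply follow the grpc_suffix convention above):

# add_imported_grpc ("grpc${grpc_suffix}")
# add_imported_grpc ("grpc++${grpc_suffix}")
# add_imported_grpc (gpr)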
set_target_properties (gRPC::grpc_cpp_plugin PROPERTIES
IMPORTED_LOCATION "${grpc_gpr_PREFIX}/bin/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}")
pkg_check_modules (cares QUIET libcares)
if (cares_FOUND)
if (static)
set (_search "${CMAKE_STATIC_LIBRARY_PREFIX}cares${CMAKE_STATIC_LIBRARY_SUFFIX}")
else ()
set (_search "${CMAKE_SHARED_LIBRARY_PREFIX}cares${CMAKE_SHARED_LIBRARY_SUFFIX}")
endif()
find_library(_cares
NAMES ${_search}
HINTS ${cares_LIBRARY_DIRS})
if (NOT _cares)
message (FATAL_ERROR "using pkg-config for grpc, can't find c-ares")
endif ()
add_library (c-ares::cares STATIC IMPORTED GLOBAL)
set_target_properties (c-ares::cares PROPERTIES IMPORTED_LOCATION ${_cares})
if (cares_INCLUDE_DIRS)
set_target_properties (c-ares::cares PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${cares_INCLUDE_DIRS})
endif ()
exclude_if_included (c-ares::cares)
else ()
message (FATAL_ERROR "using pkg-config for grpc, can't find c-ares")
endif ()
else ()
#[===========================[
c-ares (required by grpc)
#]===========================]
ExternalProject_Add (c-ares_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/c-ares/c-ares.git
GIT_TAG cares-1_15_0
CMAKE_ARGS
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DCMAKE_INSTALL_PREFIX=<BINARY_DIR>/_installed_
-DCARES_SHARED=OFF
-DCARES_STATIC=ON
-DCARES_STATIC_PIC=ON
-DCARES_INSTALL=ON
-DCARES_MSVC_STATIC_RUNTIME=ON
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E env --unset=DESTDIR ${CMAKE_COMMAND} --build . --config $<CONFIG> --target install
BUILD_BYPRODUCTS
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}cares${ep_lib_suffix}
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}cares_d${ep_lib_suffix}
)
exclude_if_included (c-ares_src)
ExternalProject_Get_Property (c-ares_src BINARY_DIR)
set (cares_binary_dir "${BINARY_DIR}")
add_library (c-ares::cares STATIC IMPORTED GLOBAL)
file (MAKE_DIRECTORY ${BINARY_DIR}/_installed_/include)
set_target_properties (c-ares::cares PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}cares_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}cares${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (c-ares::cares c-ares_src)
exclude_if_included (c-ares::cares)
if (NOT has_zlib)
#[===========================[
zlib (required by grpc)
#]===========================]
if (MSVC)
set (zlib_debug_postfix "d") # zlib cmake sets this internally for MSVC, so we really don't have a choice
set (zlib_base "zlibstatic")
else ()
set (zlib_debug_postfix "_d")
set (zlib_base "z")
endif ()
ExternalProject_Add (zlib_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/madler/zlib.git
GIT_TAG v1.2.11
CMAKE_ARGS
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=${zlib_debug_postfix}
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DCMAKE_INSTALL_PREFIX=<BINARY_DIR>/_installed_
-DBUILD_SHARED_LIBS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E env --unset=DESTDIR ${CMAKE_COMMAND} --build . --config $<CONFIG> --target install
BUILD_BYPRODUCTS
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}${zlib_base}${ep_lib_suffix}
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}${zlib_base}${zlib_debug_postfix}${ep_lib_suffix}
)
exclude_if_included (zlib_src)
ExternalProject_Get_Property (zlib_src BINARY_DIR)
set (zlib_binary_dir "${BINARY_DIR}")
add_library (ZLIB::ZLIB STATIC IMPORTED GLOBAL)
file (MAKE_DIRECTORY ${BINARY_DIR}/_installed_/include)
set_target_properties (ZLIB::ZLIB PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}${zlib_base}${zlib_debug_postfix}${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}${zlib_base}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (ZLIB::ZLIB zlib_src)
exclude_if_included (ZLIB::ZLIB)
endif ()
#[===========================[
grpc
#]===========================]
ExternalProject_Add (grpc_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/grpc/grpc.git
GIT_TAG v1.25.0
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${CMAKE_TOOLCHAIN_FILE}>:-DCMAKE_TOOLCHAIN_FILE=${CMAKE_TOOLCHAIN_FILE}>
$<$<BOOL:${VCPKG_TARGET_TRIPLET}>:-DVCPKG_TARGET_TRIPLET=${VCPKG_TARGET_TRIPLET}>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DgRPC_BUILD_TESTS=OFF
-DgRPC_BUILD_CSHARP_EXT=OFF
-DgRPC_MSVC_STATIC_RUNTIME=ON
-DgRPC_INSTALL=OFF
-DgRPC_CARES_PROVIDER=package
-Dc-ares_DIR=${cares_binary_dir}/_installed_/lib/cmake/c-ares
-DgRPC_SSL_PROVIDER=package
-DOPENSSL_ROOT_DIR=${OPENSSL_ROOT_DIR}
-DgRPC_PROTOBUF_PROVIDER=package
-DProtobuf_USE_STATIC_LIBS=$<IF:$<AND:$<BOOL:${Protobuf_FOUND}>,$<NOT:$<BOOL:${static}>>>,OFF,ON>
-DProtobuf_INCLUDE_DIR=$<JOIN:$<TARGET_PROPERTY:protobuf::libprotobuf,INTERFACE_INCLUDE_DIRECTORIES>,:_:>
-DProtobuf_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:protobuf::libprotobuf,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:protobuf::libprotobuf,IMPORTED_LOCATION_RELEASE>>
-DProtobuf_PROTOC_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:protobuf::libprotoc,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:protobuf::libprotoc,IMPORTED_LOCATION_RELEASE>>
-DProtobuf_PROTOC_EXECUTABLE=$<TARGET_PROPERTY:protobuf::protoc,IMPORTED_LOCATION>
-DgRPC_ZLIB_PROVIDER=package
$<$<NOT:$<BOOL:${has_zlib}>>:-DZLIB_ROOT=${zlib_binary_dir}/_installed_>
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
$<$<VERSION_GREATER_EQUAL:${CMAKE_VERSION},3.12>:--parallel ${ep_procs}>
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}grpc${grpc_suffix}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}grpc++${grpc_suffix}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}address_sorting$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}gpr$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}
<BINARY_DIR>
>
LIST_SEPARATOR :_:
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS c-ares_src
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}grpc${grpc_suffix}${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}grpc${grpc_suffix}_d${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}grpc++${grpc_suffix}${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}grpc++${grpc_suffix}_d${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}address_sorting${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}address_sorting_d${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}gpr${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}gpr_d${ep_lib_suffix}
<BINARY_DIR>/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}
)
if (TARGET protobuf_src)
ExternalProject_Add_StepDependencies(grpc_src build protobuf_src)
endif ()
exclude_if_included (grpc_src)
ExternalProject_Get_Property (grpc_src BINARY_DIR)
ExternalProject_Get_Property (grpc_src SOURCE_DIR)
set (grpc_binary_dir "${BINARY_DIR}")
set (grpc_source_dir "${SOURCE_DIR}")
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (grpc_src)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/include)
macro (add_imported_grpc libname_)
add_library ("gRPC::${libname_}" STATIC IMPORTED GLOBAL)
set_target_properties ("gRPC::${libname_}" PROPERTIES
IMPORTED_LOCATION_DEBUG
${grpc_binary_dir}/${ep_lib_prefix}${libname_}_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${grpc_binary_dir}/${ep_lib_prefix}${libname_}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${grpc_source_dir}/include)
add_dependencies ("gRPC::${libname_}" grpc_src)
target_link_libraries (ripple_libs INTERFACE "gRPC::${libname_}")
exclude_if_included ("gRPC::${libname_}")
endmacro ()
set_target_properties (gRPC::grpc_cpp_plugin PROPERTIES
IMPORTED_LOCATION "${grpc_binary_dir}/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}")
add_dependencies (gRPC::grpc_cpp_plugin grpc_src)
endif ()
add_imported_grpc (gpr)
add_imported_grpc ("grpc${grpc_suffix}")
add_imported_grpc ("grpc++${grpc_suffix}")
add_imported_grpc (address_sorting)
target_link_libraries ("gRPC::grpc${grpc_suffix}" INTERFACE c-ares::cares gRPC::gpr gRPC::address_sorting ZLIB::ZLIB)
target_link_libraries ("gRPC::grpc++${grpc_suffix}" INTERFACE "gRPC::grpc${grpc_suffix}" gRPC::gpr)
endif ()
#[=================================[
generate protobuf sources for
grpc defs and bundle into a
static lib
#]=================================]
set (GRPC_GEN_DIR "${CMAKE_BINARY_DIR}/proto_gen_grpc")
file (MAKE_DIRECTORY ${GRPC_GEN_DIR})
set (GRPC_PROTO_SRCS)
set (GRPC_PROTO_HDRS)
set (GRPC_PROTO_ROOT "${CMAKE_SOURCE_DIR}/src/ripple/proto/org")
file(GLOB_RECURSE GRPC_DEFINITION_FILES LIST_DIRECTORIES false "${GRPC_PROTO_ROOT}/*.proto")
foreach(file ${GRPC_DEFINITION_FILES})
get_filename_component(_abs_file ${file} ABSOLUTE)
get_filename_component(_abs_dir ${_abs_file} DIRECTORY)
get_filename_component(_basename ${file} NAME_WE)
get_filename_component(_proto_inc ${GRPC_PROTO_ROOT} DIRECTORY) # updir one level
file(RELATIVE_PATH _rel_root_file ${_proto_inc} ${_abs_file})
get_filename_component(_rel_root_dir ${_rel_root_file} DIRECTORY)
file(RELATIVE_PATH _rel_dir ${CMAKE_CURRENT_SOURCE_DIR} ${_abs_dir})
set (src_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.cc")
set (src_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.cc")
set (hdr_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.h")
set (hdr_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.h")
add_custom_command(
OUTPUT ${src_1} ${src_2} ${hdr_1} ${hdr_2}
COMMAND protobuf::protoc
ARGS --grpc_out=${GRPC_GEN_DIR}
--cpp_out=${GRPC_GEN_DIR}
--plugin=protoc-gen-grpc=$<TARGET_FILE:gRPC::grpc_cpp_plugin>
-I ${_proto_inc} -I ${_rel_dir}
${_abs_file}
DEPENDS ${_abs_file} protobuf::protoc gRPC::grpc_cpp_plugin
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Running gRPC C++ protocol buffer compiler on ${file}"
VERBATIM)
set_source_files_properties(${src_1} ${src_2} ${hdr_1} ${hdr_2} PROPERTIES GENERATED TRUE)
list(APPEND GRPC_PROTO_SRCS ${src_1} ${src_2})
list(APPEND GRPC_PROTO_HDRS ${hdr_1} ${hdr_2})
endforeach()
add_library (grpc_pbufs STATIC ${GRPC_PROTO_SRCS} ${GRPC_PROTO_HDRS})
#target_include_directories (grpc_pbufs PRIVATE src)
target_include_directories (grpc_pbufs SYSTEM PUBLIC ${GRPC_GEN_DIR})
target_link_libraries (grpc_pbufs protobuf::libprotobuf "gRPC::grpc++${grpc_suffix}")
target_compile_options (grpc_pbufs
PRIVATE
$<$<BOOL:${MSVC}>:-wd4065>
$<$<NOT:$<BOOL:${MSVC}>>:-Wno-deprecated-declarations>
PUBLIC
$<$<BOOL:${MSVC}>:-wd4996>
$<$<BOOL:${is_xcode}>:
--system-header-prefix="google/protobuf"
-Wno-deprecated-dynamic-exception-spec
>)
add_library (Ripple::grpc_pbufs ALIAS grpc_pbufs)
target_link_libraries (ripple_libs INTERFACE Ripple::grpc_pbufs)
exclude_if_included (grpc_pbufs)
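For reference, each add_custom_command above boils down to one protoc invocation per definition file, and the generated sources mirror the definition's path relative to the include root. A simplified sketch with a hypothetical .proto path (plugin location is a placeholder):

protoc --grpc_out=build/proto_gen_grpc --cpp_out=build/proto_gen_grpc \
  --plugin=protoc-gen-grpc=/path/to/grpc_cpp_plugin \
  -I src/ripple/proto src/ripple/proto/org/example/v1/example.proto
# -> build/proto_gen_grpc/org/example/v1/example.pb.{h,cc} and
#    example.grpc.pb.{h,cc}, which are compiled into the grpc_pbufs static lib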

View File

@@ -5,11 +5,13 @@
IN_FILE variable.
#]=========================================================]
file (READ ${IN_FILE} contents)
## only print files that actually have some text in them
if (contents MATCHES "[a-z0-9A-Z]+")
execute_process(
COMMAND
${CMAKE_COMMAND} -E echo "${contents}")
if (EXISTS ${IN_FILE})
file (READ ${IN_FILE} contents)
## only print files that actually have some text in them
if (contents MATCHES "[a-z0-9A-Z]+")
execute_process(
COMMAND
${CMAKE_COMMAND} -E echo "${contents}")
endif ()
endif ()

View File

@@ -1,16 +1,15 @@
set (THIRDPARTY_LIBS "")
if(WITH_SNAPPY)
find_package(snappy REQUIRED)
add_definitions(-DSNAPPY)
include_directories(${SNAPPY_INCLUDE_DIR})
list(APPEND THIRDPARTY_LIBS ${SNAPPY_LIBRARIES})
include_directories(${snappy_INCLUDE_DIRS})
set (THIRDPARTY_LIBS ${THIRDPARTY_LIBS} ${snappy_LIBRARIES})
endif()
if(WITH_LZ4)
find_package(lz4 REQUIRED)
add_definitions(-DLZ4)
include_directories(${LZ4_INCLUDE_DIR})
list(APPEND THIRDPARTY_LIBS ${LZ4_LIBRARIES})
include_directories(${lz4_INCLUDE_DIRS})
set (THIRDPARTY_LIBS ${THIRDPARTY_LIBS} ${lz4_LIBRARIES})
endif()

View File

@@ -10,4 +10,13 @@ foreach (line_ ${sourcecode})
endforeach ()
file (RENAME include/soci/unsigned-types.h include/soci/unsigned-types.h.orig)
file (RENAME include/soci/unsigned-types.h.patched include/soci/unsigned-types.h)
# also fix Boost.cmake so that it just returns when we override the Boost_FOUND var
file (APPEND cmake/dependencies/Boost.cmake.patched "if (Boost_FOUND)\n")
file (APPEND cmake/dependencies/Boost.cmake.patched " return ()\n")
file (APPEND cmake/dependencies/Boost.cmake.patched "endif ()\n")
file (STRINGS cmake/dependencies/Boost.cmake sourcecode)
foreach (line_ ${sourcecode})
file (APPEND cmake/dependencies/Boost.cmake.patched "${line_}\n")
endforeach ()
file (RENAME cmake/dependencies/Boost.cmake.patched cmake/dependencies/Boost.cmake)

View File

@@ -272,9 +272,17 @@ def run_cmake(directory, cmake_dir, args):
args += ( '-DCMAKE_BUILD_TYPE=Debug', )
if re.search('release', cmake_dir):
args += ( '-DCMAKE_BUILD_TYPE=Release', )
if re.search('gcc', cmake_dir):
m = re.search('gcc(-[^.]*)', cmake_dir)
if m:
args += ( '-DCMAKE_C_COMPILER=' + m.group(0),
'-DCMAKE_CXX_COMPILER=g++' + m.group(1), )
elif re.search('gcc', cmake_dir):
args += ( '-DCMAKE_C_COMPILER=gcc', '-DCMAKE_CXX_COMPILER=g++', )
if re.search('clang', cmake_dir):
m = re.search('clang(-[^.]*)', cmake_dir)
if m:
args += ( '-DCMAKE_C_COMPILER=' + m.group(0),
'-DCMAKE_CXX_COMPILER=clang++' + m.group(1), )
elif re.search('clang', cmake_dir):
args += ( '-DCMAKE_C_COMPILER=clang', '-DCMAKE_CXX_COMPILER=clang++', )
args += ( os.path.join('..', '..', '..'), )

View File

@@ -8,28 +8,32 @@ COPY centos-builder/centos_setup.sh /tmp/
COPY shared/build_deps.sh /tmp/
COPY shared/install_cmake.sh /tmp/
COPY centos-builder/extras.sh /tmp/
COPY shared/install_boost.sh /tmp/
RUN chmod +x /tmp/centos_setup.sh && \
chmod +x /tmp/build_deps.sh && \
chmod +x /tmp/install_boost.sh && \
chmod +x /tmp/install_cmake.sh && \
chmod +x /tmp/extras.sh
RUN /tmp/centos_setup.sh
RUN /tmp/install_cmake.sh
RUN /tmp/install_cmake.sh 3.16.1 /opt/local/cmake-3.16
RUN ln -s /opt/local/cmake-3.16 /opt/local/cmake
ENV PATH="/opt/local/cmake/bin:$PATH"
RUN source scl_source enable devtoolset-6 python27 && \
# also install min supported cmake for testing
RUN if [ "${CI_USE}" = true ] ; then /tmp/install_cmake.sh 3.9.0 /opt/local/cmake-3.9; fi
RUN source scl_source enable devtoolset-7 python27 && \
/tmp/build_deps.sh
ENV BOOST_ROOT="/opt/local/boost/_INSTALLED_"
ENV PLANTUML_JAR="/opt/plantuml/plantuml.jar"
ENV BOOST_ROOT="/opt/local/boost"
ENV OPENSSL_ROOT="/opt/local/openssl"
ENV GDB_ROOT="/opt/local/gdb"
RUN source scl_source enable devtoolset-6 python27 && \
RUN source scl_source enable devtoolset-7 python27 && \
/tmp/extras.sh
# prep files for package building
RUN mkdir -m 777 -p /opt/rippled_bld/pkg
WORKDIR /opt/rippled_bld/pkg
COPY packaging/rpm/rippled.spec ./
COPY shared/update_sources.sh ./
RUN mkdir -m 777 ./rpmbuild
RUN mkdir -m 777 ./rpmbuild/{BUILD,RPMS,SOURCES,SPECS,SRPMS}

View File

@@ -7,16 +7,12 @@ yum -y upgrade
yum -y update
yum -y install epel-release centos-release-scl
yum -y install \
wget curl time gcc-c++ time yum-utils \
wget curl time gcc-c++ time yum-utils autoconf automake pkgconfig libtool \
libstdc++-static rpm-build gnupg which make cmake \
devtoolset-4 devtoolset-4-gdb devtoolset-4-libasan-devel devtoolset-4-libtsan-devel devtoolset-4-libubsan-devel \
devtoolset-6 devtoolset-6-gdb devtoolset-6-libasan-devel devtoolset-6-libtsan-devel devtoolset-6-libubsan-devel \
devtoolset-7 devtoolset-7-gdb devtoolset-7-libasan-devel devtoolset-7-libtsan-devel devtoolset-7-libubsan-devel \
llvm-toolset-7 llvm-toolset-7-runtime llvm-toolset-7-build llvm-toolset-7-clang \
llvm-toolset-7-clang-analyzer llvm-toolset-7-clang-devel llvm-toolset-7-clang-libs \
llvm-toolset-7-clang-tools-extra llvm-toolset-7-compiler-rt llvm-toolset-7-lldb \
llvm-toolset-7-lldb-devel llvm-toolset-7-python-lldb \
flex flex-devel bison bison-devel \
devtoolset-8 devtoolset-8-gdb devtoolset-8-binutils devtoolset-8-libstdc++-devel \
devtoolset-8-libasan-devel devtoolset-8-libtsan-devel devtoolset-8-libubsan-devel devtoolset-8-liblsan-devel \
flex flex-devel bison bison-devel parallel \
ncurses ncurses-devel ncurses-libs graphviz graphviz-devel \
lzip p7zip bzip2 bzip2-devel lzma-sdk lzma-sdk-devel xz-devel \
zlib zlib-devel zlib-static texinfo openssl openssl-static \
@@ -26,10 +22,16 @@ yum -y install \
python-devel python27-python-devel rh-python35-python-devel \
python27 rh-python35 \
ninja-build git svn \
protobuf protobuf-static protobuf-c-devel \
protobuf-compiler protobuf-devel \
swig ccache perl-Digest-MD5 python2-pip
swig perl-Digest-MD5 python2-pip
# TODO need permanent link
yum -y install ftp://ftp.pbone.net/mirror/archive.fedoraproject.org/fedora-secondary/updates/26/i386/Packages/p/python2-six-1.10.0-9.fc26.noarch.rpm
if [ "${CI_USE}" = true ] ; then
# TODO need permanent link
yum -y install ftp://ftp.pbone.net/mirror/archive.fedoraproject.org/fedora-secondary/updates/26/i386/Packages/p/python2-six-1.10.0-9.fc26.noarch.rpm
yum -y install \
llvm-toolset-7 llvm-toolset-7-runtime llvm-toolset-7-build llvm-toolset-7-clang \
llvm-toolset-7-clang-analyzer llvm-toolset-7-clang-devel llvm-toolset-7-clang-libs \
llvm-toolset-7-clang-tools-extra llvm-toolset-7-compiler-rt llvm-toolset-7-lldb \
llvm-toolset-7-lldb-devel llvm-toolset-7-python-lldb
fi

View File

@@ -3,50 +3,31 @@ set -ex
if [ "${CI_USE}" = true ] ; then
cd /tmp
wget https://ftp.gnu.org/gnu/gdb/gdb-8.2.tar.xz
tar xf gdb-8.2.tar.xz
cd gdb-8.2
./configure CFLAGS="-w -O2" CXXFLAGS="-std=gnu++11 -g -O2 -w" --prefix=/opt/local/gdb-8.2
wget https://ftp.gnu.org/gnu/gdb/gdb-8.3.1.tar.xz
tar xf gdb-8.3.1.tar.xz
cd gdb-8.3
./configure CFLAGS="-w -O2" CXXFLAGS="-std=gnu++11 -g -O2 -w" --prefix=/opt/local/gdb-8.3
make -j$(nproc)
make install
ln -s /opt/local/gdb-8.2 /opt/local/gdb
ln -s /opt/local/gdb-8.3 /opt/local/gdb
cd ..
rm -f gdb-8.2.tar.xz
rm -rf gdb-8.2
rm -f gdb-8.3.tar.xz
rm -rf gdb-8.3
# clang from source
RELEASE=tags/RELEASE_701/final
INSTALL=/opt/llvm-7.0.1/
mkdir -p /tmp/clang-src
cd /tmp/clang-src
TOPDIR=`pwd`
svn co -q http://llvm.org/svn/llvm-project/llvm/${RELEASE} llvm
cd ${TOPDIR}/llvm/tools
svn co -q http://llvm.org/svn/llvm-project/cfe/${RELEASE} clang
cd ${TOPDIR}/llvm/tools/clang/tools
svn co -q http://llvm.org/svn/llvm-project/clang-tools-extra/${RELEASE} extra
cd ${TOPDIR}/llvm/tools
svn co -q http://llvm.org/svn/llvm-project/lld/${RELEASE} lld
cd ${TOPDIR}/llvm/tools
svn co -q http://llvm.org/svn/llvm-project/polly/${RELEASE} polly
cd ${TOPDIR}/llvm/projects
svn co -q http://llvm.org/svn/llvm-project/compiler-rt/${RELEASE} compiler-rt
cd ${TOPDIR}/llvm/projects
svn co -q http://llvm.org/svn/llvm-project/openmp/${RELEASE} openmp
cd ${TOPDIR}/llvm/projects
svn co -q http://llvm.org/svn/llvm-project/libcxx/${RELEASE} libcxx
svn co -q http://llvm.org/svn/llvm-project/libcxxabi/${RELEASE} libcxxabi
cd ${TOPDIR}/llvm/projects
## config/build
cd ${TOPDIR}
cd /tmp
git clone https://github.com/llvm/llvm-project.git
cd llvm-project
git checkout llvmorg-9.0.0
INSTALL=/opt/llvm-9/
mkdir mybuilddir && cd mybuilddir
# TODO figure out necessary options
cmake ../llvm -G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DLLVM_ENABLE_PROJECTS='clang;clang-tools-extra;libcxx;libcxxabi;lldb;compiler-rt;lld;polly' \
-DCMAKE_INSTALL_PREFIX=${INSTALL} \
-DLLVM_LIBDIR_SUFFIX=64 \
-DLLVM_ENABLE_EH=ON \
-DLLVM_ENABLE_RTTI=ON
-DLLVM_LIBDIR_SUFFIX=64
cmake --build . --parallel --target install
cd /tmp
rm -rf clang-src
rm -rf llvm-project
fi

View File

@@ -9,24 +9,15 @@ else
echo "invalid package type"
exit 1
fi
if docker pull "${ARTIFACTORY_HUB}/${container_name}:${CI_COMMIT_SHA}"; then
echo "${pkgtype} container for ${CI_COMMIT_SHA} already exists" \
"- skipping container build!"
exit 0
else
echo "no existing ${pkgtype} container for this branch - searching history."
for CID_PREV in $(git log --pretty=%H -n30) ; do
if docker pull "${ARTIFACTORY_HUB}/${container_name}:${CID_PREV}"; then
echo "found container for previous commit ${CID_PREV}" \
"- using as cache."
docker tag \
"${ARTIFACTORY_HUB}/${container_name}:${CID_PREV}" \
"${container_name}:${CID_PREV}"
CMAKE_EXTRA="-D${pkgtype}_cache_from=${container_name}:${CID_PREV}"
break
fi
done
if docker pull "${ARTIFACTORY_HUB}/${container_name}:latest_${CI_COMMIT_REF_SLUG}"; then
echo "found container for latest - using as cache."
docker tag \
"${ARTIFACTORY_HUB}/${container_name}:latest_${CI_COMMIT_REF_SLUG}" \
"${container_name}:latest_${CI_COMMIT_REF_SLUG}"
CMAKE_EXTRA="-D${pkgtype}_cache_from=${container_name}:latest_${CI_COMMIT_REF_SLUG}"
fi
cmake --version
test -d build && rm -rf build
mkdir -p build/container && cd build/container
@@ -34,8 +25,4 @@ eval time \
cmake -Dpackages_only=ON -DCMAKE_VERBOSE_MAKEFILE=ON ${CMAKE_EXTRA} \
-G Ninja ../..
time cmake --build . --target "${pkgtype}_container" -- -v
docker tag \
"${container_name}:${CI_COMMIT_SHA}" \
"${ARTIFACTORY_HUB}/${container_name}:${CI_COMMIT_SHA}"
time docker push "${ARTIFACTORY_HUB}/${container_name}:${CI_COMMIT_SHA}"

View File

@@ -2,22 +2,27 @@
set -ex
pkgtype=$1
if [ "${pkgtype}" = "rpm" ] ; then
container_name="${RPM_CONTAINER_NAME}"
container_name="${RPM_CONTAINER_FULLNAME}"
container_tag="${RPM_CONTAINER_TAG}"
elif [ "${pkgtype}" = "dpkg" ] ; then
container_name="${DPKG_CONTAINER_NAME}"
container_name="${DPKG_CONTAINER_FULLNAME}"
container_tag="${DPKG_CONTAINER_TAG}"
else
echo "invalid package type"
exit 1
fi
time docker pull "${ARTIFACTORY_HUB}/${container_name}:${CI_COMMIT_SHA}"
time docker pull "${ARTIFACTORY_HUB}/${container_name}"
docker tag \
"${ARTIFACTORY_HUB}/${container_name}:${CI_COMMIT_SHA}" \
"${container_name}:${CI_COMMIT_SHA}"
"${ARTIFACTORY_HUB}/${container_name}" \
"${container_name}"
docker images
test -d build && rm -rf build
mkdir -p build/${pkgtype} && cd build/${pkgtype}
time cmake \
-Dpackages_only=ON -Dhave_package_container=ON -DCMAKE_VERBOSE_MAKEFILE=ON \
-Dpackages_only=ON \
-Dcontainer_label="${container_tag}" \
-Dhave_package_container=ON \
-DCMAKE_VERBOSE_MAKEFILE=ON \
-G Ninja ../..
time cmake --build . --target ${pkgtype} -- -v

View File

@@ -8,8 +8,15 @@
# NOTE: these are sensible defaults for Ripple pipelines. These
# can be overridden by project or group variables as needed.
variables:
# these containers are built manually using the rippled
# cmake build (container targets) and tagged/pushed so they
# can be used here
RPM_CONTAINER_TAG: "2020-02-10"
RPM_CONTAINER_NAME: "rippled-rpm-builder"
RPM_CONTAINER_FULLNAME: "${RPM_CONTAINER_NAME}:${RPM_CONTAINER_TAG}"
DPKG_CONTAINER_TAG: "2020-02-10"
DPKG_CONTAINER_NAME: "rippled-dpkg-builder"
DPKG_CONTAINER_FULLNAME: "${DPKG_CONTAINER_NAME}:${DPKG_CONTAINER_TAG}"
ARTIFACTORY_HOST: "artifactory.ops.ripple.com"
ARTIFACTORY_HUB: "${ARTIFACTORY_HOST}:6555"
GIT_SIGN_PUBKEYS_URL: "https://gitlab.ops.ripple.com/snippets/11/raw"
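The RPM_CONTAINER_* / DPKG_CONTAINER_* values above identify builder images that are produced out-of-band. A rough sketch of that manual flow, reusing the same cmake container target that build_container.sh invokes (the local image name produced by the target is an assumption here, and exact steps may differ):

mkdir -p build/container && cd build/container
cmake -Dpackages_only=ON -DCMAKE_VERBOSE_MAKEFILE=ON -G Ninja ../..
cmake --build . --target rpm_container
docker tag rippled-rpm-builder "${ARTIFACTORY_HUB}/rippled-rpm-builder:2020-02-10"
docker push "${ARTIFACTORY_HUB}/rippled-rpm-builder:2020-02-10"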
@@ -17,13 +24,10 @@ variables:
# also need to define this variable ONLY for the primary
# build/publish pipeline on the mainline repo:
# IS_PRIMARY_REPO = "true"
# and if you want to pause for manual approval before
# pushing to pkg repos:
# REQUIRE_APPROVAL = "true"
stages:
- build_containers
- build_packages
- sign_packages
- smoketest
- verify_sig
- tag_images
@@ -33,6 +37,7 @@ stages:
- push_to_prod
- verify_from_prod
- get_final_hashes
- build_containers
.dind_template: &dind_param
before_script:
@@ -42,7 +47,9 @@ stages:
image:
name: docker:latest
services:
- docker:dind
# workaround for TLS issues - consider going back
# to unversioned `dind` when issues are resolved
- docker:18-dind
tags:
- docker-4xlarge
@@ -53,16 +60,6 @@ stages:
variables:
- $IS_PRIMARY_REPO == "true"
.only_primary_manual_template: &only_primary_manual
only:
refs:
- /^(master|release|develop)$/
variables:
- $IS_PRIMARY_REPO == "true"
- $REQUIRE_APPROVAL == "true"
when: manual
allow_failure: false
.smoketest_local_template: &run_local_smoketest
tags:
- xlarge
@@ -75,35 +72,6 @@ stages:
script:
- . ./Builds/containers/gitlab-ci/smoketest.sh repo
#########################################################################
## ##
## stage: build_containers ##
## ##
## build containers from docker definitions. These containers are ##
## subsequently used to build the rpm and deb packages. ##
## ##
#########################################################################
build_centos_container:
stage: build_containers
<<: *dind_param
cache:
key: containers
paths:
- .nih_c
script:
- . ./Builds/containers/gitlab-ci/build_container.sh rpm
build_ubuntu_container:
stage: build_containers
<<: *dind_param
cache:
key: containers
paths:
- .nih_c
script:
- . ./Builds/containers/gitlab-ci/build_container.sh dpkg
#########################################################################
## ##
## stage: build_packages ##
@@ -114,34 +82,89 @@ build_ubuntu_container:
rpm_build:
stage: build_packages
dependencies:
- build_centos_container
<<: *dind_param
artifacts:
paths:
- build/rpm/packages/
cache:
key: rpm
paths:
- .nih_c/pkgbuild
script:
- . ./Builds/containers/gitlab-ci/build_package.sh rpm
dpkg_build:
stage: build_packages
dependencies:
- build_ubuntu_container
<<: *dind_param
artifacts:
paths:
- build/dpkg/packages/
cache:
key: dpkg
paths:
- .nih_c/pkgbuild
script:
- . ./Builds/containers/gitlab-ci/build_package.sh dpkg
#########################################################################
## ##
## stage: sign_packages ##
## ##
## build packages using containers from previous stage. ##
## ##
#########################################################################
rpm_sign:
stage: sign_packages
dependencies:
- rpm_build
image:
name: centos:7
<<: *only_primary
before_script:
- |
# Make sure GnuPG is installed
yum -y install gnupg rpm-sign
# checking GPG signing support
if [ -n "$GPG_KEY_B64" ]; then
echo "$GPG_KEY_B64"| base64 -d | gpg --batch --no-tty --allow-secret-key-import --import -
unset GPG_KEY_B64
export GPG_PASSPHRASE=$(echo $GPG_KEY_PASS_B64 | base64 -di)
unset GPG_KEY_PASS_B64
export GPG_KEYID=$(gpg --with-colon --list-secret-keys | head -n1 | cut -d : -f 5)
else
echo -e "\033[0;31m****** GPG signing disabled ******\033[0m"
exit 1
fi
artifacts:
paths:
- build/rpm/packages/
script:
- ls -alh build/rpm/packages
- . ./Builds/containers/gitlab-ci/sign_package.sh rpm
dpkg_sign:
stage: sign_packages
dependencies:
- dpkg_build
image:
name: ubuntu:18.04
<<: *only_primary
before_script:
- |
# make sure we have GnuPG
apt update
apt install -y gpg dpkg-sig
# checking GPG signing support
if [ -n "$GPG_KEY_B64" ]; then
echo "$GPG_KEY_B64"| base64 -d | gpg --batch --no-tty --allow-secret-key-import --import -
unset GPG_KEY_B64
export GPG_PASSPHRASE=$(echo $GPG_KEY_PASS_B64 | base64 -di)
unset GPG_KEY_PASS_B64
export GPG_KEYID=$(gpg --with-colon --list-secret-keys | head -n1 | cut -d : -f 5)
else
echo -e "\033[0;31m****** GPG signing disabled ******\033[0m"
exit 1
fi
artifacts:
paths:
- build/dpkg/packages/
script:
- ls -alh build/dpkg/packages
- . ./Builds/containers/gitlab-ci/sign_package.sh dpkg
#########################################################################
## ##
## stage: smoketest ##
@@ -154,6 +177,7 @@ centos_7_smoketest:
stage: smoketest
dependencies:
- rpm_build
- rpm_sign
image:
name: centos:7
<<: *run_local_smoketest
@@ -162,6 +186,7 @@ fedora_29_smoketest:
stage: smoketest
dependencies:
- rpm_build
- rpm_sign
image:
name: fedora:29
<<: *run_local_smoketest
@@ -170,6 +195,7 @@ fedora_28_smoketest:
stage: smoketest
dependencies:
- rpm_build
- rpm_sign
image:
name: fedora:28
<<: *run_local_smoketest
@@ -178,6 +204,7 @@ fedora_27_smoketest:
stage: smoketest
dependencies:
- rpm_build
- rpm_sign
image:
name: fedora:27
<<: *run_local_smoketest
@@ -189,6 +216,7 @@ ubuntu_19_smoketest:
stage: smoketest
dependencies:
- dpkg_build
- dpkg_sign
image:
name: ubuntu:19.04
<<: *run_local_smoketest
@@ -197,6 +225,7 @@ ubuntu_18_smoketest:
stage: smoketest
dependencies:
- dpkg_build
- dpkg_sign
image:
name: ubuntu:18.04
<<: *run_local_smoketest
@@ -205,6 +234,7 @@ ubuntu_16_smoketest:
stage: smoketest
dependencies:
- dpkg_build
- dpkg_sign
image:
name: ubuntu:16.04
<<: *run_local_smoketest
@@ -213,6 +243,7 @@ debian_9_smoketest:
stage: smoketest
dependencies:
- dpkg_build
- dpkg_sign
image:
name: debian:9
<<: *run_local_smoketest
@@ -253,12 +284,14 @@ tag_bld_images:
image:
name: docker:latest
services:
- docker:dind
# workaround for TLS issues - consider going back
# to unversioned `dind` when issues are resolved
- docker:18-dind
tags:
- docker-large
dependencies:
- rpm_build
- dpkg_build
- rpm_sign
- dpkg_sign
<<: *only_primary
script:
- . ./Builds/containers/gitlab-ci/tag_docker_image.sh
@@ -283,8 +316,8 @@ push_test:
paths:
- files.info
dependencies:
- rpm_build
- dpkg_build
- rpm_sign
- dpkg_sign
<<: *only_primary
script:
- . ./Builds/containers/gitlab-ci/push_to_artifactory.sh "PUT" "."
@@ -305,7 +338,7 @@ centos_7_verify_repo_test:
image:
name: centos:7
dependencies:
- rpm_build
- rpm_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -316,7 +349,7 @@ fedora_29_verify_repo_test:
image:
name: fedora:29
dependencies:
- rpm_build
- rpm_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -327,7 +360,7 @@ fedora_28_verify_repo_test:
image:
name: fedora:28
dependencies:
- rpm_build
- rpm_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -338,7 +371,7 @@ fedora_27_verify_repo_test:
image:
name: fedora:27
dependencies:
- rpm_build
- rpm_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -350,7 +383,7 @@ ubuntu_19_verify_repo_test:
image:
name: ubuntu:19.04
dependencies:
- dpkg_build
- dpkg_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -362,7 +395,7 @@ ubuntu_18_verify_repo_test:
image:
name: ubuntu:18.04
dependencies:
- dpkg_build
- dpkg_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -374,7 +407,7 @@ ubuntu_16_verify_repo_test:
image:
name: ubuntu:16.04
dependencies:
- dpkg_build
- dpkg_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -386,7 +419,7 @@ debian_9_verify_repo_test:
image:
name: debian:9
dependencies:
- dpkg_build
- dpkg_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -396,17 +429,18 @@ debian_9_verify_repo_test:
## ##
## wait for manual approval before proceeding to next stage ##
## which pushes to prod repo. ##
## ONLY RUNS FOR PRIMARY BRANCHES/REPO and when ##
## REQUIRE_APPROVAL is set to true. ##
## ONLY RUNS FOR PRIMARY BRANCHES/REPO ##
## ##
#########################################################################
wait_before_push_prod:
stage: wait_approval_prod
image:
name: alpine:latest
<<: *only_primary_manual
<<: *only_primary
script:
- echo "proceeding to next stage"
when: manual
allow_failure: false
#########################################################################
## ##
@@ -428,8 +462,8 @@ push_prod:
paths:
- files.info
dependencies:
- rpm_build
- dpkg_build
- rpm_sign
- dpkg_sign
<<: *only_primary
script:
- . ./Builds/containers/gitlab-ci/push_to_artifactory.sh "PUT" "."
@@ -450,7 +484,7 @@ centos_7_verify_repo_prod:
image:
name: centos:7
dependencies:
- rpm_build
- rpm_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -461,7 +495,7 @@ fedora_29_verify_repo_prod:
image:
name: fedora:29
dependencies:
- rpm_build
- rpm_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -472,7 +506,7 @@ fedora_28_verify_repo_prod:
image:
name: fedora:28
dependencies:
- rpm_build
- rpm_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -483,7 +517,7 @@ fedora_27_verify_repo_prod:
image:
name: fedora:27
dependencies:
- rpm_build
- rpm_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -495,7 +529,7 @@ ubuntu_19_verify_repo_prod:
image:
name: ubuntu:19.04
dependencies:
- dpkg_build
- dpkg_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -507,7 +541,7 @@ ubuntu_18_verify_repo_prod:
image:
name: ubuntu:18.04
dependencies:
- dpkg_build
- dpkg_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -519,7 +553,7 @@ ubuntu_16_verify_repo_prod:
image:
name: ubuntu:16.04
dependencies:
- dpkg_build
- dpkg_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -531,7 +565,7 @@ debian_9_verify_repo_prod:
image:
name: debian:9
dependencies:
- dpkg_build
- dpkg_sign
<<: *only_primary
<<: *run_repo_smoketest
@@ -555,9 +589,34 @@ get_prod_hashes:
paths:
- files.info
dependencies:
- rpm_build
- dpkg_build
- rpm_sign
- dpkg_sign
<<: *only_primary
script:
- . ./Builds/containers/gitlab-ci/push_to_artifactory.sh "GET" ".checksums"
#########################################################################
## ##
## stage: build_containers ##
## ##
## build containers from docker definitions. These containers are NOT ##
## used for the package build. This step is only used to ensure that ##
## the package build targets and files are still working properly. ##
## ##
#########################################################################
build_centos_container:
stage: build_containers
<<: *dind_param
script:
- . ./Builds/containers/gitlab-ci/build_container.sh rpm
allow_failure: true
build_ubuntu_container:
stage: build_containers
<<: *dind_param
script:
- . ./Builds/containers/gitlab-ci/build_container.sh dpkg
allow_failure: true

View File

@@ -31,10 +31,10 @@ for deb in ${RIPPLED_PKG} ${RIPPLED_DEV_PKG} ${RIPPLED_DBG_PKG} ; do
echo "\"${deb}\"": | tee -a "${TOPDIR}/files.info"
ca="${CURLARGS}"
if [ "${action}" = "PUT" ] ; then
url="https://${ARTIFACTORY_HOST}/artifactory/${DEB_REPO}/pool/${deb}${DEB_MATRIX}"
url="https://${ARTIFACTORY_HOST}/artifactory/${DEB_REPO}/pool/${COMPONENT}/${deb}${DEB_MATRIX}"
ca="${ca} -T${deb}"
elif [ "${action}" = "GET" ] ; then
url="https://${ARTIFACTORY_HOST}/artifactory/api/storage/${DEB_REPO}/pool/${deb}"
url="https://${ARTIFACTORY_HOST}/artifactory/api/storage/${DEB_REPO}/pool/${COMPONENT}/${deb}"
fi
echo "file info request url --> ${url}"
eval "curl ${ca} \"${url}\"" | jq -M "${filter}" | tee -a "${TOPDIR}/files.info"

View File

@@ -0,0 +1,38 @@
#!/usr/bin/env bash
set -eo pipefail
sign_dpkg() {
if [ -n "${GPG_KEYID}" ]; then
dpkg-sig \
-g "--no-tty --digest-algo 'sha512' --passphrase '${GPG_PASSPHRASE}' --pinentry-mode=loopback" \
-k "${GPG_KEYID}" \
--sign builder \
"build/dpkg/packages/*.deb"
fi
}
sign_rpm() {
if [ -n "${GPG_KEYID}" ] ; then
find build/rpm/packages -name "*.rpm" -exec bash -c '
echo "yes" | setsid rpm \
--define "_gpg_name ${GPG_KEYID}" \
--define "_signature gpg" \
--define "__gpg_check_password_cmd /bin/true" \
--define "__gpg_sign_cmd %{__gpg} gpg --batch --no-armor --digest-algo 'sha512' --passphrase '${GPG_PASSPHRASE}' --no-secmem-warning -u '%{_gpg_name}' --sign --detach-sign --output %{__signature_filename} %{__plaintext_filename}" \
--addsign {}' \;
fi
}
case "${1}" in
dpkg)
sign_dpkg
;;
rpm)
sign_rpm
;;
*)
echo "Usage: ${0} (dpkg|rpm)"
;;
esac
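Both signing functions rely on GPG_KEYID and GPG_PASSPHRASE, which the CI jobs derive from the GPG_KEY_B64 and GPG_KEY_PASS_B64 variables. A hypothetical one-time setup for populating those variables (the key id and passphrase below are placeholders, not values from this repo):

gpg --export-secret-keys --armor packages@example.com | base64 -w0   # -> GPG_KEY_B64
printf '%s' 'key-passphrase' | base64 -w0                            # -> GPG_KEY_PASS_B64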

View File

@@ -4,14 +4,19 @@ docker login -u rippled \
-p ${ARTIFACTORY_DEPLOY_KEY_RIPPLED} "${ARTIFACTORY_HUB}"
# this gives us rippled_version :
source build/rpm/packages/build_vars
docker pull "${ARTIFACTORY_HUB}/${RPM_CONTAINER_NAME}:${CI_COMMIT_SHA}"
docker pull "${ARTIFACTORY_HUB}/${DPKG_CONTAINER_NAME}:${CI_COMMIT_SHA}"
docker tag \
"${ARTIFACTORY_HUB}/${RPM_CONTAINER_NAME}:${CI_COMMIT_SHA}" \
"${ARTIFACTORY_HUB}/${RPM_CONTAINER_NAME}:${rippled_version}_${CI_COMMIT_REF_SLUG}"
docker tag \
"${ARTIFACTORY_HUB}/${DPKG_CONTAINER_NAME}:${CI_COMMIT_SHA}" \
"${ARTIFACTORY_HUB}/${DPKG_CONTAINER_NAME}:${rippled_version}_${CI_COMMIT_REF_SLUG}"
docker push "${ARTIFACTORY_HUB}/${RPM_CONTAINER_NAME}"
docker push "${ARTIFACTORY_HUB}/${DPKG_CONTAINER_NAME}"
docker pull "${ARTIFACTORY_HUB}/${RPM_CONTAINER_FULLNAME}"
docker pull "${ARTIFACTORY_HUB}/${DPKG_CONTAINER_FULLNAME}"
# tag/push two labels...one using the current rippled version and one just using "latest"
for label in ${rippled_version} latest ; do
docker tag \
"${ARTIFACTORY_HUB}/${RPM_CONTAINER_FULLNAME}" \
"${ARTIFACTORY_HUB}/${RPM_CONTAINER_NAME}:${label}_${CI_COMMIT_REF_SLUG}"
docker push \
"${ARTIFACTORY_HUB}/${RPM_CONTAINER_NAME}:${label}_${CI_COMMIT_REF_SLUG}"
docker tag \
"${ARTIFACTORY_HUB}/${DPKG_CONTAINER_FULLNAME}" \
"${ARTIFACTORY_HUB}/${DPKG_CONTAINER_NAME}:${label}_${CI_COMMIT_REF_SLUG}"
docker push \
"${ARTIFACTORY_HUB}/${DPKG_CONTAINER_NAME}:${label}_${CI_COMMIT_REF_SLUG}"
done

View File

@@ -8,8 +8,8 @@ if git verify-commit HEAD; then
echo "git commit signature check passed"
else
echo "git commit signature check failed"
git log -n 5 --color
--pretty=format:'%Cred%h%Creset -%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)<%an> [%G?]%Creset'
git log -n 5 --color \
--pretty=format:'%Cred%h%Creset -%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)<%an> [%G?]%Creset' \
--abbrev-commit
exit 1
fi

View File

@@ -1,6 +1,11 @@
#!/usr/bin/env bash
set -ex
# make sure pkg source files are up to date with repo
cd /opt/rippled_bld/pkg
cp -fpru rippled/Builds/containers/packaging/dpkg/debian/. debian/
cp -fpu rippled/Builds/containers/shared/rippled.service debian/
cp -fpu rippled/Builds/containers/shared/update_sources.sh .
source update_sources.sh
# Build the dpkg
@@ -11,7 +16,7 @@ RIPPLED_DPKG_VERSION=$(echo "${RIPPLED_VERSION}" | sed 's!-!~!g')
# version here (hardcoded to 1). Does it ever need to change?
RIPPLED_DPKG_FULL_VERSION="${RIPPLED_DPKG_VERSION}-1"
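# Illustration (not part of this script): with RIPPLED_VERSION=1.5.0-rc3 the sed
# above yields RIPPLED_DPKG_VERSION=1.5.0~rc3, so RIPPLED_DPKG_FULL_VERSION becomes
# 1.5.0~rc3-1; the tilde makes pre-releases sort before the final 1.5.0-1 in dpkg.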
cd rippled
cd /opt/rippled_bld/pkg/rippled
if [[ -n $(git status --porcelain) ]]; then
git status
error "Unstaged changes in this repo - please commit first"
@@ -40,6 +45,8 @@ CHANGELOG
# PATH must be preserved for our more modern cmake in /opt/local
# TODO : consider allowing lintian to run in future ?
export DH_BUILD_DDEBS=1
export CC=gcc-8
export CXX=g++-8
debuild --no-lintian --preserve-envvar PATH --preserve-env -us -uc
rc=$?; if [[ $rc != 0 ]]; then
error "error building dpkg"

View File

@@ -2,7 +2,7 @@ Source: rippled
Section: misc
Priority: extra
Maintainer: Ripple Labs Inc. <support@ripple.com>
Build-Depends: cmake, debhelper (>=9), libprotobuf-dev, libssl-dev, zlib1g-dev, dh-systemd, ninja-build
Build-Depends: cmake, debhelper (>=9), zlib1g-dev, dh-systemd, ninja-build
Standards-Version: 3.9.7
Homepage: http://ripple.com/

View File

@@ -1,6 +1,7 @@
opt/ripple/bin/rippled
opt/ripple/bin/validator-keys
opt/ripple/bin/update-rippled.sh
opt/ripple/bin/getRippledInfo
opt/ripple/etc/rippled.cfg
opt/ripple/etc/validators.txt
opt/ripple/etc/update-rippled-cron

View File

@@ -19,28 +19,24 @@ override_dh_auto_configure:
cmake .. -G Ninja \
-DCMAKE_INSTALL_PREFIX=/opt/ripple \
-DCMAKE_BUILD_TYPE=Release \
-Dstatic=true \
-Dlocal_protobuf=ON \
-DCMAKE_UNITY_BUILD_BATCH_SIZE=10 \
-Dstatic=ON \
-Dvalidator_keys=ON \
-DCMAKE_VERBOSE_MAKEFILE=ON
override_dh_auto_build:
cd bld && \
cmake --build . --parallel -- -v
cmake --build . --target rippled --target validator-keys --parallel -- -v
override_dh_auto_install:
cd bld && DESTDIR=../debian/tmp cmake --build . --target install -- -v
rm -rf bld_vl
mkdir -p bld_vl
cd bld_vl && \
cmake ../../validator-keys-tool -G Ninja \
-DCMAKE_BUILD_TYPE=Release \
-DCMAKE_PREFIX_PATH=${PWD}/debian/tmp/opt/ripple/ \
-Dstatic=true \
-DCMAKE_VERBOSE_MAKEFILE=ON && \
cmake --build . --parallel -- -v
install -D bld_vl/validator-keys debian/tmp/opt/ripple/bin/validator-keys
install -D bld/validator-keys/validator-keys debian/tmp/opt/ripple/bin/validator-keys
install -D Builds/containers/shared/update-rippled.sh debian/tmp/opt/ripple/bin/update-rippled.sh
install -D bin/getRippledInfo debian/tmp/opt/ripple/bin/getRippledInfo
install -D Builds/containers/shared/update-rippled-cron debian/tmp/opt/ripple/etc/update-rippled-cron
install -D Builds/containers/shared/rippled-logrotate debian/tmp/etc/logrotate.d/rippled
rm -rf debian/tmp/opt/ripple/lib64/cmake/date
rm -rf bld
rm -rf bld_vl

View File

@@ -1,6 +1,9 @@
#!/usr/bin/env bash
set -ex
cd /opt/rippled_bld/pkg
cp -fpu rippled/Builds/containers/packaging/rpm/rippled.spec .
cp -fpu rippled/Builds/containers/shared/update_sources.sh .
source update_sources.sh
# Build the rpm
@@ -25,16 +28,16 @@ if [[ $RPM_PATCH ]]; then
export RPM_PATCH
fi
cd rippled
cd /opt/rippled_bld/pkg/rippled
if [[ -n $(git status --porcelain) ]]; then
git status
error "Unstaged changes in this repo - please commit first"
fi
git archive --format tar.gz --prefix rippled/ -o ../rpmbuild/SOURCES/rippled.tar.gz HEAD
# TODO include validator-keys sources
cd ..
tar -zc --exclude-vcs -f ./rpmbuild/SOURCES/validator-keys.tar.gz validator-keys-tool/
source /opt/rh/devtoolset-7/enable
source /opt/rh/devtoolset-8/enable
rpmbuild --define "_topdir ${PWD}/rpmbuild" -ba rippled.spec
rc=$?; if [[ $rc != 0 ]]; then

View File

@@ -11,9 +11,8 @@ Summary: rippled daemon
License: MIT
URL: http://ripple.com/
Source0: rippled.tar.gz
Source1: validator-keys.tar.gz
BuildRequires: protobuf-static openssl-static cmake zlib-static ninja-build
BuildRequires: cmake zlib-static ninja-build
%description
rippled
@@ -27,26 +26,14 @@ Requires: openssl-static, zlib-static
core library for development of standalone applications that sign transactions.
%prep
%setup -c -n rippled -a 1
%setup -c -n rippled
%build
cd rippled
mkdir -p bld.release
cd bld.release
cmake .. -G Ninja -DCMAKE_INSTALL_PREFIX=%{_prefix} -DCMAKE_BUILD_TYPE=Release -Dstatic=true -DCMAKE_VERBOSE_MAKEFILE=ON -Dlocal_protobuf=ON
# build VK
cd ../../validator-keys-tool
mkdir -p bld.release
cd bld.release
# Install a copy of the rippled artifacts into a local VK build dir so that it
# can use them to build against (VK needs xrpl_core lib to build). We install
# into a local build dir instead of buildroot because we want VK to have
# relative paths embedded in debug info, otherwise check-buildroot (rpmbuild)
# will complain
mkdir xrpl_dir
DESTDIR="%{_builddir}/validator-keys-tool/bld.release/xrpl_dir" cmake --build ../../rippled/bld.release --target install -- -v
cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Release -DCMAKE_PREFIX_PATH=%{_builddir}/validator-keys-tool/bld.release/xrpl_dir/opt/ripple -Dstatic=true -DCMAKE_VERBOSE_MAKEFILE=ON
cmake --build . --parallel -- -v
cmake .. -G Ninja -DCMAKE_INSTALL_PREFIX=%{_prefix} -DCMAKE_BUILD_TYPE=Release -DCMAKE_UNITY_BUILD_BATCH_SIZE=10 -Dstatic=true -DCMAKE_VERBOSE_MAKEFILE=ON -Dvalidator_keys=ON
cmake --build . --parallel --target rippled --target validator-keys -- -v
%pre
test -e /etc/pki/tls || { mkdir -p /etc/pki; ln -s /usr/lib/ssl /etc/pki/tls; }
@@ -54,15 +41,17 @@ test -e /etc/pki/tls || { mkdir -p /etc/pki; ln -s /usr/lib/ssl /etc/pki/tls; }
%install
rm -rf $RPM_BUILD_ROOT
DESTDIR=$RPM_BUILD_ROOT cmake --build rippled/bld.release --target install -- -v
rm -rf ${RPM_BUILD_ROOT}/%{_prefix}/lib64/cmake/date
install -d ${RPM_BUILD_ROOT}/etc/opt/ripple
install -d ${RPM_BUILD_ROOT}/usr/local/bin
ln -s %{_prefix}/etc/rippled.cfg ${RPM_BUILD_ROOT}/etc/opt/ripple/rippled.cfg
ln -s %{_prefix}/etc/validators.txt ${RPM_BUILD_ROOT}/etc/opt/ripple/validators.txt
ln -s %{_prefix}/bin/rippled ${RPM_BUILD_ROOT}/usr/local/bin/rippled
install -D validator-keys-tool/bld.release/validator-keys ${RPM_BUILD_ROOT}%{_bindir}/validator-keys
install -D rippled/bld.release/validator-keys/validator-keys ${RPM_BUILD_ROOT}%{_bindir}/validator-keys
install -D ./rippled/Builds/containers/shared/rippled.service ${RPM_BUILD_ROOT}/usr/lib/systemd/system/rippled.service
install -D ./rippled/Builds/containers/packaging/rpm/50-rippled.preset ${RPM_BUILD_ROOT}/usr/lib/systemd/system-preset/50-rippled.preset
install -D ./rippled/Builds/containers/shared/update-rippled.sh ${RPM_BUILD_ROOT}%{_bindir}/update-rippled.sh
install -D ./rippled/bin/getRippledInfo ${RPM_BUILD_ROOT}%{_bindir}/getRippledInfo
install -D ./rippled/Builds/containers/shared/update-rippled-cron ${RPM_BUILD_ROOT}%{_prefix}/etc/update-rippled-cron
install -D ./rippled/Builds/containers/shared/rippled-logrotate ${RPM_BUILD_ROOT}/etc/logrotate.d/rippled
install -d $RPM_BUILD_ROOT/var/log/rippled
@@ -91,6 +80,7 @@ chown -R root:$GROUP_NAME %{_prefix}/etc/update-rippled-cron
%{_bindir}/rippled
/usr/local/bin/rippled
%{_bindir}/update-rippled.sh
%{_bindir}/getRippledInfo
%{_prefix}/etc/update-rippled-cron
%{_bindir}/validator-keys
%config(noreplace) %{_prefix}/etc/rippled.cfg
@@ -109,6 +99,9 @@ chown -R root:$GROUP_NAME %{_prefix}/etc/update-rippled-cron
%{_prefix}/lib/cmake/ripple
%changelog
* Wed Aug 28 2019 Mike Ellery <mellery451@gmail.com>
- Switch to subproject build for validator-keys
* Wed May 15 2019 Mike Ellery <mellery451@gmail.com>
- Make validator-keys use local rippled build for core lib

View File

@@ -6,20 +6,15 @@ function build_boost()
local boost_ver=$1
local do_link=$2
local boost_path=$(echo "${boost_ver}" | sed -e 's!\.!_!g')
cd /tmp
wget https://dl.bintray.com/boostorg/release/${boost_ver}/source/boost_${boost_path}.tar.bz2
mkdir -p /opt/local
cd /opt/local
tar xf /tmp/boost_${boost_path}.tar.bz2
BOOST_ROOT=/opt/local/boost_${boost_path}
BOOST_URL="https://dl.bintray.com/boostorg/release/${boost_ver}/source/boost_${boost_path}.tar.bz2"
BOOST_BUILD_ALL=true
. /tmp/install_boost.sh
if [ "$do_link" = true ] ; then
ln -s ./boost_${boost_path} boost
fi
cd boost_${boost_path}
./bootstrap.sh
./b2 -j$(nproc)
./b2 stage
cd ..
rm -f /tmp/boost_${boost_path}.tar.bz2
}
build_boost "1.70.0" true
@@ -27,7 +22,7 @@ build_boost "1.70.0" true
# installed in opt, so won't be used
# unless specified by OPENSSL_ROOT_DIR
cd /tmp
OPENSSL_VER=1.1.1
OPENSSL_VER=1.1.1d
wget https://www.openssl.org/source/openssl-${OPENSSL_VER}.tar.gz
tar xf openssl-${OPENSSL_VER}.tar.gz
cd openssl-${OPENSSL_VER}
@@ -39,40 +34,115 @@ make install
cd ..
rm -f openssl-${OPENSSL_VER}.tar.gz
rm -rf openssl-${OPENSSL_VER}
LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/opt/local/openssl/lib /opt/local/openssl/bin/openssl version -a
LD_LIBRARY_PATH=${LD_LIBRARY_PATH:-}:/opt/local/openssl/lib /opt/local/openssl/bin/openssl version -a
cd /tmp
wget https://libarchive.org/downloads/libarchive-3.4.1.tar.gz
tar xzf libarchive-3.4.1.tar.gz
cd libarchive-3.4.1
mkdir _bld && cd _bld
cmake -DCMAKE_BUILD_TYPE=Release ..
make -j$(nproc)
make install
cd ../..
rm -f libarchive-3.4.1.tar.gz
rm -rf libarchive-3.4.1
cd /tmp
wget https://github.com/protocolbuffers/protobuf/releases/download/v3.10.1/protobuf-all-3.10.1.tar.gz
tar xf protobuf-all-3.10.1.tar.gz
cd protobuf-3.10.1
./autogen.sh
./configure
make -j$(nproc)
make install
ldconfig
cd ..
rm -f protobuf-all-3.10.1.tar.gz
rm -rf protobuf-3.10.1
cd /tmp
wget https://c-ares.haxx.se/download/c-ares-1.15.0.tar.gz
tar xf c-ares-1.15.0.tar.gz
cd c-ares-1.15.0
mkdir _bld && cd _bld
cmake \
-DHAVE_LIBNSL=OFF \
-DCMAKE_BUILD_TYPE=Release \
-DCARES_STATIC=ON \
-DCARES_SHARED=OFF \
-DCARES_INSTALL=ON \
-DCARES_STATIC_PIC=ON \
-DCARES_BUILD_TOOLS=OFF \
-DCARES_BUILD_TESTS=OFF \
-DCARES_BUILD_CONTAINER_TESTS=OFF \
..
make -j$(nproc)
make install
cd ../..
rm -f c-ares-1.15.0.tar.gz
rm -rf c-ares-1.15.0
cd /tmp
wget https://github.com/grpc/grpc/archive/v1.25.0.tar.gz
tar xf v1.25.0.tar.gz
cd grpc-1.25.0
mkdir _bld && cd _bld
cmake \
-DCMAKE_BUILD_TYPE=Release \
-DBUILD_SHARED_LIBS=OFF \
-DgRPC_ZLIB_PROVIDER=package \
-DgRPC_CARES_PROVIDER=package \
-DgRPC_SSL_PROVIDER=package \
-DgRPC_PROTOBUF_PROVIDER=package \
-DProtobuf_USE_STATIC_LIBS=ON \
..
make -j$(nproc)
make install
cd ../..
rm -f v1.25.0.tar.gz
rm -rf grpc-1.25.0
if [ "${CI_USE}" = true ] ; then
build_boost "1.71.0" false
cd /tmp
wget https://github.com/doxygen/doxygen/archive/Release_1_8_14.tar.gz
tar xf Release_1_8_14.tar.gz
cd doxygen-Release_1_8_14
wget https://github.com/doxygen/doxygen/archive/Release_1_8_16.tar.gz
tar xf Release_1_8_16.tar.gz
cd doxygen-Release_1_8_16
mkdir build
cd build
cmake -G "Unix Makefiles" ..
make -j$(nproc)
make install
cd ../..
rm -f Release_1_8_14.tar.gz
rm -rf doxygen-Release_1_8_14
rm -f Release_1_8_16.tar.gz
rm -rf doxygen-Release_1_8_16
mkdir -p /opt/plantuml
wget -O /opt/plantuml/plantuml.jar https://downloads.sourceforge.net/project/plantuml/plantuml.jar
cd /tmp
wget https://github.com/linux-test-project/lcov/releases/download/v1.13/lcov-1.13.tar.gz
tar xfz lcov-1.13.tar.gz
cd lcov-1.13
wget https://github.com/linux-test-project/lcov/releases/download/v1.14/lcov-1.14.tar.gz
tar xfz lcov-1.14.tar.gz
cd lcov-1.14
make install PREFIX=/usr/local
cd ..
rm -r lcov-1.13 lcov-1.13.tar.gz
rm -r lcov-1.14 lcov-1.14.tar.gz
cd /tmp
wget https://github.com/ccache/ccache/releases/download/v3.7.6/ccache-3.7.6.tar.gz
tar xf ccache-3.7.6.tar.gz
cd ccache-3.7.6
./configure --prefix=/usr/local
make
make install
cd ..
rm -f ccache-3.7.6.tar.gz
rm -rf ccache-3.7.6
pip install requests
pip install https://github.com/codecov/codecov-python/archive/master.zip
set +e
mkdir -p /opt/local/nih_cache
mkdir -p /opt/jenkins
set -e
fi

View File

@@ -0,0 +1,82 @@
#!/usr/bin/env bash
# Assumptions:
# 1) BOOST_ROOT and BOOST_URL are already defined,
# and contain valid values.
# 2) The last name part of BOOST_ROOT matches the
# folder name internal to boost's .tar.gz
# When testing you can force a boost build by clearing travis caches:
# https://travis-ci.org/ripple/rippled/caches
set -exu
odir=$(pwd)
: ${BOOST_TOOLSET:=msvc-14.1}
if [[ -d "$BOOST_ROOT/lib" || -d "${BOOST_ROOT}/stage/lib" ]] ; then
echo "Using cached boost at $BOOST_ROOT"
exit
fi
#fetch/unpack:
fn=$(basename -- "$BOOST_URL")
ext="${fn##*.}"
wget --quiet $BOOST_URL -O /tmp/boost.tar.${ext}
cd $(dirname $BOOST_ROOT)
rm -fr ${BOOST_ROOT}
mkdir ${BOOST_ROOT}
tar xf /tmp/boost.tar.${ext} -C ${BOOST_ROOT} --strip-components 1
cd $BOOST_ROOT
BLDARGS=()
if [[ ${BOOST_BUILD_ALL:-false} == "true" ]]; then
# we never need boost-python...so even for ALL
# option we can skip it
BLDARGS+=(--without-python)
else
BLDARGS+=(--with-chrono)
BLDARGS+=(--with-context)
BLDARGS+=(--with-coroutine)
BLDARGS+=(--with-date_time)
BLDARGS+=(--with-filesystem)
BLDARGS+=(--with-program_options)
BLDARGS+=(--with-regex)
BLDARGS+=(--with-serialization)
BLDARGS+=(--with-system)
BLDARGS+=(--with-atomic)
BLDARGS+=(--with-thread)
fi
BLDARGS+=(-j$((2*${NUM_PROCESSORS:-2})))
BLDARGS+=(--prefix=${BOOST_ROOT}/_INSTALLED_)
BLDARGS+=(-d0) # suppress messages/output
if [[ -z ${COMSPEC:-} ]]; then
if [[ "$(uname)" == "Darwin" ]] ; then
BLDARGS+=(cxxflags="-std=c++14 -fvisibility=default")
else
BLDARGS+=(cxxflags="-std=c++14")
BLDARGS+=(runtime-link="static,shared")
fi
BLDARGS+=(--layout=tagged)
./bootstrap.sh
./b2 "${BLDARGS[@]}" stage
./b2 "${BLDARGS[@]}" install
else
BLDARGS+=(runtime-link="static,shared")
BLDARGS+=(--layout=versioned)
BLDARGS+=(--toolset="${BOOST_TOOLSET}")
BLDARGS+=(address-model=64)
BLDARGS+=(architecture=x86)
BLDARGS+=(link=static)
BLDARGS+=(threading=multi)
cmd /E:ON /D /S /C"bootstrap.bat"
./b2.exe "${BLDARGS[@]}" stage
./b2.exe "${BLDARGS[@]}" install
fi
if [[ ${CI:-false} == "true" ]]; then
# save some disk space...these are mostly
# obj files and don't need to be kept in CI contexts
rm -rf bin.v2
fi
cd $odir
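A minimal usage sketch of the contract described in the header comment, mirroring how build_deps.sh drives this script (version and paths illustrative):

BOOST_ROOT=/opt/local/boost_1_70_0
BOOST_URL="https://dl.bintray.com/boostorg/release/1.70.0/source/boost_1_70_0.tar.bz2"
BOOST_BUILD_ALL=true    # optional: build all libs instead of the curated --with list
NUM_PROCESSORS=4        # optional: parallelism hint, defaults to 2
. ./install_boost.sh    # the last path component of BOOST_ROOT must match the
                        # folder inside the tarball (boost_1_70_0)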

View File

@@ -1,16 +1,34 @@
#!/usr/bin/env bash
set -ex
set -e
cd /tmp
CM_INSTALLER=cmake-3.13.2-Linux-x86_64.sh
CM_VER_DIR=/opt/local/cmake-3.13
wget https://cmake.org/files/v3.13/$CM_INSTALLER
chmod a+x $CM_INSTALLER
mkdir -p $CM_VER_DIR
ln -s $CM_VER_DIR /opt/local/cmake
./$CM_INSTALLER --prefix=$CM_VER_DIR --exclude-subdir
rm -f /tmp/$CM_INSTALLER
IFS=. read cm_maj cm_min cm_rel <<<"$1"
: ${cm_rel:=0}
CMAKE_ROOT=${2:-"${HOME}/cmake"}
export PATH="/opt/local/cmake/bin:${PATH}"
function cmake_version ()
{
if [[ -d ${CMAKE_ROOT} ]] ; then
local perms=$(test $(uname) = "Linux" && echo "/111" || echo "+111")
local installed=$(find ${CMAKE_ROOT} -perm ${perms} -type f -name cmake)
if [[ "${installed}" != "" ]] ; then
echo "$(${installed} --version | head -1)"
fi
fi
}
installed=$(cmake_version)
if [[ "${installed}" != "" && ${installed} =~ ${cm_maj}.${cm_min}.${cm_rel} ]] ; then
echo "cmake already installed: ${installed}"
exit
fi
pkgname="cmake-${cm_maj}.${cm_min}.${cm_rel}-$(uname)-x86_64.tar.gz"
tmppkg="/tmp/cmake.tar.gz"
wget --quiet https://cmake.org/files/v${cm_maj}.${cm_min}/${pkgname} -O ${tmppkg}
mkdir -p ${CMAKE_ROOT}
cd ${CMAKE_ROOT}
tar --strip-components 1 -xf ${tmppkg}
rm -f ${tmppkg}
echo "installed: $(cmake_version)"

View File

@@ -2,6 +2,12 @@
# auto-update script for rippled daemon
# Check for sudo/root permissions
if [[ $(id -u) -ne 0 ]] ; then
echo "This update script must be run as root or sudo"
exit 1
fi
LOCKDIR=/tmp/rippleupdate.lock
UPDATELOG=/var/log/rippled/update.log

View File

@@ -8,13 +8,6 @@ function error {
cd /opt/rippled_bld/pkg/rippled
export RIPPLED_VERSION=$(egrep -i -o "\b(0|[1-9][0-9]*)\.(0|[1-9][0-9]*)\.(0|[1-9][0-9]*)(-[0-9a-z\-]+(\.[0-9a-z\-]+)*)?(\+[0-9a-z\-]+(\.[0-9a-z\-]+)*)?\b" src/ripple/protocol/impl/BuildInfo.cpp)
cd ..
git clone https://github.com/ripple/validator-keys-tool.git
cd validator-keys-tool
git checkout origin/master
git submodule update --init --recursive
cd ..
: ${PKG_OUTDIR:=/opt/rippled_bld/pkg/out}
export PKG_OUTDIR
if [ ! -d ${PKG_OUTDIR} ]; then

View File

@@ -1,32 +1,36 @@
ARG DIST_TAG=16.04
ARG GIT_COMMIT=unknown
FROM ubuntu:$DIST_TAG
ARG GIT_COMMIT=unknown
ARG CI_USE=false
LABEL git-commit=$GIT_COMMIT
# install/setup prerequisites:
COPY ubuntu-builder/ubuntu_setup.sh /tmp/
COPY shared/build_deps.sh /tmp/
COPY shared/install_cmake.sh /tmp/
COPY shared/install_boost.sh /tmp/
RUN chmod +x /tmp/ubuntu_setup.sh && \
chmod +x /tmp/build_deps.sh && \
chmod +x /tmp/install_boost.sh && \
chmod +x /tmp/install_cmake.sh
RUN /tmp/ubuntu_setup.sh
RUN /tmp/install_cmake.sh
RUN /tmp/install_cmake.sh 3.16.1 /opt/local/cmake-3.16
RUN ln -s /opt/local/cmake-3.16 /opt/local/cmake
ENV PATH="/opt/local/cmake/bin:$PATH"
# also install min supported cmake for testing
RUN if [ "${CI_USE}" = true ] ; then /tmp/install_cmake.sh 3.9.0 /opt/local/cmake-3.9; fi
RUN /tmp/build_deps.sh
ENV PLANTUML_JAR="/opt/plantuml/plantuml.jar"
ENV BOOST_ROOT="/opt/local/boost"
ENV BOOST_ROOT="/opt/local/boost/_INSTALLED_"
ENV OPENSSL_ROOT="/opt/local/openssl"
# prep files for package building
RUN mkdir -m 777 -p /opt/rippled_bld/pkg
RUN mkdir -m 777 -p /opt/rippled_bld/pkg/debian
RUN update-alternatives --set gcc /usr/bin/gcc-8
WORKDIR /opt/rippled_bld/pkg
COPY packaging/dpkg/debian /opt/rippled_bld/pkg/debian/
COPY shared/update_sources.sh ./
COPY shared/rippled.service /opt/rippled_bld/pkg/debian/
COPY packaging/dpkg/build_dpkg.sh ./
CMD ./build_dpkg.sh

View File

@@ -24,29 +24,27 @@ apt-get -y upgrade
if [[ ${VERSION_ID} =~ ^18\. ]] ; then
apt-add-repository -y multiverse
apt-add-repository -y universe
elif [[ ${VERSION_ID} =~ ^16\. ]] ; then
add-apt-repository -y ppa:ubuntu-toolchain-r/test
fi
add-apt-repository -y ppa:ubuntu-toolchain-r/test
apt-get -y clean
apt-get -y update
apt-get -y --fix-missing install \
make cmake ninja-build ccache \
protobuf-compiler libprotobuf-dev openssl libssl-dev \
apt-get -y --fix-missing install \
make cmake ninja-build autoconf automake libtool pkg-config libtool \
openssl libssl-dev \
liblzma-dev libbz2-dev zlib1g-dev \
libjemalloc-dev \
python-pip \
gdb gdbserver \
libstdc++6 \
flex bison \
flex bison parallel \
libicu-dev texinfo \
java-common javacc \
gcc-7 g++-7 \
gcc-8 g++-8 \
dpkg-dev debhelper devscripts fakeroot \
debmake git-buildpackage dh-make gitpkg debsums gnupg \
dh-buildinfo dh-make dh-systemd
apt-get -y install gcc-7 g++-7
update-alternatives --install \
/usr/bin/gcc gcc /usr/bin/gcc-7 40 \
--slave /usr/bin/g++ g++ /usr/bin/g++-7 \
@@ -56,6 +54,8 @@ update-alternatives --install \
--slave /usr/bin/gcov gcov /usr/bin/gcov-7 \
--slave /usr/bin/gcov-tool gcov-tool /usr/bin/gcov-dump-7 \
--slave /usr/bin/gcov-dump gcov-dump /usr/bin/gcov-tool-7
apt-get -y install gcc-8 g++-8
update-alternatives --install \
/usr/bin/gcc gcc /usr/bin/gcc-8 20 \
--slave /usr/bin/g++ g++ /usr/bin/g++-8 \
@@ -71,6 +71,30 @@ update-alternatives --install /usr/bin/cpp cpp /usr/bin/cpp-7 40
update-alternatives --install /usr/bin/cpp cpp /usr/bin/cpp-8 20
update-alternatives --auto cpp
if [ "${CI_USE}" = true ] ; then
apt-get -y install gcc-6 g++-6
update-alternatives --install \
/usr/bin/gcc gcc /usr/bin/gcc-6 10 \
--slave /usr/bin/g++ g++ /usr/bin/g++-6 \
--slave /usr/bin/gcc-ar gcc-ar /usr/bin/gcc-ar-6 \
--slave /usr/bin/gcc-nm gcc-nm /usr/bin/gcc-nm-6 \
--slave /usr/bin/gcc-ranlib gcc-ranlib /usr/bin/gcc-ranlib-6 \
--slave /usr/bin/gcov gcov /usr/bin/gcov-6 \
--slave /usr/bin/gcov-tool gcov-tool /usr/bin/gcov-dump-6 \
--slave /usr/bin/gcov-dump gcov-dump /usr/bin/gcov-tool-6
apt-get -y install gcc-9 g++-9
update-alternatives --install \
/usr/bin/gcc gcc /usr/bin/gcc-9 15 \
--slave /usr/bin/g++ g++ /usr/bin/g++-9 \
--slave /usr/bin/gcc-ar gcc-ar /usr/bin/gcc-ar-9 \
--slave /usr/bin/gcc-nm gcc-nm /usr/bin/gcc-nm-9 \
--slave /usr/bin/gcc-ranlib gcc-ranlib /usr/bin/gcc-ranlib-9 \
--slave /usr/bin/gcov gcov /usr/bin/gcov-9 \
--slave /usr/bin/gcov-tool gcov-tool /usr/bin/gcov-dump-9 \
--slave /usr/bin/gcov-dump gcov-dump /usr/bin/gcov-tool-9
fi
if [[ ${VERSION_ID} =~ ^18\. ]] ; then
apt-get -y install binutils
elif [[ ${VERSION_ID} =~ ^16\. ]] ; then
@@ -82,39 +106,83 @@ if [[ ${VERSION_ID} =~ ^18\. ]] ; then
cat << EOF > /etc/apt/sources.list.d/llvm.list
deb http://apt.llvm.org/bionic/ llvm-toolchain-bionic main
deb-src http://apt.llvm.org/bionic/ llvm-toolchain-bionic main
deb http://apt.llvm.org/bionic/ llvm-toolchain-bionic-6.0 main
deb-src http://apt.llvm.org/bionic/ llvm-toolchain-bionic-6.0 main
deb http://apt.llvm.org/bionic/ llvm-toolchain-bionic-7 main
deb-src http://apt.llvm.org/bionic/ llvm-toolchain-bionic-7 main
deb http://apt.llvm.org/bionic/ llvm-toolchain-bionic-8 main
deb-src http://apt.llvm.org/bionic/ llvm-toolchain-bionic-8 main
deb http://apt.llvm.org/bionic/ llvm-toolchain-bionic-9 main
deb-src http://apt.llvm.org/bionic/ llvm-toolchain-bionic-9 main
EOF
elif [[ ${VERSION_ID} =~ ^16\. ]] ; then
cat << EOF > /etc/apt/sources.list.d/llvm.list
deb http://apt.llvm.org/xenial/ llvm-toolchain-xenial main
deb-src http://apt.llvm.org/xenial/ llvm-toolchain-xenial main
deb http://apt.llvm.org/xenial/ llvm-toolchain-xenial-6.0 main
deb-src http://apt.llvm.org/xenial/ llvm-toolchain-xenial-6.0 main
deb http://apt.llvm.org/xenial/ llvm-toolchain-xenial-7 main
deb-src http://apt.llvm.org/xenial/ llvm-toolchain-xenial-7 main
deb http://apt.llvm.org/xenial/ llvm-toolchain-xenial-8 main
deb-src http://apt.llvm.org/xenial/ llvm-toolchain-xenial-8 main
deb http://apt.llvm.org/xenial/ llvm-toolchain-xenial-9 main
deb-src http://apt.llvm.org/xenial/ llvm-toolchain-xenial-9 main
EOF
fi
apt-get -y update
apt-get -y install \
clang-7 libclang-common-7-dev libclang-7-dev libllvm7 lldb-7 llvm-7 \
llvm-7-dev llvm-7-runtime clang-format-7 python-clang-7 python-lldb-7 \
liblldb-7-dev lld-7 libfuzzer-7-dev libc++-7-dev
apt-get -y install \
clang-7 libclang-common-7-dev libclang-7-dev libllvm7 llvm-7 \
llvm-7-dev llvm-7-runtime clang-format-7 python-clang-7 \
lld-7 libfuzzer-7-dev libc++-7-dev
update-alternatives --install \
/usr/bin/clang clang /usr/bin/clang-7 40 \
--slave /usr/bin/clang++ clang++ /usr/bin/clang++-7 \
--slave /usr/bin/llvm-profdata llvm-profdata /usr/bin/llvm-profdata-7 \
--slave /usr/bin/asan-symbolize asan-symbolize /usr/bin/asan_symbolize-7 \
--slave /usr/bin/clang-format clang-format /usr/bin/clang-format-7 \
--slave /usr/bin/lldb lldb /usr/bin/lldb-7 \
--slave /usr/bin/lldb-server lldb-server /usr/bin/lldb-server-7 \
--slave /usr/bin/llvm-ar llvm-ar /usr/bin/llvm-ar-7 \
--slave /usr/bin/llvm-cov llvm-cov /usr/bin/llvm-cov-7 \
--slave /usr/bin/llvm-nm llvm-nm /usr/bin/llvm-nm-7
/usr/bin/clang clang /usr/bin/clang-7 40 \
--slave /usr/bin/clang++ clang++ /usr/bin/clang++-7 \
--slave /usr/bin/llvm-profdata llvm-profdata /usr/bin/llvm-profdata-7 \
--slave /usr/bin/asan-symbolize asan-symbolize /usr/bin/asan_symbolize-7 \
--slave /usr/bin/llvm-symbolizer llvm-symbolizer /usr/bin/llvm-symbolizer-7 \
--slave /usr/bin/clang-format clang-format /usr/bin/clang-format-7 \
--slave /usr/bin/llvm-ar llvm-ar /usr/bin/llvm-ar-7 \
--slave /usr/bin/llvm-cov llvm-cov /usr/bin/llvm-cov-7 \
--slave /usr/bin/llvm-nm llvm-nm /usr/bin/llvm-nm-7
apt-get -y install \
clang-8 libclang-common-8-dev libclang-8-dev libllvm8 llvm-8 \
llvm-8-dev llvm-8-runtime clang-format-8 python-clang-8 \
lld-8 libfuzzer-8-dev libc++-8-dev
update-alternatives --install \
/usr/bin/clang clang /usr/bin/clang-8 20 \
--slave /usr/bin/clang++ clang++ /usr/bin/clang++-8 \
--slave /usr/bin/llvm-profdata llvm-profdata /usr/bin/llvm-profdata-8 \
--slave /usr/bin/asan-symbolize asan-symbolize /usr/bin/asan_symbolize-8 \
--slave /usr/bin/llvm-symbolizer llvm-symbolizer /usr/bin/llvm-symbolizer-8 \
--slave /usr/bin/clang-format clang-format /usr/bin/clang-format-8 \
--slave /usr/bin/llvm-ar llvm-ar /usr/bin/llvm-ar-8 \
--slave /usr/bin/llvm-cov llvm-cov /usr/bin/llvm-cov-8 \
--slave /usr/bin/llvm-nm llvm-nm /usr/bin/llvm-nm-8
update-alternatives --auto clang
if [ "${CI_USE}" = true ] ; then
apt-get -y install \
clang-9 libclang-common-9-dev libclang-9-dev libllvm9 llvm-9 \
llvm-9-dev llvm-9-runtime clang-format-9 python-clang-9 \
lld-9 libfuzzer-9-dev libc++-9-dev
update-alternatives --install \
/usr/bin/clang clang /usr/bin/clang-9 20 \
--slave /usr/bin/clang++ clang++ /usr/bin/clang++-9 \
--slave /usr/bin/llvm-profdata llvm-profdata /usr/bin/llvm-profdata-9 \
--slave /usr/bin/asan-symbolize asan-symbolize /usr/bin/asan_symbolize-9 \
--slave /usr/bin/llvm-symbolizer llvm-symbolizer /usr/bin/llvm-symbolizer-9 \
--slave /usr/bin/clang-format clang-format /usr/bin/clang-format-9 \
--slave /usr/bin/llvm-ar llvm-ar /usr/bin/llvm-ar-9 \
--slave /usr/bin/llvm-cov llvm-cov /usr/bin/llvm-cov-9 \
--slave /usr/bin/llvm-nm llvm-nm /usr/bin/llvm-nm-9
# only install latest lldb
apt-get -y install lldb-9 python-lldb-9 liblldb-9-dev
update-alternatives --install \
/usr/bin/lldb lldb /usr/bin/lldb-9 50 \
--slave /usr/bin/lldb-server lldb-server /usr/bin/lldb-server-9 \
--slave /usr/bin/lldb-argdumper lldb-argdumper /usr/bin/lldb-argdumper-9 \
--slave /usr/bin/lldb-instr lldb-instr /usr/bin/lldb-instr-9 \
--slave /usr/bin/lldb-mi lldb-mi /usr/bin/lldb-mi-9
update-alternatives --auto clang
fi
apt-get -y autoremove


@@ -6,12 +6,15 @@ builds, including docker based builds for those distributions, please consult
the [rippled-package-builder](https://github.com/ripple/rippled-package-builder)
repository.
Development is regularly done on Ubuntu 16.04 or later. For non-Ubuntu
distributions, the steps below should work by installing the appropriate
dependencies using that distribution's package management tools.
Note: Ubuntu 16.04 users may need to update their compiler (see the dependencies
section). For non-Ubuntu distributions, the steps below should work by
installing the appropriate dependencies using that distribution's package
management tools.
## Dependencies
gcc-7 or later is required.
Use `apt-get` to install the dependencies provided by the distribution
```
@@ -91,7 +94,7 @@ export BOOST_ROOT=~/boost_1_70_0
Alternatively, you can add `DBOOST_ROOT=~/boost_1_70_0` to the command line when
invoking `cmake`.
### Generate and Build
### Generate Configuration
All builds should be done in a separate directory from the source tree root
(a subdirectory is fine). For example, from the root of the ripple source tree:
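For instance, a separate build directory can be created and entered before configuring (a sketch only; the directory name is arbitrary):
```
mkdir my_build
cd my_build
```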
@@ -107,6 +110,10 @@ followed by:
cmake -DCMAKE_BUILD_TYPE=Debug ..
```
If your operating system does not provide static libraries (Arch Linux and
Manjaro Linux, for example), you must configure a non-static build by adding
`-Dstatic=OFF` to the above cmake line.
`CMAKE_BUILD_TYPE` can be changed as desired for `Debug` vs.
`Release` builds (all four standard cmake build types are supported).
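For example, combining the options mentioned above (flag names as given in this section), a non-static Release configuration could be generated with:
```
cmake -DCMAKE_BUILD_TYPE=Release -Dstatic=OFF ..
```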
@@ -115,20 +122,6 @@ To select a different compiler (most likely gcc will be found by default), pass
`-DCMAKE_CXX_COMPILER=</path/to/cxx-compiler>` when configuring. If you prefer,
you can instead set `CC` and `CXX` environment variables which cmake will honor.
Once you have generated the build system, you can run the build via cmake:
```
cmake --build . -- -j <parallel jobs>
```
the `-j` parameter in this example tells the build tool to compile several
files in parallel. This value should be chosen roughly based on the number of
cores you have available and/or want to use for building.
When the build completes successfully, you will have a `rippled` executable in
the current directory, which can be used to connect to the network (when
properly configured) or to run unit tests.
#### Options During Configuration:
The CMake file defines a number of configure-time options which can be
@@ -150,6 +143,23 @@ testing and running.
Several other infrequently used options are available - run `ccmake` or
`cmake-gui` for a list of all options.
### Build
Once you have generated the build system, you can run the build via cmake:
```
cmake --build . -- -j <parallel jobs>
```
the `-j` parameter in this example tells the build tool to compile several
files in parallel. This value should be chosen roughly based on the number of
cores you have available and/or want to use for building.
When the build completes successfully, you will have a `rippled` executable in
the current directory, which can be used to connect to the network (when
properly configured) or to run unit tests.
#### Optional Installation
The rippled cmake build supports an installation target that will install
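A minimal sketch of driving an install target through cmake, reusing the build invocation shown earlier (the exact target name and install layout are defined by the project's CMake files):
```
cmake --build . --target install -- -j <parallel jobs>
```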

File diff suppressed because it is too large

Jenkinsfile

@@ -1,704 +0,0 @@
#!/usr/bin/env groovy
import groovy.json.JsonOutput
import java.text.*
all_status = [:]
commit_id = ''
git_fork = 'ripple'
git_repo = 'rippled'
//
// this is not the actual token, but an ID/key into the jenkins
// credential store which httpRequest can access.
//
github_cred = '6bd3f3b9-9a35-493e-8aef-975403c82d3e'
//
// root API url for our repo (default, overridden below)
//
github_api = 'https://api.github.com/repos/ripple/rippled'
try {
stage ('Startup Checks') {
// here we check the commit author against collaborators
// we need a node to do this because readJSON requires
// a filesystem (although we just pass it text)
node {
checkout scm
commit_id = getCommitID()
//
// NOTE this getUserRemoteConfigs call requires a one-time
// In-process Script Approval (configure jenkins). We need this
// to detect the remote repo to interact with via the github API.
//
def remote_url = scm.getUserRemoteConfigs()[0].getUrl()
if (remote_url) {
echo "GIT URL scm: $remote_url"
git_fork = remote_url.tokenize('/')[2]
git_repo = remote_url.tokenize('/')[3].split('\\.')[0]
echo "GIT FORK: $git_fork"
echo "GIT REPO: $git_repo"
github_api = "https://api.github.com/repos/${git_fork}/${git_repo}"
echo "API URL REPO: $github_api"
}
if (env.CHANGE_AUTHOR) {
def collab_found = false;
//
// this means we have some sort of PR , so verify the author
//
echo "CHANGE AUTHOR ---> $CHANGE_AUTHOR"
echo "CHANGE TARGET ---> $CHANGE_TARGET"
echo "CHANGE ID ---> $CHANGE_ID"
//
// check the commit author against collaborators
// we need a node to do this because readJSON requires
// a filesystem (although we just pass it text)
//
def response = httpRequest(
timeout: 10,
authentication: github_cred,
url: "${github_api}/collaborators")
def collab_data = readJSON(
text: response.content)
for (collaborator in collab_data) {
if (collaborator['login'] == "$CHANGE_AUTHOR") {
echo "$CHANGE_AUTHOR is a collaborator!"
collab_found = true;
break;
}
}
if (! collab_found) {
echo "$CHANGE_AUTHOR is not a collaborator - waiting for manual approval."
sendToSlack("A <${env.BUILD_URL}|jenkins job (PR)> is waiting for approval - please review.")
try {
httpRequest(
timeout: 10,
authentication: github_cred,
url: getCommentURL(),
contentType: 'APPLICATION_JSON',
httpMode: 'POST',
requestBody: JsonOutput.toJson([
body: """
**Thank you** for your submission. It will be reviewed soon and submitted for processing in CI.
"""
])
)
}
catch (e) {
echo 'had a problem interacting with github...comments are probably not updated'
}
try {
input (
message: "User $CHANGE_AUTHOR has submitted PR #$CHANGE_ID. " +
"**Please review** the changes for any CI/security concerns " +
"and then decide whether to proceed with building.")
}
catch(e) {
def user = e.getCauses()[0].getUser().toString()
all_status['startup'] = [
false,
'Approval Check',
"Build aborted by [${user}]",
"[console](${env.BUILD_URL}/console)"]
error "Aborted by: [${user}]"
}
}
}
}
// UNCOMMENT the following if we want one message for every job...
//node('rippled-dev') {
// sendToSlack("<${env.BUILD_URL}|Job ${env.BUILD_TAG}> has started.")
//}
}
stage ('Parallel Build') {
String[][] variants = [
['gcc.Release' ,'-Dassert=ON' ,'MANUAL_TESTS=true' ],
['gcc.Debug' ,'-Dcoverage=ON' ,'TARGET=coverage_report', 'SKIP_TESTS=true'],
['docs' ,'' ,'TARGET=docs' ],
['msvc.Debug' ],
['msvc.Debug' ,'' ,'NINJA_BUILD=true' ],
['msvc.Debug' ,'-Dunity=OFF' ],
['msvc.Release' ],
['clang.Debug' ],
['clang.Debug' ,'-Dunity=OFF' ],
['gcc.Debug' ],
['gcc.Debug' ,'-Dunity=OFF' ],
['clang.Release' ,'-Dassert=ON' ],
['gcc.Release' ,'-Dassert=ON' ],
['gcc.Debug' ,'-Dstatic=OFF' ],
['gcc.Debug' ,'-Dstatic=OFF -DBUILD_SHARED_LIBS=ON' ],
['gcc.Debug' ,'' ,'NINJA_BUILD=true' ],
['clang.Debug' ,'-Dunity=OFF -Dsan=address' ,'PARALLEL_TESTS=false', 'DEBUGGER=false'],
['clang.Debug' ,'-Dunity=OFF -Dsan=undefined' ,'PARALLEL_TESTS=false'],
// TODO - tsan runs currently fail/hang
//['clang.Debug' ,'-Dunity=OFF -Dsan=thread' ,'PARALLEL_TESTS=false'],
]
// create a map of all builds
// that we want to run. The map
// is string keys and node{} object values
def builds = [:]
for (int index = 0; index < variants.size(); index++) {
def bldtype = variants[index][0]
def cmake_extra = variants[index].size() > 1 ? variants[index][1] : ''
def bldlabel = bldtype + cmake_extra
def extra_env = variants[index].size() > 2 ? variants[index][2..-1] : []
for (int j = 0; j < extra_env.size(); j++) {
bldlabel += "_" + extra_env[j]
}
bldlabel = bldlabel.replace('-', '_')
bldlabel = bldlabel.replace(' ', '')
bldlabel = bldlabel.replace('=', '_')
def compiler = getFirstPart(bldtype)
def config = getSecondPart(bldtype)
def target = 'install' // currently ignored for windows builds
if (compiler == 'docs') {
compiler = 'gcc'
config = 'Release'
target = 'docs'
}
def cc =
(compiler == 'clang') ? '/opt/llvm-5.0.1/bin/clang' : 'gcc'
def cxx =
(compiler == 'clang') ? '/opt/llvm-5.0.1/bin/clang++' : 'g++'
def ucc = isNoUnity(cmake_extra) ? 'true' : 'false'
def node_type =
(compiler == 'msvc') ? 'rippled-win' : 'rippled-dev'
// the default disposition for parallel test..disabled
// for coverage, enabled otherwise. Can still be overridden
// by explicitly setting with extra env settings above.
def pt = isCoverage(cmake_extra) ? 'false' : 'true'
def max_minutes = 25
def env_vars = [
"TARGET=${target}",
"BUILD_TYPE=${config}",
"COMPILER=${compiler}",
"PARALLEL_TESTS=${pt}",
'BUILD=cmake',
"MAX_TIME=${max_minutes}m",
"BUILD_DIR=${bldlabel}",
"CMAKE_EXTRA_ARGS=-Dwerr=ON ${cmake_extra}",
'VERBOSE_BUILD=true']
builds[bldlabel] = {
node(node_type) {
checkout scm
dir ('build') {
deleteDir()
}
def cdir = upDir(pwd())
echo "BASEDIR: ${cdir}"
echo "COMPILER: ${compiler}"
echo "BUILD_TYPE: ${config}"
echo "USE_CC: ${ucc}"
env_vars.addAll([
"NIH_CACHE_ROOT=${cdir}/.nih_c"])
if (compiler == 'msvc') {
env_vars.addAll([
'BOOST_ROOT=c:\\lib\\boost_1_70',
'PROJECT_NAME=rippled',
'MSBUILDDISABLENODEREUSE=1', // this ENV setting is probably redundant since we also pass /nr:false to msbuild
'OPENSSL_ROOT=c:\\OpenSSL-Win64'])
}
else {
env_vars.addAll([
'NINJA_BUILD=false',
"CCACHE_BASEDIR=${cdir}",
'PLANTUML_JAR=/opt/plantuml/plantuml.jar',
'APP_ARGS=--unittest-ipv6',
'CCACHE_NOHASHDIR=true',
"CC=${cc}",
"CXX=${cxx}",
'LCOV_ROOT=""',
'PATH+CMAKE_BIN=/opt/local/cmake',
'GDB_ROOT=/opt/local/gdb',
'BOOST_ROOT=/opt/local/boost_1_70_0',
"USE_CCACHE=${ucc}"])
}
if (extra_env.size() > 0) {
env_vars.addAll(extra_env)
}
// try to figure out codecov token to use. Look for
// MY_CODECOV_TOKEN id first so users can set that
// on job scope but then default to RIPPLED_CODECOV_TOKEN
// which should be globally scoped
def codecov_token = ''
try {
withCredentials( [string( credentialsId: 'MY_CODECOV_TOKEN', variable: 'CODECOV_TOKEN')]) {
codecov_token = env.CODECOV_TOKEN
}
}
catch (e) {
// this might throw when MY_CODECOV_TOKEN doesn't exist
}
if (codecov_token == '') {
withCredentials( [string( credentialsId: 'RIPPLED_CODECOV_TOKEN', variable: 'CODECOV_TOKEN')]) {
codecov_token = env.CODECOV_TOKEN
}
}
env_vars.addAll(["CODECOV_TOKEN=${codecov_token}"])
withEnv(env_vars) {
myStage(bldlabel)
def thrown = '';
try {
timeout(
time: max_minutes * 2,
units: 'MINUTES')
{
if (compiler == 'msvc') {
powershell "Remove-Item -Path \"${bldlabel}.txt\" -Force -ErrorAction Ignore"
// we capture stdout to variable because I could
// not figure out how to make powershell redirect internally
output = powershell (
returnStdout: true,
script: windowsBuildCmd())
// if the powershell command fails (has nonzero exit)
// then the command above throws, we don't get our output,
// and we never create this output file.
// SEE https://issues.jenkins-ci.org/browse/JENKINS-44930
// Alternatively, figure out how to reliably redirect
// all output above to a file (Start/Stop transcript does not work)
writeFile(
file: "${bldlabel}.txt",
text: output)
}
else {
sh "rm -fv ${bldlabel}.txt"
// execute the bld command in a redirecting shell
// to capture output
sh redhatBuildCmd(bldlabel)
}
}
}
catch(e) {
thrown = "${e}"
throw e
}
finally {
if (bldtype == 'docs') {
publishHTML(
allowMissing: true,
alwaysLinkToLastBuild: true,
keepAll: true,
reportName: 'Doxygen',
reportDir: "build/${bldlabel}/html_doc",
reportFiles: 'index.html')
}
if (isCoverage(cmake_extra)) {
publishHTML(
allowMissing: true,
alwaysLinkToLastBuild: false,
keepAll: true,
reportName: 'Coverage',
reportDir: "build/${bldlabel}/coverage",
reportFiles: 'index.html')
}
def envs = ''
for (int j = 0; j < extra_env.size(); j++) {
envs += ", <br/>" + extra_env[j]
}
def cmake_txt = cmake_extra
if (cmake_txt != '') {
cmake_txt = " <br/>" + cmake_txt
}
def st = reportStatus(bldlabel, bldtype + cmake_txt + envs, env.BUILD_URL, thrown)
lock('rippled_dev_status') {
all_status[bldlabel] = st
}
if (thrown == '') {
assert st[0] : "Unit Test Failures"
}
} //try-catch-finally
} //withEnv
} //node
} //builds item
} //for variants
// this actually executes all the builds we just defined
// above, in parallel as slaves are available
parallel builds
}
}
finally {
// anything here should run always...
stage ('Final Status') {
node {
def results = makeResultText()
try {
def res = getCommentID() //get array return b/c jenkins does not allow multiple direct return/assign
def comment_id = res[0]
def url_comment = res[1]
def mode = 'PATCH'
if (comment_id == 0) {
echo 'no existing status comment found'
mode = 'POST'
}
def body = JsonOutput.toJson([
body: results
])
response = httpRequest(
timeout: 10,
authentication: github_cred,
url: url_comment,
contentType: 'APPLICATION_JSON',
httpMode: mode,
requestBody: body)
}
catch (e) {
echo 'had a problem interacting with github...status is probably not updated'
}
}
}
}
// ---------------
// util functions
// ---------------
def myStage(name) {
echo """
+++++++++++++++++++++++++++++++++++++++++
>> building ${name}
+++++++++++++++++++++++++++++++++++++++++
"""
}
def printGitInfo(id, log) {
echo """
+++++++++++++++++++++++++++++++++++++++++
>> Building commit ID ${id}
>>
${log}
+++++++++++++++++++++++++++++++++++++++++
"""
}
def makeResultText () {
def start_time = new Date()
def sdf = new SimpleDateFormat('yyyyMMdd - HH:mm:ss')
def datestamp = sdf.format(start_time)
def results = """
## Jenkins Build Summary
Built from [this commit](https://github.com/${git_fork}/${git_repo}/commit/${commit_id})
Built at __${datestamp}__
### Test Results
Build Type | Log | Result | Status
---------- | --- | ------ | ------
"""
for ( e in all_status) {
results += e.value[1] + ' | ' + e.value[3] + ' | ' + e.value[2] + ' | ' +
(e.value[0] ? 'PASS :white_check_mark: ' : 'FAIL :red_circle: ') + '\n'
}
results += '\n'
echo 'FINAL BUILD RESULTS'
echo results
results
}
def getCommentURL () {
def url_c = ''
if (env.CHANGE_ID && env.CHANGE_ID ==~ /\d+/) {
//
// CHANGE_ID indicates we are building a PR
// find PR comments
//
def resp = httpRequest(
timeout: 10,
authentication: github_cred,
url: "${github_api}/pulls/$CHANGE_ID")
def result = readJSON(text: resp.content)
//
// follow issue comments link
//
url_c = result['_links']['issue']['href'] + '/comments'
}
else {
//
// if not a PR, just search comments for our commit ID
//
url_c =
"${github_api}/commits/${commit_id}/comments"
}
url_c
}
def getCommentID () {
def url_c = getCommentURL()
def response = httpRequest(
timeout: 10,
authentication: github_cred,
url: url_c)
def data = readJSON(text: response.content)
def comment_id = 0
// see if we can find an existing comment here with
// a heading that matches ours...
for (comment in data) {
if (comment['body'] =~ /(?m)^##\s+Jenkins Build/) {
comment_id = comment['id']
echo "existing status comment ${comment_id} found"
url_c = comment['url']
break;
}
}
[comment_id, url_c]
}
def getCommitID () {
def cid = sh (
script: 'git rev-parse HEAD',
returnStdout: true)
cid = cid.trim()
echo "commit ID is ${cid}"
commit_log = sh (
script: "git show --name-status ${cid}",
returnStdout: true)
printGitInfo (cid, commit_log)
cid
}
@NonCPS
def getResults(text, label) {
// example:
/// 194.5s, 154 suites, 948 cases, 360485 tests total, 0 failures
// or build log format:
// [msvc.release] 71.3s, 162 suites, 995 cases, 318901 tests total, 1 failure
def matcher =
text == '' ?
manager.getLogMatcher(/\[${label}\].+?(\d+) case[s]?, (\d+) test[s]? total, (\d+) (failure(s?))/) :
text =~ /(\d+) case[s]?, (\d+) test[s]? total, (\d+) (failure(s?))/
matcher ? matcher[0][1] + ' cases, ' + matcher[0][3] + ' failed' : 'no test results'
}
@NonCPS
def getFailures(text, label) {
// [see above for format]
def matcher =
text == '' ?
manager.getLogMatcher(/\[${label}\].+?(\d+) test[s]? total, (\d+) (failure(s?))/) :
text =~ /(\d+) test[s]? total, (\d+) (failure(s?))/
// if we didn't match, then return -1 since something is
// probably wrong, e.g. maybe the build failed...
matcher ? matcher[0][2] as Integer : -1i
}
@NonCPS
def getTime(text, label) {
// look for text following a label 'real' for
// wallclock time. Some `time`s report fractional
// seconds and we can omit those in what we report
def matcher =
text == '' ?
manager.getLogMatcher(/(?m)^\[${label}\]\s+real\s+(.+)\.(\d+?)[s]?/) :
text =~ /(?m)^real\s+(.+)\.(\d+?)[s]?/
if (matcher) {
return matcher[0][1] + 's'
}
// alternatively, look for powershell elapsed time
// format, e.g. :
// TotalSeconds : 523.2140529
def matcher2 =
text == '' ?
manager.getLogMatcher(/(?m)^\[${label}\]\s+TotalSeconds\s+:\s+(\d+)\.(\d+?)?/) :
text =~ /(?m)^TotalSeconds\s+:\s+(\d+)\.(\d+?)?/
matcher2 ? matcher2[0][1] + 's' : 'n/a'
}
@NonCPS
def getFirstPart(bld) {
def matcher = bld =~ /^(.+?)\.(.+)$/
matcher ? matcher[0][1] : bld
}
@NonCPS
def isNoUnity(bld) {
def matcher = bld =~ /-Dunity=(off|OFF)/
matcher ? true : false
}
@NonCPS
def isCoverage(bld) {
def matcher = bld =~ /-Dcoverage=(on|ON)/
matcher ? true : false
}
@NonCPS
def getSecondPart(bld) {
def matcher = bld =~ /^(.+?)\.(.+)$/
matcher ? matcher[0][2] : bld
}
// because I can't seem to find path manipulation
// functions in groovy....
@NonCPS
def upDir(path) {
def matcher = path =~ /^(.+)[\/\\](.+?)/
matcher ? matcher[0][1] : path
}
def sendToSlack(message) {
try {
withCredentials( [string( credentialsId: 'RIPPLED_SLACK_INCOMING_URL', variable: 'SLACK_URL')]) {
// I was unable to make httpRequest method work with the
// formdata required by slack API, so resorting
// to curl commands...
sh '''\
CONTENT=$(tr -d '[\n]' <<JSON
payload={
"channel": "#cpp-notifications",
"username": "JenkinsCI",
"text": "''' + message + '''",
"icon_emoji": ":jenkins:"}
JSON
)
curl ${SLACK_URL} --data-urlencode "${CONTENT}"
'''
}
}
catch (e) {
echo "had a problem posting to slack: ${e}"
}
}
// the shell command used for building on redhat
def redhatBuildCmd(bldlabel) {
'''\
#!/bin/bash
set -ex
log_file=''' + "${bldlabel}.txt" + '''
exec 3>&1 1>>${log_file} 2>&1
ccache -s
source /opt/rh/devtoolset-7/enable
/usr/bin/time -p ./bin/ci/ubuntu/build-and-test.sh 2>&1
ccache -s
'''
}
// the powershell command used for building
def windowsBuildCmd() {
'''
# Enable streams 3-6
$WarningPreference = 'Continue'
$VerbosePreference = 'Continue'
$DebugPreference = 'Continue'
$InformationPreference = 'Continue'
Invoke-BatchFile "${env:ProgramFiles(x86)}\\Microsoft Visual Studio\\2017\\Community\\VC\\Auxiliary\\Build\\vcvarsall.bat" x86_amd64
Get-ChildItem env:* | Sort-Object name
cl
cmake --version
New-Item -ItemType Directory -Force -Path "build/$env:BUILD_DIR" -ErrorAction Stop
$sw = [Diagnostics.Stopwatch]::StartNew()
try {
Push-Location "build/$env:BUILD_DIR"
if ($env:NINJA_BUILD -eq "true") {
Invoke-Expression "& cmake -G`"Ninja`" -DCMAKE_BUILD_TYPE=$env:BUILD_TYPE -DCMAKE_VERBOSE_MAKEFILE=ON $env:CMAKE_EXTRA_ARGS ../.."
}
else {
Invoke-Expression "& cmake -G`"Visual Studio 15 2017 Win64`" -DCMAKE_VERBOSE_MAKEFILE=ON $env:CMAKE_EXTRA_ARGS ../.."
}
if ($LastExitCode -ne 0) { throw "CMake failed" }
## as of 01/2018, DO NOT USE cmake to run the actual build step. for some
## reason, cmake spawning the build under jenkins causes MSBUILD/ninja to
## get stuck at the end of the build. Perhaps cmake is spawning
## incorrectly or failing to pass certain params
if ($env:NINJA_BUILD -eq "true") {
ninja -j $env:NUMBER_OF_PROCESSORS -v
}
else {
msbuild /fl /m /nr:false /p:Configuration="$env:BUILD_TYPE" /p:Platform=x64 /p:GenerateFullPaths=True /v:normal /nologo /clp:"ShowCommandLine;DisableConsoleColor" "$env:PROJECT_NAME.vcxproj"
}
if ($LastExitCode -ne 0) { throw "CMake build failed" }
$exe = "./$env:BUILD_TYPE/$env:PROJECT_NAME"
if ($env:NINJA_BUILD -eq "true") {
$exe = "./$env:PROJECT_NAME"
}
"Exe is at $exe"
$params = '--unittest', '--quiet', '--unittest-log'
if ($env:PARALLEL_TESTS -eq "true") {
$params = $params += "--unittest-jobs=$env:NUMBER_OF_PROCESSORS"
}
& $exe $params
if ($LastExitCode -ne 0) { throw "Unit tests failed" }
}
catch {
throw
}
finally {
$sw.Stop()
$sw.Elapsed
Pop-Location
}
'''
}
// post processing step after each build:
// * archives the log file
// * adds short description/status to build status
// * returns an array of result info to add to the all_build summary
def reportStatus(label, type, bldurl, errmsg) {
def outstr = ''
def loglink = "[console](${bldurl}/console)"
def logfile = "${label}.txt"
if ( fileExists(logfile) ) {
archiveArtifacts( artifacts: logfile )
outstr = readFile(logfile)
loglink = "[logfile](${bldurl}/artifact/${logfile})"
}
def st = getResults(outstr, label)
def time = getTime(outstr, label)
def fail_count = getFailures(outstr, label)
outstr = null
def txtcolor =
(fail_count == 0 && errmsg == '') ? 'DarkGreen' : 'Crimson'
def shortbld = label
// this is just an attempt to shorten the
// summary text label to the point of absurdity..
shortbld = shortbld.replace('Debug', 'dbg')
shortbld = shortbld.replace('Release', 'rel')
shortbld = shortbld.replace('true', 'Y')
shortbld = shortbld.replace('false', 'N')
shortbld = shortbld.replace('Dcoverage', 'cov')
shortbld = shortbld.replace('Dassert', 'asrt')
shortbld = shortbld.replace('Dunity', 'unty')
shortbld = shortbld.replace('Dsan=address', 'asan')
shortbld = shortbld.replace('Dsan=thread', 'tsan')
shortbld = shortbld.replace('Dsan=undefined', 'ubsan')
shortbld = shortbld.replace('PARALLEL_TEST', 'PL')
shortbld = shortbld.replace('MANUAL_TESTS', 'MAN')
shortbld = shortbld.replace('NINJA_BUILD', 'ninja')
shortbld = shortbld.replace('DEBUGGER', 'gdb')
shortbld = shortbld.replace('ON', 'Y')
shortbld = shortbld.replace('OFF', 'N')
def stattext = "${st}, t: ${time}"
if (fail_count <= 0 && errmsg != '') {
stattext += " [BAD EXIT]"
}
manager.addShortText(
"${shortbld}: ${stattext}",
txtcolor,
'white',
'0px',
'white')
[fail_count == 0 && errmsg == '', type, stattext, loglink]
}


@@ -3,17 +3,22 @@
The XRP Ledger is a decentralized cryptographic ledger powered by a network of peer-to-peer servers. The XRP Ledger uses a novel Byzantine Fault Tolerant consensus algorithm to settle and record transactions in a secure distributed database without a central operator.
## XRP
XRP is a public, counterparty-less asset native to the XRP Ledger, and is designed to bridge the many different currencies in use worldwide. XRP is traded on the open-market and is available for anyone to access. The XRP Ledger was created in 2012 with a finite supply of 100 billion units of XRP. Its creators gifted 80 billion XRP to a company, now called [Ripple](https://ripple.com/), to develop the XRP Ledger and its ecosystem. Ripple uses XRP the help build the Internet of Value, ushering in a world in which money moves as fast and efficiently as information does today.
XRP is a public, counterparty-free asset native to the XRP Ledger, and is designed to bridge the many different currencies in use worldwide. XRP is traded on the open-market and is available for anyone to access. The XRP Ledger was created in 2012 with a finite supply of 100 billion units of XRP. Its creators gifted 80 billion XRP to a company, now called [Ripple](https://ripple.com/), to develop the XRP Ledger and its ecosystem. Ripple uses XRP to help build the Internet of Value, ushering in a world in which money moves as fast and efficiently as information does today.
## `rippled`
## rippled
The server software that powers the XRP Ledger is called `rippled` and is available in this repository under the permissive [ISC open-source license](LICENSE). The `rippled` server is written primarily in C++ and runs on a variety of platforms.
### Build from Source
# Key Features of the XRP Ledger
* [Linux](Builds/linux/README.md)
* [Mac](Builds/macos/README.md)
* [Windows](Builds/VisualStudio2017/README.md)
## Key Features of the XRP Ledger
- **[Censorship-Resistant Transaction Processing][]:** No single party decides which transactions succeed or fail, and no one can "roll back" a transaction after it completes. As long as those who choose to participate in the network keep it healthy, they can settle transactions in seconds.
- **[Fast, Efficient Consensus Algorithm][]:** The XRP Ledger's consensus algorithm settles transactions in 4 to 5 seconds, processing at a throughput of up to 1500 transactions per second. These properties put XRP at least an order of magnitude ahead of other top digital assets.
- **[Finite XRP Supply][]:** When the XRP Ledger began, 100 billion XRP were created, and no more XRP will ever be created. (Each XRP is subdivisible down to 6 decimal places, for a grand total of 100 quintillion _drops_ of XRP.) The available supply of XRP decreases slowly over time as small amounts are destroyed to pay transaction costs.
- **[Finite XRP Supply][]:** When the XRP Ledger began, 100 billion XRP were created, and no more XRP will ever be created. The available supply of XRP decreases slowly over time as small amounts are destroyed to pay transaction costs.
- **[Responsible Software Governance][]:** A team of full-time, world-class developers at Ripple maintain and continually improve the XRP Ledger's underlying software with contributions from the open-source community. Ripple acts as a steward for the technology and an advocate for its interests, and builds constructive relationships with governments and financial institutions worldwide.
- **[Secure, Adaptable Cryptography][]:** The XRP Ledger relies on industry standard digital signature systems like ECDSA (the same scheme used by Bitcoin) but also supports modern, efficient algorithms like Ed25519. The extensible nature of the XRP Ledger's software makes it possible to add and disable algorithms as the state of the art in cryptography advances.
- **[Modern Features for Smart Contracts][]:** Features like Escrow, Checks, and Payment Channels support cutting-edge financial applications including the [Interledger Protocol](https://interledger.org/). This toolbox of advanced features comes with safety features like a process for amending the network and separate checks against invariant constraints.
@@ -29,7 +34,7 @@ The server software that powers the XRP Ledger is called `rippled` and is availa
## Source Code
[![travis-ci.org: Build Status](https://travis-ci.org/ripple/rippled.png?branch=develop)](https://travis-ci.org/ripple/rippled)
[![travis-ci.com: Build Status](https://travis-ci.com/ripple/rippled.svg?branch=develop)](https://travis-ci.com/ripple/rippled)
[![codecov.io: Code Coverage](https://codecov.io/gh/ripple/rippled/branch/develop/graph/badge.svg)](https://codecov.io/gh/ripple/rippled)
### Repository Contents


@@ -1,19 +1,77 @@
# Release Notes
![Ripple](docs/images/ripple.png)
![XRP](docs/images/xrp-text-mark-black-small@2x.png)
This document contains the release notes for `rippled`, the reference server implementation of the Ripple protocol. To learn more about how to build and run a `rippled` server, visit https://ripple.com/build/rippled-setup/
This document contains the release notes for `rippled`, the reference server implementation of the Ripple protocol. To learn more about how to build, run or update a `rippled` server, visit https://xrpl.org/install-rippled.html
> **Do you work at a digital asset exchange or wallet provider?**
>
> Please [contact us](mailto:support@ripple.com). We can help guide your integration.
## Updating `rippled`
If you are using Red Hat Enterprise Linux 7 or CentOS 7, you can [update using `yum`](https://ripple.com/build/rippled-setup/#updating-rippled). For other platforms, please [compile from source](https://wiki.ripple.com/Rippled_build_instructions).
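As a sketch (assuming the `rippled` package was installed from the Ripple yum repository and runs as the `rippled.service` systemd unit), updating on those platforms is a standard yum invocation followed by a service restart:
```
sudo yum update rippled
sudo systemctl restart rippled.service
```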
Have new ideas? Need help with setting up your node? Come visit us [here](https://github.com/ripple/rippled/issues/new/choose)
# Releases
## Version 1.5.0
The `rippled` 1.5.0 release introduces several improvements and new features, including support for a gRPC API, API versioning, UNL propagation via the peer network, new RPC methods `validator_info` and `manifest`, an augmented `submit` method, an improved `tx` method response, improved command line parsing, an improved handshake protocol, improved package building and various other minor bug fixes and improvements.
This release also introduces two new amendments: `fixQualityUpperBound` and `RequireFullyCanonicalSig`.
Several improvements to the sharding system are currently being evaluated for inclusion into the upcoming 1.6 release of `rippled`. These changes are incompatible with shards generated by previous versions of the code.
Additionally, an issue with the existing sharding engine can cause a server running version 1.4 or 1.5 of the software to experience a deadlock and automatically restart when running with the sharding feature enabled.
At this time, the developers recommend running with sharding disabled, pending the improvements scheduled to be introduced with 1.6. For more information on how to disable sharding, please visit https://xrpl.org/configure-history-sharding.html
**New and Updated Features**
- The `RequireFullyCanonicalSig` amendment which changes the signature requirements for the XRP Ledger protocol so that non-fully-canonical signatures are no longer valid. This protects against transaction malleability on all transactions, instead of just transactions with the tfFullyCanonicalSig flag enabled. Without this amendment, a transaction is malleable if it uses a secp256k1 signature and does not have tfFullyCanonicalSig enabled. Most signing utilities enable tfFullyCanonicalSig by default, but there are exceptions. With this amendment, no single-signed transactions are malleable. (Multi-signed transactions may still be malleable if signers provide more signatures than are necessary.) All transactions must use the fully canonical form of the signature, regardless of the tfFullyCanonicalSig flag. Signing utilities that do not create fully canonical signatures are not supported. All of Ripple's signing utilities have been providing fully-canonical signatures exclusively since at least 2014. For more information, see [`ec137044a`](https://github.com/ripple/rippled/commit/ec137044a014530263cd3309d81643a5a3c1fdab)
- Native [gRPC API](https://grpc.io/) support. Currently, this API provides a subset of the full `rippled` [API](https://xrpl.org/rippled-api.html). You can enable the gRPC API on your server with a new configuration stanza. [`7d867b806`](https://github.com/ripple/rippled/commit/7d867b806d70fc41fb45e3e61b719397033b272c)
- API versioning, which allows future breaking changes of RPC methods to co-exist with existing versions. [`2aa11fa41`](https://github.com/ripple/rippled/commit/2aa11fa41d4a7849ae6a5d7a11df6f367191e3ef)
- Nodes now receive and broadcast UNLs over the peer network under various conditions. [`2c71802e3`](https://github.com/ripple/rippled/commit/2c71802e389a59118024ea0152123144c084b31c)
- Augmented `submit` method to include additional details on the status of the command. [`79e9085dd`](https://github.com/ripple/rippled/commit/79e9085dd1eb72864afe841225b78ec96e72b5ca)
- Improved `tx` method response with additional details on ledgers searched. [`47501b7f9`](https://github.com/ripple/rippled/commit/47501b7f99d4103d9ad405e399169fc251161548)
- New `validator_info` method which returns information pertaining to the current validator's keys, manifest sequence, and domain. [`3578acaf0`](https://github.com/ripple/rippled/commit/3578acaf0b5f2d27ddc33f5b4cc81d21be1903ae)
- New `manifest` method which looks up manifest information for the specified key (either master or ephemeral); see the command-line sketch after this list. [`3578acaf0`](https://github.com/ripple/rippled/commit/3578acaf0b5f2d27ddc33f5b4cc81d21be1903ae)
- Introduce handshake protocol for compression negotiation (compression is not implemented at this point) and other minor improvements. [`f6916bfd4`](https://github.com/ripple/rippled/commit/f6916bfd429ce654e017ae9686cb023d9e05408b)
- Remove various old conditionals introduced by amendments. ([`51ed7db00`](https://github.com/ripple/rippled/commit/51ed7db0027ba822739bd9de6f2613f97c1b227b), [`6e4945c56`](https://github.com/ripple/rippled/commit/6e4945c56b1a1c063b32921d7750607587ec3063))
- Add `getRippledInfo` info gathering script to `rippled` Linux packages. [`acf4b7889`](https://github.com/ripple/rippled/commit/acf4b78892074303cb1fa22b778da5e7e7eddeda)
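As a hedged illustration of the new RPC methods listed above (the command-line syntax is a sketch and parameter forms may differ by version), they can be queried against a running server:
```
rippled validator_info
rippled manifest <public_key>
```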
**Bug Fixes and Improvements**
- The `fixQualityUpperBound` amendment which fixes a bug in unused code for estimating the ratio of input to output of individual steps in cross-currency payments. [`9d3626fec`](https://github.com/ripple/rippled/commit/9d3626fec5b610100f401dc0d25b9ec8e4a9a362)
- `tx` method now properly fetches all historical tx if they are incorporated into a validated ledger under rules that applied at the time. [`11cf27e00`](https://github.com/ripple/rippled/commit/11cf27e00698dbfc099b23463927d1dac829ed19)
- Fix to how the `fail_hard` flag is handled with the `submit` method - transactions that are submitted with the `fail_hard` flag and that result in any TER code besides tesSUCCESS are neither queued nor held. [`cd9732b47`](https://github.com/ripple/rippled/commit/cd9732b47a9d4e95bcb74e048d2c76fa118b80fb)
- Remove unused `Beast` code. [`172ead822`](https://github.com/ripple/rippled/commit/172ead822159a3c1f9b73217da4316df48851ab6)
- Lag ratchet code fix to use proper ephemeral public keys instead of the long-term master public keys. [`6529d3e6f`](https://github.com/ripple/rippled/commit/6529d3e6f7333fc5226e5aa9ae65f834cb93dfe5)
## Version 1.4.0
The `rippled` 1.4.0 release introduces several improvements and new features, including support for deleting accounts, improved peer slot management, improved CI integration and package building and support for [C++17](https://en.wikipedia.org/wiki/C%2B%2B17) and [Boost 1.71](https://www.boost.org/users/history/version_1_71_0.html). Finally, this release removes the code for the `SHAMapV2` amendment which failed to gain majority support and has been obsoleted by other improvements.
**New and Updated Features**
- The `DeletableAccounts` amendment which, if enabled, will make it possible for users to delete unused or unneeded accounts, recovering the account's reserve.
- Support for improved management of peer slots and the ability to add and remove reserved connections without requiring a restart of the server.
- Tracking and reporting of cumulative and instantaneous peer bandwidth usage.
- Preliminary support for post-processing historical shards after downloading to index their contents.
- Reporting the master public key alongside the ephemeral public key in the `validation` stream [subscriptions](https://xrpl.org/subscribe.html).
- Reporting consensus phase changes in the `server` stream [subscription](https://xrpl.org/subscribe.html).
**Bug Fixes**
- The `fixPayChanRecipientOwnerDir` amendment which corrects a minor technical flaw that would cause a payment channel to not appear in the recipient's owner directory, which made it unnecessarily difficult for users to enumerate all their payment channels.
- The `fixCheckThreading` amendment which corrects a minor technical flaw that caused checks to not be properly threaded against the account of the check's recipient.
- Respect the `ssl_verify` configuration option in the `SSLHTTPDownloader` and `HTTPClient` classes.
- Properly update the `server_state` when a server detects a disagreement between itself and the network.
- Allow Ed25519 keys to be used with the `channel_authorize` command.
## Version 1.3.1
The `rippled` 1.3.1 release improves the built-in deadlock detection code, improves logging during process startup, changes the package build pipeline and improves the build documentation.
**New and Updated Features**
This release has no new features.
**Bug Fixes**
- Add a LogicError when a deadlock is detected (355a7b04)
- Improve logging during process startup (7c24f7b1)
## Version 1.3.0
The `rippled` 1.3.0 release introduces several new features and overall improvements to the codebase, including the `fixMasterKeyAsRegularKey` amendment, code to adjust the timing of the consensus process and support for decentralized validator domain verification. The release also includes miscellaneous improvements including in the transaction censorship detection code, transaction validation code, manifest parsing code, configuration file parsing code, log file rotation code, and in the build, continuous integration, testing and package building pipelines.
@@ -24,7 +82,7 @@ The `rippled` 1.3.0 release introduces several new features and overall improvem
**Bug Fixes**
- Improve ledger trie ancestry tracking to reduce unnecessary error messages.
- More efficient detection of dry paths in the payment engine. Although not a transaction-breaking change, this should reduces spurious error messages in the log files.
- More efficient detection of dry paths in the payment engine. Although not a transaction-breaking change, this should reduce spurious error messages in the log files.
## Version 1.2.4


@@ -1,12 +1,15 @@
# Set environment variables.
environment:
# We bundle up protoc.exe and only the parts of boost and openssl we need so
# We bundle up only the parts of boost and openssl we need so
# that it's a small download. We also use appveyor's free cache, avoiding fees
# downloading from S3 each time.
# TODO: script to create this package.
RIPPLED_DEPS_PATH: rippled_deps17.04
RIPPLED_DEPS_URL: https://ripple.github.io/Downloads/appveyor/%RIPPLED_DEPS_PATH%.zip
RIPPLED_DEPS_PATH: rippled_deps17.05
RIPPLED_DEPS_BASE_URL: https://ripple.github.io/Downloads/appveyor
RIPPLED_OPENSSL: rippled_deps.openssl.1.0.2j.zip
RIPPLED_BOOST: rippled_deps.boost.1.70.zip
RIPPLED_BOOST_STAGE: rippled_deps.boost.stage.1.70.zip
# CMake honors these environment variables, setting the include/lib paths.
BOOST_ROOT: C:/%RIPPLED_DEPS_PATH%/boost
@@ -36,25 +39,34 @@ cache:
shallow_clone: true
install:
# We want protoc.exe on PATH.
- SET PATH=C:/%RIPPLED_DEPS_PATH%;%PATH%
# Download dependencies if appveyor didn't restore them from the cache.
# Use 7zip to unzip.
- ps: |
if (-not(Test-Path 'C:/$env:RIPPLED_DEPS_PATH')) {
echo "Download from $env:RIPPLED_DEPS_URL"
Start-FileDownload "$env:RIPPLED_DEPS_URL"
7z x "$($env:RIPPLED_DEPS_PATH).zip" -oC:\ -y > $null
if ($LastExitCode -ne 0) { throw "7z failed" }
if (-not(Test-Path "C:/$env:RIPPLED_DEPS_PATH")) {
$files = @(
"$env:RIPPLED_BOOST",
"$env:RIPPLED_BOOST_STAGE",
"$env:RIPPLED_OPENSSL"
)
For ($i=0; $i -lt $files.Length; $i++) {
$file = $files[$i]
$url = "$env:RIPPLED_DEPS_BASE_URL/$file"
echo "Download $file from $url"
Start-FileDownload "$url"
7z x "$file" -o"C:\$env:RIPPLED_DEPS_PATH" -y > $null
if ($LastExitCode -ne 0) { throw "7z failed" }
}
}
else
{
"Dependencies are in cache"
ls "C:/$env:RIPPLED_DEPS_PATH"
}
# Newer DEPS include a versions file.
# Dump it so we can verify correct behavior.
- ps: |
if (Test-Path "C:/$env:RIPPLED_DEPS_PATH/versions.txt") {
cat "C:/$env:RIPPLED_DEPS_PATH/versions.txt"
}
cat "C:/$env:RIPPLED_DEPS_PATH/version*.txt"
# TODO: This is giving me grief
# artifacts:


@@ -14,6 +14,9 @@ BUILD_TYPE=${BUILD_TYPE:-Debug}
# the default by setting `$CMAKE_ARGS` to the empty string.
CMAKE_ARGS=${CMAKE_ARGS-'-Dwerr=ON'}
# https://gitlab.kitware.com/cmake/cmake/issues/18865
CMAKE_ARGS="-DBoost_NO_BOOST_CMAKE=ON ${CMAKE_ARGS}"
if [[ ${COMPILER} == 'gcc' ]]; then
export CC='gcc'
export CXX='g++'


@@ -1,18 +1,18 @@
#!/bin/bash -u
# We use set -e and bash with -u to bail on first non zero exit code of any
# processes launched or upon any unbound variable.
# We use set -x to print commands before running them to help with
# debugging.
#!/usr/bin/env bash
set -ex
function version_ge() { test "$(echo "$@" | tr " " "\n" | sort -rV | head -n 1)" == "$1"; }
__dirname=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
echo "using CC: ${CC}"
"${CC}" --version
export CC
COMPNAME=$(basename $CC)
echo "using CXX: ${CXX:-notset}"
if [[ $CXX ]]; then
"${CXX}" --version
export CXX
"${CXX}" --version
export CXX
fi
: ${BUILD_TYPE:=Debug}
echo "BUILD TYPE: ${BUILD_TYPE}"
@@ -20,150 +20,231 @@ echo "BUILD TYPE: ${BUILD_TYPE}"
: ${TARGET:=install}
echo "BUILD TARGET: ${TARGET}"
# Ensure APP defaults to rippled if it's not set.
: ${APP:=rippled}
echo "using APP: ${APP}"
JOBS=${NUM_PROCESSORS:-2}
if [[ ${TRAVIS:-false} != "true" ]]; then
JOBS=$((JOBS+1))
JOBS=$((JOBS+1))
fi
if [[ ! -z "${CMAKE_EXE:-}" ]] ; then
export PATH="$(dirname ${CMAKE_EXE}):$PATH"
fi
if [ -x /usr/bin/time ] ; then
: ${TIME:="Duration: %E"}
export TIME
time=/usr/bin/time
: ${TIME:="Duration: %E"}
export TIME
time=/usr/bin/time
else
time=
time=
fi
if [[ -z "${MAX_TIME:-}" ]] ; then
timeout_cmd=""
else
timeout_cmd="timeout ${MAX_TIME}"
fi
echo "cmake building ${APP}"
echo "Building rippled"
: ${CMAKE_EXTRA_ARGS:=""}
if [[ ${NINJA_BUILD:-} == true ]]; then
CMAKE_EXTRA_ARGS+=" -G Ninja"
CMAKE_EXTRA_ARGS+=" -G Ninja"
fi
coverage=false
if [[ "${TARGET}" == "coverage_report" ]] ; then
echo "coverage option detected."
coverage=true
export PATH=$PATH:${LCOV_ROOT}/usr/bin
fi
cmake --version
CMAKE_VER=$(cmake --version | cut -d " " -f 3 | head -1)
#
# allow explicit setting of the name of the build
# dir, otherwise default to the compiler.build_type
#
: "${BUILD_DIR:=${COMPNAME}.${BUILD_TYPE}}"
BUILDARGS=" -j${JOBS}"
if [[ ${VERBOSE_BUILD:-} == true ]]; then
CMAKE_EXTRA_ARGS+=" -DCMAKE_VERBOSE_MAKEFILE=ON"
# TODO: if we use a different generator, this
# option to build verbose would need to change:
if [[ ${NINJA_BUILD:-} == true ]]; then
BUILDARGS+=" -v"
else
BUILDARGS+=" verbose=1"
fi
BUILDARGS="--target ${TARGET}"
BUILDTOOLARGS=""
if version_ge $CMAKE_VER "3.12.0" ; then
BUILDARGS+=" --parallel"
fi
if [[ ${NINJA_BUILD:-} == false ]]; then
if version_ge $CMAKE_VER "3.12.0" ; then
BUILDARGS+=" ${JOBS}"
else
BUILDTOOLARGS+=" -j ${JOBS}"
fi
fi
if [[ ${VERBOSE_BUILD:-} == true ]]; then
CMAKE_EXTRA_ARGS+=" -DCMAKE_VERBOSE_MAKEFILE=ON"
if version_ge $CMAKE_VER "3.14.0" ; then
BUILDARGS+=" --verbose"
else
if [[ ${NINJA_BUILD:-} == false ]]; then
BUILDTOOLARGS+=" verbose=1"
else
BUILDTOOLARGS+=" -v"
fi
fi
fi
if [[ ${USE_CCACHE:-} == true ]]; then
echo "using ccache with basedir [${CCACHE_BASEDIR:-}]"
CMAKE_EXTRA_ARGS+=" -DCMAKE_C_COMPILER_LAUNCHER=ccache -DCMAKE_CXX_COMPILER_LAUNCHER=ccache"
echo "using ccache with basedir [${CCACHE_BASEDIR:-}]"
CMAKE_EXTRA_ARGS+=" -DCMAKE_C_COMPILER_LAUNCHER=ccache -DCMAKE_CXX_COMPILER_LAUNCHER=ccache"
fi
if [ -d "build/${BUILD_DIR}" ]; then
rm -rf "build/${BUILD_DIR}"
rm -rf "build/${BUILD_DIR}"
fi
mkdir -p "build/${BUILD_DIR}"
pushd "build/${BUILD_DIR}"
# generate
${time} cmake ../.. -DCMAKE_BUILD_TYPE=${BUILD_TYPE} ${CMAKE_EXTRA_ARGS}
# build
export DESTDIR=$(pwd)/_INSTALLED_
time ${timeout_cmd} cmake --build . --target ${TARGET} -- $BUILDARGS
${time} eval cmake --build . ${BUILDARGS} -- ${BUILDTOOLARGS}
if [[ ${TARGET} == "docs" ]]; then
## mimic the standard test output for docs build
## to make controlling processes like jenkins happy
if [ -f html_doc/index.html ]; then
echo "1 case, 1 test total, 0 failures"
else
echo "1 case, 1 test total, 1 failures"
fi
exit
## mimic the standard test output for docs build
## to make controlling processes like jenkins happy
if [ -f html_doc/index.html ]; then
echo "1 case, 1 test total, 0 failures"
else
echo "1 case, 1 test total, 1 failures"
fi
exit
fi
popd
export APP_PATH="$PWD/build/${BUILD_DIR}/${APP}"
if [[ "${TARGET}" == "validator-keys" ]] ; then
export APP_PATH="$PWD/build/${BUILD_DIR}/validator-keys/validator-keys"
else
export APP_PATH="$PWD/build/${BUILD_DIR}/rippled"
fi
echo "using APP_PATH: ${APP_PATH}"
# See what we've actually built
ldd ${APP_PATH}
function join_by { local IFS="$1"; shift; echo "$*"; }
# This is a list of manual tests
# in rippled that we want to run
declare -a manual_tests=(
"beast.chrono.abstract_clock"
"beast.unit_test.print"
"ripple.NodeStore.Timing"
"ripple.app.Flow_manual"
"ripple.app.NoRippleCheckLimits"
"ripple.app.PayStrandAllPairs"
"ripple.consensus.ByzantineFailureSim"
"ripple.consensus.DistributedValidators"
"ripple.consensus.ScaleFreeSim"
"ripple.ripple_data.digest"
"ripple.tx.CrossingLimits"
"ripple.tx.FindOversizeCross"
"ripple.tx.Offer_manual"
"ripple.tx.OversizeMeta"
"ripple.tx.PlumpBook"
)
: ${APP_ARGS:=}
if [[ ${APP} == "rippled" ]]; then
if [[ ${MANUAL_TESTS:-} == true ]]; then
APP_ARGS+=" --unittest=$(join_by , "${manual_tests[@]}")"
else
APP_ARGS+=" --unittest --quiet --unittest-log"
fi
if [[ ${coverage} == false && ${PARALLEL_TESTS:-} == true ]]; then
APP_ARGS+=" --unittest-jobs ${JOBS}"
fi
if [[ "${TARGET}" == "validator-keys" ]] ; then
APP_ARGS="--unittest"
else
function join_by { local IFS="$1"; shift; echo "$*"; }
# This is a list of manual tests
# in rippled that we want to run
# ORDER matters here...sorted in approximately
# descending execution time (longest running tests at top)
declare -a manual_tests=(
'ripple.ripple_data.digest'
'ripple.tx.Offer_manual'
'ripple.app.PayStrandAllPairs'
'ripple.tx.CrossingLimits'
'ripple.tx.PlumpBook'
'ripple.app.Flow_manual'
'ripple.tx.OversizeMeta'
'ripple.consensus.DistributedValidators'
'ripple.app.NoRippleCheckLimits'
'ripple.NodeStore.Timing'
'ripple.consensus.ByzantineFailureSim'
'beast.chrono.abstract_clock'
'beast.unit_test.print'
)
if [[ ${TRAVIS:-false} != "true" ]]; then
# these two tests cause travis CI to run out of memory.
# TODO: investigate possible workarounds.
manual_tests=(
'ripple.consensus.ScaleFreeSim'
'ripple.tx.FindOversizeCross'
"${manual_tests[@]}"
)
fi
if [[ ${MANUAL_TESTS:-} == true ]]; then
APP_ARGS+=" --unittest=$(join_by , "${manual_tests[@]}")"
else
APP_ARGS+=" --unittest --quiet --unittest-log"
fi
if [[ ${coverage} == false && ${PARALLEL_TESTS:-} == true ]]; then
APP_ARGS+=" --unittest-jobs ${JOBS}"
fi
if [[ ${IPV6_TESTS:-} == true ]]; then
APP_ARGS+=" --unittest-ipv6"
fi
fi
if [[ ${coverage} == true ]]; then
# Push the results (lcov.info) to codecov
codecov -X gcov # don't even try and look for .gcov files ;)
find . -name "*.gcda" | xargs rm -f
if [[ ${coverage} == true && $CC =~ ^gcc ]]; then
# Push the results (lcov.info) to codecov
codecov -X gcov # don't even try and look for .gcov files ;)
find . -name "*.gcda" | xargs rm -f
fi
if [[ ${SKIP_TESTS:-} == true ]]; then
echo "skipping tests."
exit
echo "skipping tests."
exit
fi
if [[ ${DEBUGGER:-true} == "true" && -v GDB_ROOT && -x ${GDB_ROOT}/bin/gdb ]]; then
${GDB_ROOT}/bin/gdb -v
# Execute unit tests under gdb, printing a call stack
# if we get a crash.
export APP_ARGS
${timeout_cmd} ${GDB_ROOT}/bin/gdb -return-child-result -quiet -batch \
-ex "set env MALLOC_CHECK_=3" \
-ex "set print thread-events off" \
-ex run \
-ex "thread apply all backtrace full" \
-ex "quit" \
--args ${APP_PATH} ${APP_ARGS}
ulimit -a
corepat=$(cat /proc/sys/kernel/core_pattern)
if [[ ${corepat} =~ ^[:space:]*\| ]] ; then
echo "WARNING: core pattern is piping - can't search for core files"
look_core=false
else
${timeout_cmd} ${APP_PATH} ${APP_ARGS}
look_core=true
coredir=$(dirname ${corepat})
fi
if [[ ${look_core} == true ]]; then
before=$(ls -A1 ${coredir})
fi
set +e
echo "Running tests for ${APP_PATH}"
if [[ ${MANUAL_TESTS:-} == true && ${PARALLEL_TESTS:-} != true ]]; then
for t in "${manual_tests[@]}" ; do
${APP_PATH} --unittest=${t}
TEST_STAT=$?
if [[ $TEST_STAT -ne 0 ]] ; then
break
fi
done
else
${APP_PATH} ${APP_ARGS}
TEST_STAT=$?
fi
set -e
if [[ ${look_core} == true ]]; then
after=$(ls -A1 ${coredir})
oIFS="${IFS}"
IFS=$'\n\r'
found_core=false
for l in $(diff -w --suppress-common-lines <(echo "$before") <(echo "$after")) ; do
if [[ "$l" =~ ^[[:space:]]*\>[[:space:]]*(.+)$ ]] ; then
corefile="${BASH_REMATCH[1]}"
echo "FOUND core dump file at '${coredir}/${corefile}'"
gdb_output=$(/bin/mktemp /tmp/gdb_output_XXXXXXXXXX.txt)
found_core=true
gdb \
-ex "set height 0" \
-ex "set logging file ${gdb_output}" \
-ex "set logging on" \
-ex "print 'ripple::BuildInfo::versionString'" \
-ex "thread apply all backtrace full" \
-ex "info inferiors" \
-ex quit \
"$APP_PATH" \
"${coredir}/${corefile}" &> /dev/null
echo -e "CORE INFO: \n\n $(cat ${gdb_output}) \n\n)"
fi
done
IFS="${oIFS}"
fi
if [[ ${found_core} == true ]]; then
exit -1
else
exit $TEST_STAT
fi


@@ -0,0 +1,36 @@
#!/usr/bin/env bash
# run our build script in a docker container
# using travis-ci hosts
set -eux
function join_by { local IFS="$1"; shift; echo "$*"; }
set +x
echo "VERBOSE_BUILD=true" > /tmp/co.env
matchers=(
'TRAVIS.*' 'CI' 'CC' 'CXX'
'BUILD_TYPE' 'TARGET' 'MAX_TIME'
'CODECOV.+' 'CMAKE.*' '.+_TESTS'
'.+_OPTIONS' 'NINJA.*' 'NUM_.+'
'NIH_.+' 'BOOST.*' '.*CCACHE.*')
matchstring=$(join_by '|' "${matchers[@]}")
echo "MATCHSTRING IS:: $matchstring"
env | grep -E "^(${matchstring})=" >> /tmp/co.env
set -x
# need to eliminate TRAVIS_CMD...don't want to pass it to the container
cat /tmp/co.env | grep -v TRAVIS_CMD > /tmp/co.env.2
mv /tmp/co.env.2 /tmp/co.env
cat /tmp/co.env
mkdir -p -m 0777 ${TRAVIS_BUILD_DIR}/cores
echo "${TRAVIS_BUILD_DIR}/cores/%e.%p" | sudo tee /proc/sys/kernel/core_pattern
docker run \
-t --env-file /tmp/co.env \
-v ${TRAVIS_HOME}:${TRAVIS_HOME} \
-w ${TRAVIS_BUILD_DIR} \
--cap-add SYS_PTRACE \
--ulimit "core=-1" \
$DOCKER_IMAGE \
/bin/bash -c 'if [[ $CC =~ ([[:alpha:]]+)-([[:digit:].]+) ]] ; then sudo update-alternatives --set ${BASH_REMATCH[1]} /usr/bin/$CC; fi; bin/ci/ubuntu/build-and-test.sh'


@@ -1,29 +0,0 @@
#!/bin/bash -u
# Exit if anything fails. Echo commands to aid debugging.
set -ex
# Target working dir - defaults to current dir.
# Can be set from caller, or in the first parameter
TWD=$( cd ${TWD:-${1:-${PWD:-$( pwd )}}}; pwd )
echo "Target path is: $TWD"
# Override gcc version to $GCC_VER.
# Put an appropriate symlink at the front of the path.
mkdir -pv $HOME/bin
for g in gcc g++ gcov gcc-ar gcc-nm gcc-ranlib
do
test -x $( type -p ${g}-$GCC_VER )
ln -sv $(type -p ${g}-$GCC_VER) $HOME/bin/${g}
done
# What versions are we ACTUALLY running?
if [ -x $HOME/bin/g++ ]; then
$HOME/bin/g++ -v
else
g++ -v
fi
pip install --user requests==2.13.0
pip install --user https://github.com/codecov/codecov-python/archive/master.zip
bash bin/sh/install-boost.sh


@@ -0,0 +1,29 @@
#!/usr/bin/env bash
# some cached files create churn, so save them here for
# later restoration before packing the cache
set -eux
pushd ${TRAVIS_HOME}
if [ -f cache_ignore.tar ] ; then
rm -f cache_ignore.tar
fi
if [ -d _cache/nih_c ] ; then
find _cache/nih_c -name "build.ninja" | tar rf cache_ignore.tar --files-from -
find _cache/nih_c -name ".ninja_deps" | tar rf cache_ignore.tar --files-from -
find _cache/nih_c -name ".ninja_log" | tar rf cache_ignore.tar --files-from -
find _cache/nih_c -name "*.log" | tar rf cache_ignore.tar --files-from -
find _cache/nih_c -name "*.tlog" | tar rf cache_ignore.tar --files-from -
# show .a files in the cache, for sanity checking
find _cache/nih_c -name "*.a" -ls
fi
if [ -d _cache/ccache ] ; then
find _cache/ccache -name "stats" | tar rf cache_ignore.tar --files-from -
fi
if [ -f cache_ignore.tar ] ; then
tar -tf cache_ignore.tar
fi
popd


@@ -1,86 +0,0 @@
#!/usr/bin/env bash
rippled_exe=/opt/ripple/bin/rippled
conf_file=/etc/opt/ripple/rippled.cfg
while getopts ":e:c:" opt; do
case $opt in
e)
rippled_exe=${OPTARG}
;;
c)
conf_file=${OPTARG}
;;
\?)
echo "Invalid option: -$OPTARG"
esac
done
tmp_loc=$(mktemp -d --tmpdir ripple_info.XXXX)
cd /tmp
chmod 751 ripple_info.*
cd ~
echo ${tmp_loc}
cleaned_conf=${tmp_loc}/cleaned_rippled_cfg.txt
if [[ -f ${conf_file} ]]
then
db=$(sed -r -e 's/\<s[a-zA-Z0-9]{28}\>/secretsecretsecretsecretmaybe/g' ${conf_file} |\
awk -v OUT_FILE=${cleaned_conf} '
BEGIN {skip=0; db_path="";print > OUT_FILE}
/^\[validation_seed\]/ {skip=1; next}
/^\[node_seed\]/ {skip=1; next}
/^\[validation_manifest\]/ {skip=1; next}
/^\[validator_token\]/ {skip=1; next}
/^\[.*\]/ {skip=0}
skip==1 {next}
save==1 {save=0;db_path=$0}
/^\[database_path\]/ {save=1}
{print >> OUT_FILE}
END {print db_path}
')
fi
echo "database_path: ${db}"
df ${db} > ${tmp_loc}/db_path_df.txt
echo
# Send output from this script to a log file
## this captures any messages
## or errors from the script itself
log_file=${tmp_loc}/get_info.log
exec 3>&1 1>>${log_file} 2>&1
## Send all stdout files to /tmp
if [[ -x ${rippled_exe} ]]
then
pgrep rippled && \
${rippled_exe} --conf ${conf_file} \
-- server_info > ${tmp_loc}/server_info.txt
fi
df -h > ${tmp_loc}/free_disk_space.txt
cat /proc/meminfo > ${tmp_loc}/amount_mem.txt
cat /proc/swaps > ${tmp_loc}/swap_space.txt
ulimit -a > ${tmp_loc}/reported_current_limits.txt
for dev_path in $(df | awk '$1 ~ /^\/dev\// {print $1}'); do
# strip numbers from end and remove '/dev/'
dev=$(basename ${dev_path%%[0-9]})
if [[ "$(cat /sys/block/${dev}/queue/rotational)" = 0 ]]
then
echo "${dev} : SSD" >> ${tmp_loc}/is_ssd.txt
else
echo "${dev} : NO SSD" >> ${tmp_loc}/is_ssd.txt
fi
done
pushd ${tmp_loc}
tar -czvf info-package.tar.gz *.txt *.log
popd
echo "Use the following command on your local machine to download from your rippled instance: scp <remote_rippled_username>@<remote_host>:${tmp_loc}/info-package.tar.gz <path/to/local_machine/directory>"| tee /dev/fd/3

bin/getRippledInfo (new executable file, 143 lines)

@@ -0,0 +1,143 @@
#!/usr/bin/env bash
rippled_exe=/opt/ripple/bin/rippled
conf_file=/etc/opt/ripple/rippled.cfg
while getopts ":e:c:" opt; do
case $opt in
e)
rippled_exe=${OPTARG}
;;
c)
conf_file=${OPTARG}
;;
\?)
echo "Invalid option: -$OPTARG"
exit -1
esac
done
tmp_loc=$(mktemp -d --tmpdir ripple_info.XXXXX)
chmod 751 ${tmp_loc}
awk_prog=${tmp_loc}/cfg.awk
summary_out=${tmp_loc}/rippled_info.md
printf "# rippled report info\n\n> generated at %s\n" "$(date -R)" > ${summary_out}
function log_section {
printf "\n## %s\n" "$*" >> ${summary_out}
while read -r l; do
echo " $l" >> ${summary_out}
done </dev/stdin
}
function join_by {
local IFS="$1"; shift; echo "$*";
}
if [[ -f ${conf_file} ]] ; then
exclude=( ips ips_fixed node_seed validation_seed validator_token )
cleaned_conf=${tmp_loc}/cleaned_rippled_cfg.txt
cat << 'EOP' >> ${awk_prog}
BEGIN {FS="[[:space:]]*=[[:space:]]*"; skip=0; db_path=""; print > OUT_FILE; split(exl,exa,"|")}
/^#/ {next}
save==2 && /^[[:space:]]*$/ {next}
/^\[.+\]$/ {
section=tolower(gensub(/^\[[[:space:]]*([a-zA-Z_]+)[[:space:]]*\]$/, "\\1", "g"))
skip = 0
for (i in exa) {
if (section == exa[i])
skip = 1
}
if (section == "database_path")
save = 1
}
skip==1 {next}
save==2 {save=0; db_path=$0}
save==1 {save=2}
$1 ~ /password/ {$0=$1"=<redacted>"}
{print >> OUT_FILE}
END {print db_path}
EOP
db=$(\
sed -r -e 's/\<s[[:alnum:]]{28}\>/<redactedsecret>/g;s/^[[:space:]]*//;s/[[:space:]]*$//' ${conf_file} |\
awk -v OUT_FILE=${cleaned_conf} -v exl="$(join_by '|' "${exclude[@]}")" -f ${awk_prog})
rm ${awk_prog}
cat ${cleaned_conf} | log_section "cleaned config file"
rm ${cleaned_conf}
echo "${db}" | log_section "database path"
df ${db} | log_section "df: database"
fi
# Send output from this script to a log file
## this captures any messages
## or errors from the script itself
log_file=${tmp_loc}/get_info.log
exec 3>&1 1>>${log_file} 2>&1
## Send all stdout files to /tmp
if [[ -x ${rippled_exe} ]] ; then
pgrep rippled && \
${rippled_exe} --conf ${conf_file} \
-- server_info | log_section "server info"
fi
cat /proc/meminfo | log_section "meminfo"
cat /proc/swaps | log_section "swap space"
ulimit -a | log_section "ulimit"
if command -v lshw >/dev/null 2>&1 ; then
lshw 2>/dev/null | log_section "hardware info"
else
lscpu > ${tmp_loc}/hw_info.txt
hwinfo >> ${tmp_loc}/hw_info.txt
lspci >> ${tmp_loc}/hw_info.txt
lsblk >> ${tmp_loc}/hw_info.txt
cat ${tmp_loc}/hw_info.txt | log_section "hardware info"
rm ${tmp_loc}/hw_info.txt
fi
if command -v iostat >/dev/null 2>&1 ; then
iostat -t -d -x 2 6 | log_section "iostat"
fi
df -h | log_section "free disk space"
drives=($(df | awk '$1 ~ /^\/dev\// {print $1}' | xargs -n 1 basename))
block_devs=($(ls /sys/block/))
for d in "${drives[@]}"; do
for dev in "${block_devs[@]}"; do
#echo "D: [$d], DEV: [$dev]"
if [[ $d =~ $dev ]]; then
# this file (if exists) has 0 for SSD and 1 for HDD
if [[ "$(cat /sys/block/${dev}/queue/rotational 2>/dev/null)" == 0 ]] ; then
echo "${d} : SSD" >> ${tmp_loc}/is_ssd.txt
else
echo "${d} : NO SSD" >> ${tmp_loc}/is_ssd.txt
fi
fi
done
done
if [[ -f ${tmp_loc}/is_ssd.txt ]] ; then
cat ${tmp_loc}/is_ssd.txt | log_section "SSD"
rm ${tmp_loc}/is_ssd.txt
fi
cat ${log_file} | log_section "script log"
cat << MSG | tee /dev/fd/3
####################################################
rippled info has been gathered. Please copy the
contents of ${summary_out}
to a github gist at https://gist.github.com/
PLEASE REVIEW THIS FILE FOR ANY SENSITIVE DATA
BEFORE POSTING! We have tried our best to omit
any sensitive information from this file, but you
should verify before posting.
####################################################
MSG
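
A hedged usage example for the new script: the -e and -c flags parsed by getopts above override the default rippled binary and config locations (the paths shown are simply the script's defaults):

```bash
# Run as a user that can read the config and query the local rippled.
./bin/getRippledInfo -e /opt/ripple/bin/rippled -c /etc/opt/ripple/rippled.cfg
# On completion the script prints the path of the generated rippled_info.md summary.
```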

@@ -1,33 +0,0 @@
#!/bin/sh
# Assumptions:
# 1) BOOST_ROOT and BOOST_URL are already defined,
# and contain valid values.
# 2) The last namepart of BOOST_ROOT matches the
# folder name internal to boost's .tar.gz
# When testing you can force a boost build by clearing travis caches:
# https://travis-ci.org/ripple/rippled/caches
set -e
if [ -x /usr/bin/time ] ; then
: ${TIME:="Duration: %E"}
export TIME
time=/usr/bin/time
else
time=
fi
if [ ! -d "$BOOST_ROOT/lib" ]
then
wget $BOOST_URL -O /tmp/boost.tar.gz
cd `dirname $BOOST_ROOT`
rm -fr ${BOOST_ROOT}
tar xzf /tmp/boost.tar.gz
cd $BOOST_ROOT && \
$time ./bootstrap.sh --prefix=$BOOST_ROOT && \
$time ./b2 cxxflags="-std=c++14" -j$((2*${NUM_PROCESSORS:-2})) &&\
$time ./b2 install
else
echo "Using cached boost at $BOOST_ROOT"
fi

bin/sh/install-vcpkg.sh (new executable file, 51 lines)

@@ -0,0 +1,51 @@
#!/usr/bin/env bash
set -exu
: ${TRAVIS_BUILD_DIR:=""}
: ${VCPKG_DIR:=".vcpkg"}
export VCPKG_ROOT=${VCPKG_DIR}
: ${VCPKG_DEFAULT_TRIPLET:="x64-windows-static"}
export VCPKG_DEFAULT_TRIPLET
EXE="vcpkg"
if [[ -z ${COMSPEC:-} ]]; then
EXE="${EXE}.exe"
fi
if [[ -d "${VCPKG_DIR}" && -x "${VCPKG_DIR}/${EXE}" && -d "${VCPKG_DIR}/installed" ]] ; then
echo "Using cached vcpkg at ${VCPKG_DIR}"
${VCPKG_DIR}/${EXE} list
else
if [[ -d "${VCPKG_DIR}" ]] ; then
rm -rf "${VCPKG_DIR}"
fi
git clone --branch 2019.12 https://github.com/Microsoft/vcpkg.git ${VCPKG_DIR}
pushd ${VCPKG_DIR}
BSARGS=()
if [[ "$(uname)" == "Darwin" ]] ; then
BSARGS+=(--allowAppleClang)
fi
if [[ -z ${COMSPEC:-} ]]; then
chmod +x ./bootstrap-vcpkg.sh
time ./bootstrap-vcpkg.sh "${BSARGS[@]}"
else
time ./bootstrap-vcpkg.bat
fi
popd
fi
# TODO: bring boost in this way as well ?
# NOTE: can pin specific ports to a commit/version like this:
# git checkout <SOME COMMIT HASH> ports/boost
if [ $# -eq 0 ]; then
echo "No extra packages specified..."
PKGS=()
else
PKGS=( "$@" )
fi
for LIB in "${PKGS[@]}"; do
time ${VCPKG_DIR}/${EXE} --clean-after-build install ${LIB}
done
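
A hedged usage sketch: any arguments passed to the script are treated as extra vcpkg ports to install after bootstrapping (the port names below are illustrative, not required by the script):

```bash
# Bootstrap (or reuse the cached) vcpkg, then install two example ports.
bin/sh/install-vcpkg.sh openssl zlib
```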

bin/sh/setup-msvc.sh (new executable file, 46 lines)

@@ -0,0 +1,46 @@
# NOTE: must be sourced from a shell so it can export vars
cat << BATCH > ./getenv.bat
CALL %*
SET
BATCH
while read line ; do
IFS='"' read x path arg <<<"${line}"
if [ -f "${path}" ] ; then
echo "FOUND: $path"
export VCINSTALLDIR=$(./getenv.bat "${path}" ${arg} | grep "^VCINSTALLDIR=" | sed -E "s/^VCINSTALLDIR=//g")
if [ "${VCINSTALLDIR}" != "" ] ; then
echo "USING ${VCINSTALLDIR}"
export LIB=$(./getenv.bat "${path}" ${arg} | grep "^LIB=" | sed -E "s/^LIB=//g")
export LIBPATH=$(./getenv.bat "${path}" ${arg} | grep "^LIBPATH=" | sed -E "s/^LIBPATH=//g")
export INCLUDE=$(./getenv.bat "${path}" ${arg} | grep "^INCLUDE=" | sed -E "s/^INCLUDE=//g")
ADDPATH=$(./getenv.bat "${path}" ${arg} | grep "^PATH=" | sed -E "s/^PATH=//g")
export PATH="${ADDPATH}:${PATH}"
break
fi
fi
done <<EOL
"C:/Program Files (x86)/Microsoft Visual Studio/2019/BuildTools/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64 -vcvars_ver=14.24
"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64 -vcvars_ver=14.24
"C:/Program Files (x86)/Microsoft Visual Studio/2017/BuildTools/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64 -vcvars_ver=14.24
"C:/Program Files (x86)/Microsoft Visual Studio/2017/Community/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64 -vcvars_ver=14.24
"C:/Program Files (x86)/Microsoft Visual Studio/2019/BuildTools/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64
"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64
"C:/Program Files (x86)/Microsoft Visual Studio/2017/BuildTools/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64
"C:/Program Files (x86)/Microsoft Visual Studio/2017/Community/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64
"C:/Program Files (x86)/Microsoft Visual Studio 15.0/VC/vcvarsall.bat" amd64
"C:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/vcvarsall.bat" amd64
"C:/Program Files (x86)/Microsoft Visual Studio 13.0/VC/vcvarsall.bat" amd64
"C:/Program Files (x86)/Microsoft Visual Studio 12.0/VC/vcvarsall.bat" amd64
EOL
# TODO: update the list above as needed to support newer versions of msvc tools
# MSVC 19.25.28610.4 causes rocksdb's compilation to fail; for VS2019 we pin the 14.24 VC tools for now.
# TODO: Delete the lines with -vcvars_ver=14.24 once rocksdb becomes compatible with newer compiler versions.
rm -f getenv.bat
if [ "${VCINSTALLDIR}" = "" ] ; then
echo "No compatible visual studio found!"
fi
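
Because the script exports VCINSTALLDIR, LIB, LIBPATH, INCLUDE and PATH, it has to be sourced rather than executed, as its opening note says. A minimal sketch:

```bash
# Source (not execute) so the exported MSVC variables land in the current shell.
. bin/sh/setup-msvc.sh
echo "Using VC tools from: ${VCINSTALLDIR}"
```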

@@ -1,6 +1,5 @@
#-------------------------------------------------------------------------------
#
# Rippled Server Instance Configuration Example
#
#-------------------------------------------------------------------------------
#
@@ -369,29 +368,26 @@
#
# [ips]
#
# List of hostnames or ips where the Ripple protocol is served. For a starter
# list, you can either copy entries from: https://ripple.com/ripple.txt or if
# you prefer you can specify r.ripple.com 51235
# List of hostnames or ips where the Ripple protocol is served. A default
# starter list is included in the code and used if no other hostnames are
# available.
#
# One IPv4 address or domain names per line is allowed. A port may must be
# specified after adding a space to the address. By convention, if known,
# IPs are listed in from most to least trusted.
# One address or domain name per line is allowed. A port may be
# specified after adding a space to the address. The ordering of entries
# does not generally matter.
#
# The default list of entries is:
# - r.ripple.com 51235
# - zaphod.alloy.ee 51235
# - sahyadri.isrdc.in 51235
#
# Examples:
# 192.168.0.1
# 192.168.0.1 3939
# r.ripple.com 51235
#
# This will give you a good, up-to-date list of addresses:
#
# [ips]
# 192.168.0.1
# 192.168.0.1 2459
# r.ripple.com 51235
#
# The default is:
# [ips_fixed] addresses (if present)
# or
# ( r.ripple.com 51235 , zaphod.alloy.ee 51235 )
#
#
# [ips_fixed]
#
@@ -401,8 +397,8 @@
# Ripple network through a public-facing server, or for building a set
# of cluster peers.
#
# One IPv4 address or domain names per line is allowed. A port must be
# specified after adding a space to the address.
# One address or domain name per line is allowed. A port must be specified
# after adding a space to the address.
#
#
#
@@ -737,6 +733,24 @@
# node is a validator.
#
#
#
# [network_id]
#
# Specify the network which this server is configured to connect to and
# track. If set, the server will not establish connections with servers
# that are explicitly configured to track another network.
#
# Network identifiers are usually unsigned integers in the range 0 to
# 4294967295 inclusive. The server also maps the following well-known
# names to the corresponding numerical identifier:
#
# main -> 0
# testnet -> 1
#
# If this value is not specified the server is not explicitly configured
# to track a particular network.
#
#
#-------------------------------------------------------------------------------
#
# 4. HTTPS Client
@@ -1038,7 +1052,7 @@
# [crawl]
#
# List of options to control what data is reported through the /crawl endpoint
# See https://developers.ripple.com/peer-protocol.html#peer-crawler
# See https://xrpl.org/peer-crawler.html
#
# <flag>
#
@@ -1079,6 +1093,16 @@
# counts = 0
# unl = 1
#
# [vl]
#
# Options to control what data is reported through the /vl endpoint
# See [...]
#
# enable = <flag>
#
# Enable or disable access to /vl requests. Default is '1' which
# enables access.
#
#-------------------------------------------------------------------------------
#
# 9. Example Settings
@@ -1154,6 +1178,10 @@ ip = 127.0.0.1
admin = 127.0.0.1
protocol = ws
#[port_grpc]
#port = 50051
#ip = 0.0.0.0
#[port_ws_public]
#port = 6005
#ip = 127.0.0.1
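
The [network_id] and [vl] stanzas documented above can be exercised with a minimal configuration sketch like the following (values are illustrative; 'main' maps to identifier 0, and the /vl endpoint is already enabled by default):

```
[network_id]
main

[vl]
enable = 1
```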

(binary image file changed, 1.4 KiB; contents not shown)

@@ -320,7 +320,7 @@ void
pipe::stream::
close()
{
std::lock_guard<std::mutex> lock{out_.m};
std::lock_guard lock{out_.m};
out_.eof = true;
if(out_.op)
out_.op.get()->operator()();

@@ -120,7 +120,7 @@ spawn(F0&& f, FN&&... fn)
[&](yield_context yield)
{
f(yield);
std::lock_guard<std::mutex> lock{m_};
std::lock_guard lock{m_};
if(--running_ == 0)
cv_.notify_all();
}

@@ -79,14 +79,6 @@ int main(int ac, char const* av[])
using namespace std;
using namespace beast::unit_test;
#if BOOST_MSVC
{
int flags = _CrtSetDbgFlag(_CRTDBG_REPORT_FLAG);
flags |= _CRTDBG_LEAK_CHECK_DF;
_CrtSetDbgFlag(flags);
}
#endif
namespace po = boost::program_options;
po::options_description desc("Options");
desc.add_options()

@@ -110,7 +110,6 @@ selector::operator()(suite_info const& s)
case all:
default:
// fall through
break;
};

@@ -237,7 +237,7 @@ template<class>
void
runner::testcase(std::string const& name)
{
std::lock_guard<std::recursive_mutex> lock(mutex_);
std::lock_guard lock(mutex_);
// Name may not be empty
BOOST_ASSERT(default_ || ! name.empty());
// Forgot to call pass or fail
@@ -253,7 +253,7 @@ template<class>
void
runner::pass()
{
std::lock_guard<std::recursive_mutex> lock(mutex_);
std::lock_guard lock(mutex_);
if(default_)
testcase("");
on_pass();
@@ -264,7 +264,7 @@ template<class>
void
runner::fail(std::string const& reason)
{
std::lock_guard<std::recursive_mutex> lock(mutex_);
std::lock_guard lock(mutex_);
if(default_)
testcase("");
on_fail(reason);
@@ -276,7 +276,7 @@ template<class>
void
runner::log(std::string const& s)
{
std::lock_guard<std::recursive_mutex> lock(mutex_);
std::lock_guard lock(mutex_);
if(default_)
testcase("");
on_log(s);
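
The recurring change from std::lock_guard<std::recursive_mutex> (or <std::mutex>) to plain std::lock_guard in these hunks relies on C++17 class template argument deduction. A standalone sketch, not rippled code:

```cpp
#include <mutex>

std::recursive_mutex mtx;
int counter = 0;

void bump()
{
    // C++17 deduces std::lock_guard<std::recursive_mutex> from the argument,
    // so the explicit template parameter used before this change is redundant.
    std::lock_guard lock(mtx);
    ++counter;
}
```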

@@ -32,7 +32,6 @@
#include <ripple/app/misc/TxQ.h>
#include <ripple/app/misc/ValidatorKeys.h>
#include <ripple/app/misc/ValidatorList.h>
#include <ripple/basics/make_lock.h>
#include <ripple/beast/core/LexicalCast.h>
#include <ripple/consensus/LedgerTiming.h>
#include <ripple/nodestore/DatabaseShard.h>
@@ -40,7 +39,9 @@
#include <ripple/overlay/predicates.h>
#include <ripple/protocol/Feature.h>
#include <ripple/protocol/digest.h>
#include <algorithm>
#include <mutex>
namespace ripple {
@@ -292,7 +293,7 @@ RCLConsensus::Adaptor::onClose(
auto initialLedger = app_.openLedger().current();
auto initialSet = std::make_shared<SHAMap>(
SHAMapType::TRANSACTION, app_.family(), SHAMap::version{1});
SHAMapType::TRANSACTION, app_.family());
initialSet->setUnbacked();
// Build SHAMap containing all transactions in our open ledger
@@ -551,22 +552,23 @@ RCLConsensus::Adaptor::doAccept(
// in the previous consensus round.
//
bool anyDisputes = false;
for (auto& it : result.disputes)
for (auto const& [_, dispute] : result.disputes)
{
if (!it.second.getOurVote())
(void)_;
if (!dispute.getOurVote())
{
// we voted NO
try
{
JLOG(j_.debug())
<< "Test applying disputed transaction that did"
<< " not get in " << it.second.tx().id();
<< " not get in " << dispute.tx().id();
SerialIter sit(it.second.tx().tx_.slice());
SerialIter sit(dispute.tx().tx_.slice());
auto txn = std::make_shared<STTx const>(sit);
// Disputed pseudo-transactions that were not accepted
// can't be succesfully applied in the next ledger
// can't be successfully applied in the next ledger
if (isPseudoTx(*txn))
continue;
@@ -583,8 +585,8 @@ RCLConsensus::Adaptor::doAccept(
}
// Build new open ledger
auto lock = make_lock(app_.getMasterMutex(), std::defer_lock);
auto sl = make_lock(ledgerMaster_.peekMutex(), std::defer_lock);
std::unique_lock lock{app_.getMasterMutex(), std::defer_lock};
std::unique_lock sl{ledgerMaster_.peekMutex(), std::defer_lock};
std::lock(lock, sl);
auto const lastVal = ledgerMaster_.getValidatedLedger();
@@ -637,16 +639,14 @@ RCLConsensus::Adaptor::doAccept(
std::chrono::duration_cast<usec64_t>(closeTime.time_since_epoch());
int closeCount = 1;
for (auto const& p : rawCloseTimes.peers)
for (auto const& [t, v] : rawCloseTimes.peers)
{
// FIXME: Use median, not average
JLOG(j_.info())
<< std::to_string(p.second) << " time votes for "
<< std::to_string(p.first.time_since_epoch().count());
closeCount += p.second;
closeTotal += std::chrono::duration_cast<usec64_t>(
p.first.time_since_epoch()) *
p.second;
<< std::to_string(v) << " time votes for "
<< std::to_string(t.time_since_epoch().count());
closeCount += v;
closeTotal +=
std::chrono::duration_cast<usec64_t>(t.time_since_epoch()) * v;
}
closeTotal += usec64_t(closeCount / 2); // for round to nearest
@@ -812,7 +812,7 @@ RCLConsensus::getJson(bool full) const
{
Json::Value ret;
{
ScopedLockType _{mutex_};
std::lock_guard _{mutex_};
ret = consensus_.getJson(full);
}
ret["validating"] = adaptor_.validating();
@@ -824,7 +824,7 @@ RCLConsensus::timerEntry(NetClock::time_point const& now)
{
try
{
ScopedLockType _{mutex_};
std::lock_guard _{mutex_};
consensus_.timerEntry(now);
}
catch (SHAMapMissingNode const& mn)
@@ -840,7 +840,7 @@ RCLConsensus::gotTxSet(NetClock::time_point const& now, RCLTxSet const& txSet)
{
try
{
ScopedLockType _{mutex_};
std::lock_guard _{mutex_};
consensus_.gotTxSet(now, txSet);
}
catch (SHAMapMissingNode const& mn)
@@ -859,7 +859,7 @@ RCLConsensus::simulate(
NetClock::time_point const& now,
boost::optional<std::chrono::milliseconds> consensusDelay)
{
ScopedLockType _{mutex_};
std::lock_guard _{mutex_};
consensus_.simulate(now, consensusDelay);
}
@@ -868,7 +868,7 @@ RCLConsensus::peerProposal(
NetClock::time_point const& now,
RCLCxPeerPos const& newProposal)
{
ScopedLockType _{mutex_};
std::lock_guard _{mutex_};
return consensus_.peerProposal(now, newProposal);
}
@@ -895,7 +895,8 @@ RCLConsensus::Adaptor::preStartRound(RCLCxLedger const & prevLgr)
}
}
const bool synced = app_.getOPs().getOperatingMode() == NetworkOPs::omFULL;
const bool synced = app_.getOPs().getOperatingMode() ==
OperatingMode::FULL;
if (validating_)
{
@@ -950,6 +951,13 @@ RCLConsensus::Adaptor::validator() const
return !valPublic_.empty();
}
void
RCLConsensus::Adaptor::updateOperatingMode(std::size_t const positions) const
{
if (! positions && app_.getOPs().isFull())
app_.getOPs().setMode(OperatingMode::CONNECTED);
}
void
RCLConsensus::startRound(
NetClock::time_point const& now,
@@ -957,7 +965,7 @@ RCLConsensus::startRound(
RCLCxLedger const& prevLgr,
hash_set<NodeID> const& nowUntrusted)
{
ScopedLockType _{mutex_};
std::lock_guard _{mutex_};
consensus_.startRound(
now, prevLgrId, prevLgr, nowUntrusted, adaptor_.preStartRound(prevLgr));
}
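
Several loops in the hunk above move from pair accessors (it.first / it.second) to C++17 structured bindings. A minimal standalone illustration, not rippled code:

```cpp
#include <iostream>
#include <map>
#include <string>

int main()
{
    std::map<std::string, int> closeTimeVotes{{"t1", 3}, {"t2", 5}};

    // Structured bindings name the key and value directly, replacing the
    // older p.first / p.second spelling used before this change.
    for (auto const& [time, votes] : closeTimeVotes)
        std::cout << votes << " time votes for " << time << '\n';
}
```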

@@ -59,7 +59,7 @@ class RCLConsensus
LedgerMaster& ledgerMaster_;
LocalTxs& localTxs_;
InboundTransactions& inboundTransactions_;
beast::Journal j_;
beast::Journal const j_;
NodeID const nodeID_;
PublicKey const valPublic_;
@@ -152,6 +152,15 @@ class RCLConsensus
bool
validator() const;
/** Update operating mode based on current peer positions.
*
* If our current ledger has no agreement from the network,
* then we cannot be in the omFULL mode.
*
* @param positions Number of current peer positions.
*/
void updateOperatingMode(std::size_t const positions) const;
/** Consensus simulation parameters
*/
ConsensusParms const&
@@ -164,7 +173,7 @@ class RCLConsensus
//---------------------------------------------------------------------
// The following members implement the generic Consensus requirements
// and are marked private to indicate ONLY Consensus<Adaptor> will call
// them (via friendship). Since they are called only from Consenus<Adaptor>
// them (via friendship). Since they are called only from Consensus<Adaptor>
// methods and since RCLConsensus::consensus_ should only be accessed
// under lock, these will only be called under lock.
//
@@ -216,7 +225,7 @@ class RCLConsensus
bool
hasOpenTransactions() const;
/** Number of proposers that have vallidated the given ledger
/** Number of proposers that have validated the given ledger
@param h The hash of the ledger of interest
@return the number of proposers that validated a ledger
@@ -328,7 +337,7 @@ class RCLConsensus
@param ne Event type for notification
@param ledger The ledger at the time of the state change
@param haveCorrectLCL Whether we believ we have the correct LCL.
@param haveCorrectLCL Whether we believe we have the correct LCL.
*/
void
notify(
@@ -365,7 +374,7 @@ class RCLConsensus
@param closeTimeCorrect Whether consensus agreed on close time
@param closeResolution Resolution used to determine consensus close
time
@param roundTime Duration of this consensus rorund
@param roundTime Duration of this consensus round
@param failedTxs Populate with transactions that we could not
successfully apply.
@return The newly built ledger
@@ -380,7 +389,7 @@ class RCLConsensus
std::chrono::milliseconds roundTime,
std::set<TxID>& failedTxs);
/** Validate the given ledger and share with peers as necessary
/** Validate the given ledger and share with peers as necessary
@param ledger The ledger to validate
@param txns The consensus transaction set
@@ -446,6 +455,12 @@ public:
return adaptor_.mode();
}
ConsensusPhase
phase() const
{
return consensus_.phase();
}
//! @see Consensus::getJson
Json::Value
getJson(bool full) const;
@@ -470,7 +485,7 @@ public:
RCLCxLedger::ID
prevLedgerID() const
{
ScopedLockType _{mutex_};
std::lock_guard _{mutex_};
return consensus_.prevLedgerID();
}
@@ -497,11 +512,10 @@ private:
// guards all calls to consensus_. adaptor_ uses atomics internally
// to allow concurrent access of its data members that have getters.
mutable std::recursive_mutex mutex_;
using ScopedLockType = std::lock_guard <std::recursive_mutex>;
Adaptor adaptor_;
Consensus<Adaptor> consensus_;
beast::Journal j_;
beast::Journal const j_;
};
}

@@ -115,7 +115,7 @@ public:
assert(map_);
}
/** Constructor from a previosly created MutableTxSet
/** Constructor from a previously created MutableTxSet
@param m MutableTxSet that will become fixed
*/
@@ -142,8 +142,8 @@ public:
@note Since find may not succeed, this returns a
`std::shared_ptr<const SHAMapItem>` rather than a Tx, which
cannot refer to a missing transaction. The generic consensus
code use the shared_ptr semantics to know whether the find
was succesfully and properly creates a Tx as needed.
code uses the shared_ptr semantics to know whether the find
was successful and properly creates a Tx as needed.
*/
std::shared_ptr<const SHAMapItem> const&
find(Tx::ID const& entry) const
@@ -176,13 +176,13 @@ public:
map_->compare(*(j.map_), delta, 65536);
std::map<uint256, bool> ret;
for (auto const& item : delta)
for (auto const& [k, v] : delta)
{
assert(
(item.second.first && !item.second.second) ||
(item.second.second && !item.second.first));
(v.first && !v.second) ||
(v.second && !v.first));
ret[item.first] = static_cast<bool>(item.second.first);
ret[k] = static_cast<bool>(v.first);
}
return ret;
}

@@ -114,7 +114,6 @@ mismatch(RCLValidatedLedger const& a, RCLValidatedLedger const& b)
RCLValidationsAdaptor::RCLValidationsAdaptor(Application& app, beast::Journal j)
: app_(app), j_(j)
{
staleValidations_.reserve(512);
}
NetClock::time_point
@@ -148,131 +147,6 @@ RCLValidationsAdaptor::acquire(LedgerHash const & hash)
return RCLValidatedLedger(std::move(ledger), j_);
}
void
RCLValidationsAdaptor::onStale(RCLValidation&& v)
{
// Store the newly stale validation; do not do significant work in this
// function since this is a callback from Validations, which may be
// doing other work.
ScopedLockType sl(staleLock_);
staleValidations_.emplace_back(std::move(v));
if (staleWriting_)
return;
// addJob() may return false (Job not added) at shutdown.
staleWriting_ = app_.getJobQueue().addJob(
jtWRITE, "Validations::doStaleWrite", [this](Job&) {
auto event =
app_.getJobQueue().makeLoadEvent(jtDISK, "ValidationWrite");
ScopedLockType sl(staleLock_);
doStaleWrite(sl);
});
}
void
RCLValidationsAdaptor::flush(hash_map<NodeID, RCLValidation>&& remaining)
{
bool anyNew = false;
{
ScopedLockType sl(staleLock_);
for (auto const& keyVal : remaining)
{
staleValidations_.emplace_back(std::move(keyVal.second));
anyNew = true;
}
// If we have new validations to write and there isn't a write in
// progress already, then write to the database synchronously.
if (anyNew && !staleWriting_)
{
staleWriting_ = true;
doStaleWrite(sl);
}
// In the case when a prior asynchronous doStaleWrite was scheduled,
// this loop will block until all validations have been flushed.
// This ensures that all validations are written upon return from
// this function.
while (staleWriting_)
{
ScopedUnlockType sul(staleLock_);
std::this_thread::sleep_for(std::chrono::milliseconds(100));
}
}
}
// NOTE: doStaleWrite() must be called with staleLock_ *locked*. The passed
// ScopedLockType& acts as a reminder to future maintainers.
void
RCLValidationsAdaptor::doStaleWrite(ScopedLockType&)
{
static const std::string insVal(
"INSERT INTO Validations "
"(InitialSeq, LedgerSeq, LedgerHash,NodePubKey,SignTime,RawData) "
"VALUES (:initialSeq, :ledgerSeq, "
":ledgerHash,:nodePubKey,:signTime,:rawData);");
static const std::string findSeq(
"SELECT LedgerSeq FROM Ledgers WHERE Ledgerhash=:ledgerHash;");
assert(staleWriting_);
while (!staleValidations_.empty())
{
std::vector<RCLValidation> currentStale;
currentStale.reserve(512);
staleValidations_.swap(currentStale);
{
ScopedUnlockType sul(staleLock_);
{
auto db = app_.getLedgerDB().checkoutDb();
Serializer s(1024);
soci::transaction tr(*db);
for (RCLValidation const& wValidation : currentStale)
{
// Only save full validations until we update the schema
if(!wValidation.full())
continue;
s.erase();
STValidation::pointer const& val = wValidation.unwrap();
val->add(s);
auto const ledgerHash = to_string(val->getLedgerHash());
boost::optional<std::uint64_t> ledgerSeq;
*db << findSeq, soci::use(ledgerHash),
soci::into(ledgerSeq);
auto const initialSeq = ledgerSeq.value_or(
app_.getLedgerMaster().getCurrentLedgerIndex());
auto const nodePubKey = toBase58(
TokenType::NodePublic, val->getSignerPublic());
auto const signTime =
val->getSignTime().time_since_epoch().count();
soci::blob rawData(*db);
rawData.append(
reinterpret_cast<const char*>(s.peekData().data()),
s.peekData().size());
assert(rawData.get_len() == s.peekData().size());
*db << insVal, soci::use(initialSeq), soci::use(ledgerSeq),
soci::use(ledgerHash), soci::use(nodePubKey),
soci::use(signTime), soci::use(rawData);
}
tr.commit();
}
}
}
staleWriting_ = false;
}
bool
handleNewValidation(Application& app,
STValidation::ref val,
@@ -293,7 +167,7 @@ handleNewValidation(Application& app,
bool shouldRelay = false;
RCLValidations& validations = app.getValidations();
beast::Journal j = validations.adaptor().journal();
beast::Journal const j = validations.adaptor().journal();
auto dmp = [&](beast::Journal::Stream s, std::string const& msg) {
s << "Val for " << hash
@@ -324,7 +198,7 @@ handleNewValidation(Application& app,
{
auto const seq = val->getFieldU32(sfLedgerSequence);
dmp(j.warn(),
"already validated sequence at or past " + to_string(seq));
"already validated sequence at or past " + std::to_string(seq));
}
if (val->isTrusted() && outcome == ValStatus::current)

@@ -21,7 +21,6 @@
#define RIPPLE_APP_CONSENSUSS_VALIDATIONS_H_INCLUDED
#include <ripple/app/ledger/Ledger.h>
#include <ripple/basics/ScopedLock.h>
#include <ripple/consensus/Validations.h>
#include <ripple/protocol/Protocol.h>
#include <ripple/protocol/RippleLedgerHash.h>
@@ -211,24 +210,6 @@ public:
NetClock::time_point
now() const;
/** Handle a newly stale validation.
@param v The newly stale validation
@warning This should do minimal work, as it is expected to be called
by the generic Validations code while it may be holding an
internal lock
*/
void
onStale(RCLValidation&& v);
/** Flush current validations to disk before shutdown.
@param remaining The remaining validations to flush
*/
void
flush(hash_map<NodeID, RCLValidation>&& remaining);
/** Attempt to acquire the ledger with given id from the network */
boost::optional<RCLValidatedLedger>
acquire(LedgerHash const & id);
@@ -240,22 +221,8 @@ public:
}
private:
using ScopedLockType = std::lock_guard<Mutex>;
using ScopedUnlockType = GenericScopedUnlock<Mutex>;
Application& app_;
beast::Journal j_;
// Lock for managing staleValidations_ and writing_
std::mutex staleLock_;
std::vector<RCLValidation> staleValidations_;
bool staleWriting_ = false;
// Write the stale validations to sqlite DB, the scoped lock argument
// is used to remind callers that the staleLock_ must be *locked* prior
// to making the call
void
doStaleWrite(ScopedLockType&);
};
/// Alias for RCL-specific instantiation of generic Validations

Some files were not shown because too many files have changed in this diff.