Compare commits

...

154 Commits

Author SHA1 Message Date
yinyiqian1
2f25cb60c7 Merge branch 'develop' into account_mpts 2025-10-28 11:23:32 -04:00
Ayaz Salikhov
8c8a7ff3b8 ci: Use XRPLF/get-nproc (#2727) 2025-10-27 18:50:40 +00:00
Ayaz Salikhov
16493abd0d ci: Better pre-commit failure message (#2720) 2025-10-27 12:14:56 +00:00
dependabot[bot]
3dd72d94e1 ci: [DEPENDABOT] bump actions/download-artifact from 5.0.0 to 6.0.0 (#2723)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-27 10:17:41 +00:00
dependabot[bot]
5e914abf29 ci: [DEPENDABOT] bump actions/upload-artifact from 4.6.2 to 5.0.0 in /.github/actions/code-coverage (#2725)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-27 10:16:14 +00:00
dependabot[bot]
9603968808 ci: [DEPENDABOT] bump actions/upload-artifact from 4.6.2 to 5.0.0 (#2722)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-27 10:15:39 +00:00
Ayaz Salikhov
0124c06a53 ci: Enable clang asan builds (#2717) 2025-10-24 13:36:14 +01:00
Ayaz Salikhov
890279b15e Merge branch 'develop' into account_mpts 2025-10-24 08:53:50 +01:00
yinyiqian1
022f2a44b4 resolve new comments 2025-10-23 23:55:14 -04:00
Ayaz Salikhov
1bfdd0dd89 ci: Save full logs for failed sanitized tests (#2715) 2025-10-23 17:06:02 +01:00
Alex Kremer
f41d574204 fix: Flaky DeadlineIsHandledCorrectly (#2716) 2025-10-23 16:33:58 +01:00
Ayaz Salikhov
d0ec60381b ci: Use intermediate environment variables for improved security (#2713) 2025-10-23 11:34:53 +01:00
Ayaz Salikhov
0b19a42a96 ci: Pin all GitHub actions (#2712) 2025-10-22 16:17:15 +01:00
emrearıyürek
030f4f1b22 docs: Remove logging.md from readme (#2710) 2025-10-22 11:36:49 +01:00
yinyiqian1
187dd4ae90 fix doxygen 2025-10-21 10:35:14 -04:00
yinyiqian1
2c3e4dec74 Merge branch 'develop' into account_mpts 2025-10-20 21:13:29 -04:00
github-actions[bot]
2de49b4d33 style: clang-tidy auto fixes (#2706) 2025-10-20 10:41:59 +01:00
Ayaz Salikhov
3de2bf2910 ci: Update pre-commit workflow to latest version (#2702) 2025-10-18 07:55:14 +01:00
yinyiqian1
f2a40726c7 Merge branch 'develop' into account_mpts 2025-10-17 15:48:22 -04:00
Peter Chen
7538efb01e fix: Add mpt_issuance_id to meta of MPTIssuanceCreate (#2701)
fixes #2332
2025-10-17 09:58:38 -04:00
yinyiqian1
e8a64d838c Merge branch 'develop' into account_mpts 2025-10-16 14:27:13 -04:00
Alex Kremer
685f611434 chore: Disable flaky DisconnectClientOnInactivity (#2699) 2025-10-16 18:55:58 +01:00
yinyiqian1
c351ace6ec Merge branch 'develop' into account_mpts 2025-10-16 12:16:16 -04:00
Ayaz Salikhov
2528dee6b6 chore: Use pre-commit image with libatomic (#2698) 2025-10-16 13:03:52 +01:00
Ayaz Salikhov
b2be4b51d1 ci: Add libatomic as dependency for pre-commit image (#2697) 2025-10-16 12:02:14 +01:00
yinyiqian1
05dd0b3412 Merge branch 'develop' into account_mpts 2025-10-15 13:49:34 -04:00
yinyiqian1
685fab5d17 resolve comments 2025-10-15 13:18:09 -04:00
Alex Kremer
b4e40558c9 fix: Address AmendmentBlockHandler flakiness in old ETL tests (#2694)
Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-10-15 17:15:12 +01:00
emrearıyürek
b361e3a108 feat: Support new types in ledger_entry (#2654)
Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-10-14 17:37:14 +01:00
Ayaz Salikhov
a4b47da57a docs: Update doxygen-awesome-css to 2.4.1 (#2690) 2025-10-13 18:55:21 +01:00
github-actions[bot]
2ed1a45ef1 style: clang-tidy auto fixes (#2688)
Co-authored-by: godexsoft <385326+godexsoft@users.noreply.github.com>
2025-10-10 10:45:47 +01:00
Ayaz Salikhov
dabaa5bf80 fix: Drop dynamic loggers to fix memory leak (#2686) 2025-10-09 16:51:55 +01:00
Ayaz Salikhov
b4fb3e42b8 chore: Publish RCs as non-draft (#2685) 2025-10-09 14:48:29 +01:00
Peter Chen
aa64bb7b6b refactor: Keyspace comments (#2684)
Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-10-08 19:58:05 +01:00
Ayaz Salikhov
79fc0b10c3 Merge branch 'develop' into account_mpts 2025-10-08 17:46:31 +01:00
rrmanukyan
dc5f8b9c23 fix: Add gRPC Timeout and keepalive to handle stuck connections (#2676) 2025-10-08 13:50:11 +01:00
yinyiqian1
676f7f0d6a feat: Support account MPToken 2025-10-07 19:21:53 -04:00
Ayaz Salikhov
7300529484 docs: All files are .hpp (#2683) 2025-10-07 19:28:00 +01:00
Ayaz Salikhov
33802f475f docs: Build docs using doxygen 1.14.0 (#2681) 2025-10-07 18:48:19 +01:00
Ayaz Salikhov
213752862c chore: Install Doxygen 1.14.0 in our images (#2682) 2025-10-07 18:20:24 +01:00
Ayaz Salikhov
a189eeb952 ci: Use separate pre-commit image (#2678) 2025-10-07 16:01:46 +01:00
Ayaz Salikhov
3c1811233a chore: Add git to pre-commit image (#2679) 2025-10-07 15:41:19 +01:00
Alex Kremer
693ed2061c fix: ASAN issue with AmendmentBlockHandler test (#2674)
Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-10-07 15:18:01 +01:00
Ayaz Salikhov
1e2f4b5ca2 chore: Add separate pre-commit image (#2677) 2025-10-07 14:23:49 +01:00
Ayaz Salikhov
1da8464d75 style: Rename actions to use dash (#2669) 2025-10-06 16:19:27 +01:00
Ayaz Salikhov
d48fb168c6 ci: Allow PR titles to start with [ (#2668) 2025-10-06 15:20:26 +01:00
dependabot[bot]
92595f95a0 ci: [DEPENDABOT] bump docker/login-action from 3.5.0 to 3.6.0 (#2662)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-06 15:16:15 +01:00
dependabot[bot]
fc9de87136 ci: [DEPENDABOT] bump docker/login-action from 3.5.0 to 3.6.0 in /.github/actions/build_docker_image (#2663)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-10-06 15:16:06 +01:00
Ayaz Salikhov
67f5ca445f style: Rename workflows to use dash and show reusable (#2667) 2025-10-06 15:02:17 +01:00
Ayaz Salikhov
897c255b8c chore: Start pr title with uppercase (#2666) 2025-10-06 13:54:11 +01:00
github-actions[bot]
aa9eea0d99 style: clang-tidy auto fixes (#2665) 2025-10-06 10:35:33 +01:00
Peter Chen
1cfa06c9aa feat: Support Keyspace (#2454)
Support AWS Keyspace queries

---------

Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
Co-authored-by: Alex Kremer <akremer@ripple.com>
2025-10-03 11:28:50 -04:00
Ayaz Salikhov
2b937bf098 style: Update pre-commit hooks (#2658) 2025-10-01 17:20:02 +01:00
Peter Chen
c6f2197878 chore: Update libXRPL to 2.6.1 (#2656) 2025-10-01 13:17:41 +01:00
Peter Chen
48855633b5 chore: Update libxrpl to 2.6.1-rc2 (#2652)
We'll still need to update to `xrpl/2.6.1` when it is out, but let's
first update it to merge the other PR that depends on xrpl changes
2025-09-29 14:27:26 -04:00
github-actions[bot]
c26df5e1fb style: clang-tidy auto fixes (#2651)
Co-authored-by: godexsoft <385326+godexsoft@users.noreply.github.com>
2025-09-29 18:11:02 +02:00
emrearıyürek
9fb26d77ed chore: Remove redundant headers and include CassandraDBHelper.h (#2649) 2025-09-26 10:53:04 +01:00
Ayaz Salikhov
c3c8b2d796 docs: Update doxygen-awesome-css to 2.4.0 (#2647) 2025-09-24 12:32:13 +02:00
github-actions[bot]
2f3e9498dc style: clang-tidy auto fixes (#2645) 2025-09-23 11:31:26 +02:00
emrearıyürek
d2de240389 fix: Print out error details of web context (#2351) 2025-09-22 13:45:04 +01:00
Alex Kremer
245a808e4b feat: Cache state endpoint (#2642) 2025-09-18 16:20:17 +01:00
Ayaz Salikhov
586a2116a4 test: Enable address sanitizer testing (#2628) 2025-09-18 16:08:02 +01:00
Ayaz Salikhov
93919ab8b7 ci: Move clang-tidy_on_fix_merged workflow into a main one (#2641) 2025-09-18 14:55:59 +01:00
Ayaz Salikhov
e92a63f4e5 ci: Make separate download/upload options for ccache (#2637) 2025-09-18 13:46:34 +01:00
github-actions[bot]
5e5dd649cf style: clang-tidy auto fixes (#2639) 2025-09-18 10:32:06 +01:00
Peter Chen
fb1cdcbde5 fix: mpt_issuance_id not in tx for MPTIssuanceCreate (#2630)
fixes: #2332
2025-09-17 16:17:58 +01:00
Alex Kremer
b66d13bc74 fix: Workaround large number validation and parsing (#2608)
Fixes #2586
2025-09-17 16:12:32 +01:00
Bart
e78ff5c442 ci: Wrap GitHub CI conditionals in curly braces (#2629)
Co-authored-by: Bart Thomee <11445373+bthomee@users.noreply.github.com>
Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-09-16 16:13:03 +01:00
Ayaz Salikhov
c499f9d679 test: Run LoggerFixture::init in integration tests (#2636) 2025-09-16 15:27:03 +01:00
github-actions[bot]
420b99cfa1 style: clang-tidy auto fixes (#2635)
Co-authored-by: godexsoft <385326+godexsoft@users.noreply.github.com>
2025-09-16 14:14:55 +01:00
Ayaz Salikhov
3f2ada3439 fix: Keep spdlog loggers valid between tests (#2614) 2025-09-15 14:47:35 +01:00
emrearıyürek
e996f2b7ab refactor: Use expected<void, error> instead of optional<error> (#2565)
Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-09-15 13:54:09 +01:00
Ayaz Salikhov
26112d17f8 ci: Run only 10 instances of upload-conan-deps using max-parallel (#2617) 2025-09-15 13:52:37 +01:00
dependabot[bot]
e4abec4b98 ci: [DEPENDABOT] bump tj-actions/changed-files from 46.0.5 to 47.0.0 (#2619) 2025-09-15 12:14:38 +01:00
Ayaz Salikhov
503e23055b docs: Update conan lockfile docs to work on all machines the same way (#2607) 2025-09-12 14:32:14 +01:00
github-actions[bot]
97480ce626 style: clang-tidy auto fixes (#2606) 2025-09-12 11:08:44 +01:00
Alex Kremer
bd966e636e fix: ETLng task manager stops queue on run (#2585) 2025-09-11 14:02:05 +01:00
Peter Chen
91b248e3b2 fix: remove MPTIssuanceID from tx (#2569)
fixes #2332 : Essentially, remove `mpt_issuance_id` for
`MPTokenIssuanceCreate`
2025-09-10 08:38:03 -07:00
github-actions[bot]
140ac78e15 style: clang-tidy auto fixes (#2574)
Co-authored-by: godexsoft <385326+godexsoft@users.noreply.github.com>
2025-09-10 10:31:11 +01:00
Ayaz Salikhov
f1bf423f69 fix: Remove global static loggers (#2557) 2025-09-08 22:50:08 +01:00
emrearıyürek
dcf369e4ec docs: Fix some doxygen issues in LedgerFeed (#2556) 2025-09-08 20:33:53 +01:00
Ayaz Salikhov
56f4dc591c chore: Remove all conan options and use purely CMake targets (#2538)
Closes: https://github.com/XRPLF/clio/pull/2535
2025-09-08 12:22:36 +01:00
dependabot[bot]
c40cd8154f ci: [DEPENDABOT] bump codecov/codecov-action from 5.5.0 to 5.5.1 (#2548)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-08 11:17:03 +01:00
github-actions[bot]
989a0c8468 style: clang-tidy auto fixes (#2552)
Co-authored-by: godexsoft <385326+godexsoft@users.noreply.github.com>
2025-09-08 11:03:58 +01:00
Ayaz Salikhov
1adbed7913 ci: Use image with new remote name and newer Conan (#2539) 2025-09-05 16:28:07 +01:00
Ayaz Salikhov
490ec41083 chore: Rename conan remote (#2537) 2025-09-05 16:22:23 +01:00
Ayaz Salikhov
384e79cd32 chore: Rename conan remote in Dockerfie (#2536) 2025-09-05 15:57:18 +01:00
Peter Chen
9edc26a2a3 fix: Add MPT to txn JSON (#2392)
fixes #2332
2025-09-04 18:15:47 +01:00
Sergey Kuznetsov
08bb619964 fix: Fix data race in ClusterCommunication (#2522) 2025-09-04 18:14:53 +01:00
Sergey Kuznetsov
26ef25f864 feat: Add network id to ledger feed (#2505)
Fixes #2350.
2025-09-04 16:58:50 +01:00
Ayaz Salikhov
a62084a4f0 chore: Use mold when available without option (#2521) 2025-09-04 16:03:05 +01:00
Ayaz Salikhov
b8c298b734 chore: Update Conan to 2.20.1 (#2509) 2025-09-04 14:01:34 +01:00
Ayaz Salikhov
cf4d5d649a chore: Use conan options to only control dependencies (#2503) 2025-09-04 13:58:59 +01:00
Ayaz Salikhov
eb2778ccad ci: Use cleanup-workspace action (#2510) 2025-09-04 12:26:29 +01:00
Ayaz Salikhov
790402bcfb ci: Use conan_ref and add PR link in check_libxrpl (#2508) 2025-09-04 11:35:24 +01:00
github-actions[bot]
7c68770787 style: clang-tidy auto fixes (#2519) 2025-09-04 10:57:00 +01:00
Ayaz Salikhov
d9faf7a833 ci: Use pre-commit and pre-commit-autoupdate reusable workflows (#2506) 2025-09-03 18:23:01 +01:00
emrearıyürek
90ac03cae7 fix: Support canonical names for type in account_objects (#2437)
Fix: https://github.com/XRPLF/clio/issues/2275

---------
Co-authored-by: Sergey Kuznetsov <skuznetsov@ripple.com>
2025-09-03 17:06:21 +01:00
Sergey Kuznetsov
3a667f558c feat: Proxy support (#2490)
Add client IP resolving support in case when there is a proxy in front
of Clio.
2025-09-03 15:22:47 +01:00
Ayaz Salikhov
0a2930d861 chore: Use xrpl/2.6.0 (#2496) 2025-09-02 17:10:15 +01:00
Sergey Kuznetsov
e86178b523 chore: Move cmake fatal error into status message (#2497) 2025-09-02 16:19:42 +01:00
Ayaz Salikhov
10e15b524f chore: Pass use_mold correctly to CMake (#2498) 2025-09-02 16:19:15 +01:00
emrearıyürek
402ab29a73 fix: Change error style in Config (#2314)
Fix: #1886

Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
Co-authored-by: Alex Kremer <akremer@ripple.com>
Co-authored-by: Sergey Kuznetsov <skuznetsov@ripple.com>
2025-09-02 11:14:37 +01:00
Ayaz Salikhov
3df28f42ec ci: Use image with Python 3.13 (#2495) 2025-09-01 16:58:52 +01:00
Ayaz Salikhov
0e8896ad06 ci: Use Python 3.13 in CI docker image (#2493) 2025-09-01 16:16:22 +01:00
github-actions[bot]
ffd18049eb style: Update pre-commit hooks (#2491)
Co-authored-by: mathbunnyru <12270691+mathbunnyru@users.noreply.github.com>
2025-09-01 13:21:47 +01:00
github-actions[bot]
7413e02a05 style: clang-tidy auto fixes (#2488)
Co-authored-by: godexsoft <385326+godexsoft@users.noreply.github.com>
2025-08-29 10:31:41 +01:00
Ayaz Salikhov
0403248a8f test: Move prettyPath to its own file and test it (#2486) 2025-08-28 18:33:03 +01:00
Ayaz Salikhov
e6b2f9cde7 style: Fix JSON indent style in C++ code (#2485) 2025-08-28 15:10:36 +01:00
Peter Chen
2512a9c8e7 fix: add nft_id and offer_id to txn stream (#2335)
fixes #1804

Co-authored-by: Alex Kremer <akremer@ripple.com>
Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-08-28 06:50:52 -07:00
Ayaz Salikhov
5e7f6bb5bd style: Fix JSON colon style in C++ code (#2484) 2025-08-28 13:34:25 +01:00
Alex Kremer
ae15bbd7b5 fix: ASAN issues from runSpawnWithTimeout (#2482) 2025-08-28 11:47:23 +01:00
Ayaz Salikhov
33c0737933 chore: Use prepare-runner action with dash (#2481) 2025-08-27 17:07:39 +01:00
Ayaz Salikhov
b26fcae690 ci: Use XRPLF/prepare_actions (#2473) 2025-08-27 15:49:31 +01:00
Alex Kremer
60baaf921f fix: ASAN issues in CommunicationService (#2480)
Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-08-27 14:10:57 +01:00
dependabot[bot]
f41e06061f ci: [DEPENDABOT] bump codecov/codecov-action from 5.4.3 to 5.5.0 (#2477) 2025-08-26 10:47:23 +01:00
dependabot[bot]
c170c56a84 ci: [DEPENDABOT] bump actions/upload-pages-artifact from 3 to 4 (#2476) 2025-08-26 10:44:37 +01:00
Ayaz Salikhov
c9c392679d chore: Cleanup packages installed on macOS runner (#2472) 2025-08-22 18:15:34 +01:00
Ayaz Salikhov
47f5ae5f12 fix: Fix package issues (#2469) 2025-08-22 13:01:05 +01:00
github-actions[bot]
6c34458d6c style: clang-tidy auto fixes (#2471) 2025-08-22 10:31:05 +01:00
Ayaz Salikhov
ec40cc93ff fix: Shutdown LogService after exceptions are printed (#2464) 2025-08-21 15:43:30 +01:00
Ayaz Salikhov
3681ef4e41 feat: Do not print critical errors in stdout (#2468) 2025-08-21 15:33:01 +01:00
github-actions[bot]
e2fbf56277 style: clang-tidy auto fixes (#2466) 2025-08-21 10:31:22 +01:00
Michael Legleux
2d48de372b feat: Generate Debian packages with CPack (#2282)
Builds the Linux packages with CPack. 
Generate them by running Conan with `--options:host "&:package=True"
--options:host "&:static=True"` then after the build you can run `cpack
.` in the build directory.

@mathbunnyru Where do you think this should be built? QA needs a package
per-commit.

@godexsoft What to do with the `config.json` and service file. I can
just remove them or strip the comment out but it still won't work out
the box with the default `rippled.cfg`. Relates to #2191.

---------

Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-08-21 00:07:29 -07:00
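A minimal local sketch of the packaging flow as it is wired into the CI changes further down in this diff (the conan, cmake, and build-clio actions plus the new "Build packages" job); the profile name, build directory, and flag selection here are assumptions taken from that job, not a documented recipe:

# Install dependencies into ./build (mirrors .github/actions/conan)
mkdir -p build
conan install . -of build -b missing -s build_type=Release --profile:all gcc

# Configure with packaging enabled (mirrors .github/actions/cmake with -Dpackage=ON)
cmake -B build -S . -G Ninja \
  -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
  -DCMAKE_BUILD_TYPE=Release -Dtests=ON -Dstatic=ON -Dpackage=ON

# Build the CPack-driven 'package' target (mirrors the package job's "targets: package")
cmake --build build --parallel "$(nproc)" --target package
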
Peter Chen
c780ef8a0b refactor: output struct (#2456)
fixes #2452
2025-08-20 11:13:05 -04:00
Ayaz Salikhov
d833d36896 refactor: Add kCONFIG_DESCRIPTION_HEADER (#2463) 2025-08-20 15:29:55 +01:00
Ayaz Salikhov
7a2090bc00 docs: Add log format description (#2461) 2025-08-20 13:45:22 +01:00
Ayaz Salikhov
b5892dd139 fix: Print error and help on unknown arg (#2460) 2025-08-20 12:56:27 +01:00
Ayaz Salikhov
a172d0b7ea feat: Validate unexpected config values (#2457) 2025-08-20 12:38:51 +01:00
Peter Chen
e9ab081ab7 fix: add trustline locking flag to account_info (#2338)
fixes #2323

---------

Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
2025-08-19 10:07:21 -04:00
Ayaz Salikhov
caedb51f00 docs: Update information about minimum compilers (#2448) 2025-08-19 12:14:09 +01:00
Michael Legleux
e6abdda0a7 ci: Update build image (#2453) 2025-08-19 00:52:09 +01:00
Ayaz Salikhov
46c96654ee fix: Fix default severity in Logger (#2449) 2025-08-18 16:04:54 +01:00
Ayaz Salikhov
57ac234657 docs: Delete logging.md (#2450) 2025-08-18 15:57:26 +01:00
Ayaz Salikhov
4232359dce refactor: Put log options in log section in config (#2440) 2025-08-18 15:22:33 +01:00
Ayaz Salikhov
8b1cab46e7 ci: "Revert ci: [DEPENDABOT] bump actions/checkout from 4 to 5" (#2446) 2025-08-18 11:49:05 +01:00
dependabot[bot]
e05505aa4f ci: [DEPENDABOT] bump actions/checkout from 4 to 5 (#2443) 2025-08-18 09:56:05 +01:00
Ayaz Salikhov
73bc85864b ci: Use main repo release tag in forks (#2439) 2025-08-15 16:12:42 +01:00
Ayaz Salikhov
373430924b ci: Fix nightly docker image in forks (#2438) 2025-08-15 16:12:29 +01:00
Michael Legleux
8ad111655c ci: Add dependencies for Debian packages (#2436)
A few additional tools are required during Debian package generation.
2025-08-14 11:07:46 -07:00
Ayaz Salikhov
0a8470758d ci: Fix docker CI in forks (#2435) 2025-08-14 17:52:10 +01:00
Ayaz Salikhov
1ec906addc ci: Start using apple-clang 17 in CI (#2432) 2025-08-14 17:34:35 +01:00
Ayaz Salikhov
afc0a358d9 ci: Make CI work well in forks (#2434) 2025-08-14 13:50:16 +01:00
Ayaz Salikhov
af284dda37 test: Write more data to fill socket buffer (#2433) 2025-08-14 12:02:10 +01:00
Ayaz Salikhov
7558348d14 ci: Use image with GDB (#2430) 2025-08-13 18:54:53 +01:00
Ayaz Salikhov
0d262e74bc chore: Add GDB to docker image (#2429) 2025-08-13 13:49:24 +01:00
Ayaz Salikhov
312e7be2b4 ci: Use GCC 15.2 (#2428) 2025-08-13 11:43:53 +01:00
Ayaz Salikhov
de9b79adf0 chore: Build GCC 15.2 (#2426) 2025-08-12 19:37:44 +01:00
Alex Kremer
6c68360234 chore: Remove trufflehog (#2427) 2025-08-12 19:37:12 +01:00
Ayaz Salikhov
7e42507b9a fix: Fix GCC 15 discovered bugs (#2425) 2025-08-12 15:57:14 +01:00
Ayaz Salikhov
36bfcc7543 fix: Do not log shutdown message in LogService (#2424) 2025-08-12 15:30:19 +01:00
Ayaz Salikhov
4a5278a915 style: Mark spdlog fwd declararations as NOLINT(readability-identifier-naming) (#2423)

Fix: https://github.com/XRPLF/clio/issues/2422
2025-08-12 13:24:14 +01:00
github-actions[bot]
333b73e882 style: clang-tidy auto fixes (#2420) 2025-08-11 19:59:47 +01:00
Ayaz Salikhov
9420c506ca feat: Use spdlog logger (#2372) 2025-08-11 18:04:04 +01:00
365 changed files with 14366 additions and 7998 deletions

31 .github/actions/build-clio/action.yml vendored Normal file
View File

@@ -0,0 +1,31 @@
name: Build clio
description: Build clio in build directory
inputs:
targets:
description: Space-separated build target names
default: all
nproc_subtract:
description: The number of processors to subtract when calculating parallelism.
required: true
default: "0"
runs:
using: composite
steps:
- name: Get number of processors
uses: XRPLF/actions/.github/actions/get-nproc@046b1620f6bfd6cd0985dc82c3df02786801fe0a
id: nproc
with:
subtract: ${{ inputs.nproc_subtract }}
- name: Build targets
shell: bash
env:
CMAKE_TARGETS: ${{ inputs.targets }}
run: |
cd build
cmake \
--build . \
--parallel "${{ steps.nproc.outputs.nproc }}" \
--target ${CMAKE_TARGETS}

View File

@@ -24,6 +24,7 @@ inputs:
dockerhub_repo:
description: DockerHub repository name
required: false
default: ""
dockerhub_description:
description: Short description of the image
required: false
@@ -33,14 +34,14 @@ runs:
steps:
- name: Login to DockerHub
if: ${{ inputs.push_image == 'true' && inputs.dockerhub_repo != '' }}
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_PW }}
- name: Login to GitHub Container Registry
if: ${{ inputs.push_image == 'true' }}
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}

View File

@@ -1,29 +0,0 @@
name: Build clio
description: Build clio in build directory
inputs:
targets:
description: Space-separated build target names
default: all
subtract_threads:
description: An option for the action get_number_of_threads. See get_number_of_threads
required: true
default: "0"
runs:
using: composite
steps:
- name: Get number of threads
uses: ./.github/actions/get_number_of_threads
id: number_of_threads
with:
subtract_threads: ${{ inputs.subtract_threads }}
- name: Build targets
shell: bash
run: |
cd build
cmake \
--build . \
--parallel "${{ steps.number_of_threads.outputs.threads_number }}" \
--target ${{ inputs.targets }}

73 .github/actions/cmake/action.yml vendored Normal file
View File

@@ -0,0 +1,73 @@
name: Run CMake
description: Run CMake to generate build files
inputs:
build_dir:
description: Build directory
required: false
default: "build"
conan_profile:
description: Conan profile name
required: true
build_type:
description: Build type for third-party libraries and clio. Could be 'Release', 'Debug'
required: true
default: "Release"
integration_tests:
description: Whether to generate target integration tests
required: true
default: "true"
benchmark:
description: Whether to generate targets for benchmarks
required: true
default: "true"
code_coverage:
description: Whether to enable code coverage
required: true
default: "false"
static:
description: Whether Clio is to be statically linked
required: true
default: "false"
time_trace:
description: Whether to enable compiler trace reports
required: true
default: "false"
package:
description: Whether to generate Debian package
required: true
default: "false"
runs:
using: composite
steps:
- name: Run cmake
shell: bash
env:
BUILD_TYPE: "${{ inputs.build_type }}"
SANITIZER_OPTION: |-
${{ endsWith(inputs.conan_profile, '.asan') && '-Dsan=address' ||
endsWith(inputs.conan_profile, '.tsan') && '-Dsan=thread' ||
endsWith(inputs.conan_profile, '.ubsan') && '-Dsan=undefined' ||
'' }}
INTEGRATION_TESTS: "${{ inputs.integration_tests == 'true' && 'ON' || 'OFF' }}"
BENCHMARK: "${{ inputs.benchmark == 'true' && 'ON' || 'OFF' }}"
COVERAGE: "${{ inputs.code_coverage == 'true' && 'ON' || 'OFF' }}"
STATIC: "${{ inputs.static == 'true' && 'ON' || 'OFF' }}"
TIME_TRACE: "${{ inputs.time_trace == 'true' && 'ON' || 'OFF' }}"
PACKAGE: "${{ inputs.package == 'true' && 'ON' || 'OFF' }}"
run: |
cmake \
-B ${{inputs.build_dir}} \
-S . \
-G Ninja \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE="${BUILD_TYPE}" \
"${SANITIZER_OPTION}" \
-Dtests=ON \
-Dintegration_tests="${INTEGRATION_TESTS}" \
-Dbenchmark="${BENCHMARK}" \
-Dcoverage="${COVERAGE}" \
-Dstatic="${STATIC}" \
-Dtime_trace="${TIME_TRACE}" \
-Dpackage="${PACKAGE}"

View File

@@ -24,7 +24,7 @@ runs:
-j8 --exclude-throw-branches
- name: Archive coverage report
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: coverage-report.xml
path: build/coverage_report.xml

38 .github/actions/conan/action.yml vendored Normal file
View File

@@ -0,0 +1,38 @@
name: Run Conan
description: Run conan to install dependencies
inputs:
build_dir:
description: Build directory
required: false
default: "build"
conan_profile:
description: Conan profile name
required: true
force_conan_source_build:
description: Whether conan should build all dependencies from source
required: true
default: "false"
build_type:
description: Build type for third-party libraries and clio. Could be 'Release', 'Debug'
required: true
default: "Release"
runs:
using: composite
steps:
- name: Create build directory
shell: bash
run: mkdir -p "${{ inputs.build_dir }}"
- name: Run conan
shell: bash
env:
CONAN_BUILD_OPTION: "${{ inputs.force_conan_source_build == 'true' && '*' || 'missing' }}"
run: |
conan \
install . \
-of build \
-b "$CONAN_BUILD_OPTION" \
-s "build_type=${{ inputs.build_type }}" \
--profile:all "${{ inputs.conan_profile }}"

View File

@@ -28,12 +28,17 @@ runs:
- name: Create an issue
id: create_issue
shell: bash
env:
ISSUE_BODY: ${{ inputs.body }}
ISSUE_ASSIGNEES: ${{ inputs.assignees }}
ISSUE_LABELS: ${{ inputs.labels }}
ISSUE_TITLE: ${{ inputs.title }}
run: |
echo -e '${{ inputs.body }}' > issue.md
echo -e "${ISSUE_BODY}" > issue.md
gh issue create \
--assignee '${{ inputs.assignees }}' \
--label '${{ inputs.labels }}' \
--title '${{ inputs.title }}' \
--assignee "${ISSUE_ASSIGNEES}" \
--label "${ISSUE_LABELS}" \
--title "${ISSUE_TITLE}" \
--body-file ./issue.md \
> create_issue.log
created_issue="$(sed 's|.*/||' create_issue.log)"

View File

@@ -1,93 +0,0 @@
name: Run conan and cmake
description: Run conan and cmake
inputs:
conan_profile:
description: Conan profile name
required: true
force_conan_source_build:
description: Whether conan should build all dependencies from source
required: true
default: "false"
build_type:
description: Build type for third-party libraries and clio. Could be 'Release', 'Debug'
required: true
default: "Release"
build_integration_tests:
description: Whether to build integration tests
required: true
default: "true"
build_benchmark:
description: Whether to build benchmark tests
required: true
default: "true"
code_coverage:
description: Whether conan's coverage option should be on or not
required: true
default: "false"
static:
description: Whether Clio is to be statically linked
required: true
default: "false"
time_trace:
description: Whether to enable compiler trace reports
required: true
default: "false"
use_mold:
description: Whether to use mold linker
required: true
default: "false"
runs:
using: composite
steps:
- name: Create build directory
shell: bash
run: mkdir -p build
- name: Run conan
shell: bash
env:
CONAN_BUILD_OPTION: "${{ inputs.force_conan_source_build == 'true' && '*' || 'missing' }}"
CODE_COVERAGE: "${{ inputs.code_coverage == 'true' && 'True' || 'False' }}"
STATIC_OPTION: "${{ inputs.static == 'true' && 'True' || 'False' }}"
INTEGRATION_TESTS_OPTION: "${{ inputs.build_integration_tests == 'true' && 'True' || 'False' }}"
BENCHMARK_OPTION: "${{ inputs.build_benchmark == 'true' && 'True' || 'False' }}"
TIME_TRACE: "${{ inputs.time_trace == 'true' && 'True' || 'False' }}"
USE_MOLD: "${{ inputs.use_mold == 'true' && 'True' || 'False' }}"
run: |
cd build
conan \
install .. \
-of . \
-b "$CONAN_BUILD_OPTION" \
-s "build_type=${{ inputs.build_type }}" \
-o "&:static=${STATIC_OPTION}" \
-o "&:tests=True" \
-o "&:integration_tests=${INTEGRATION_TESTS_OPTION}" \
-o "&:benchmark=${BENCHMARK_OPTION}" \
-o "&:lint=False" \
-o "&:coverage=${CODE_COVERAGE}" \
-o "&:time_trace=${TIME_TRACE}" \
-o "&:use_mold=${USE_MOLD}" \
--profile:all "${{ inputs.conan_profile }}"
- name: Run cmake
shell: bash
env:
BUILD_TYPE: "${{ inputs.build_type }}"
SANITIZER_OPTION: |-
${{ endsWith(inputs.conan_profile, '.asan') && '-Dsan=address' ||
endsWith(inputs.conan_profile, '.tsan') && '-Dsan=thread' ||
endsWith(inputs.conan_profile, '.ubsan') && '-Dsan=undefined' ||
'' }}
USE_MOLD_OPTION: "${{ inputs.use_mold == 'true' && '-DUSE_MOLD=ON' || '' }}"
run: |
cd build
cmake \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE="${BUILD_TYPE}" \
"${USE_MOLD_OPTION}" \
"${SANITIZER_OPTION}" \
.. \
-G Ninja

View File

@@ -1,36 +0,0 @@
name: Get number of threads
description: Determines number of threads to use on macOS and Linux
inputs:
subtract_threads:
description: How many threads to subtract from the calculated number
required: true
default: "0"
outputs:
threads_number:
description: Number of threads to use
value: ${{ steps.number_of_threads_export.outputs.num }}
runs:
using: composite
steps:
- name: Get number of threads on mac
id: mac_threads
if: ${{ runner.os == 'macOS' }}
shell: bash
run: echo "num=$(($(sysctl -n hw.logicalcpu) - 2))" >> $GITHUB_OUTPUT
- name: Get number of threads on Linux
id: linux_threads
if: ${{ runner.os == 'Linux' }}
shell: bash
run: echo "num=$(($(nproc) - 2))" >> $GITHUB_OUTPUT
- name: Shift and export number of threads
id: number_of_threads_export
shell: bash
run: |
num_of_threads="${{ steps.mac_threads.outputs.num || steps.linux_threads.outputs.num }}"
shift_by="${{ inputs.subtract_threads }}"
shifted="$((num_of_threads - shift_by))"
echo "num=$(( shifted > 1 ? shifted : 1 ))" >> $GITHUB_OUTPUT

View File

@@ -1,59 +0,0 @@
name: Prepare runner
description: Install packages, set environment variables, create directories
inputs:
disable_ccache:
description: Whether ccache should be disabled
required: true
runs:
using: composite
steps:
- name: Install packages on mac
if: ${{ runner.os == 'macOS' }}
shell: bash
run: |
brew install --quiet \
bison \
ca-certificates \
ccache \
clang-build-analyzer \
cmake \
conan \
gh \
jq \
llvm@14 \
ninja \
pkg-config
echo "/opt/homebrew/opt/conan@2/bin" >> $GITHUB_PATH
- name: Fix git permissions on Linux
if: ${{ runner.os == 'Linux' }}
shell: bash
run: git config --global --add safe.directory "$PWD"
- name: Set env variables for macOS
if: ${{ runner.os == 'macOS' }}
shell: bash
run: |
echo "CCACHE_DIR=${{ github.workspace }}/.ccache" >> $GITHUB_ENV
echo "CONAN_HOME=${{ github.workspace }}/.conan2" >> $GITHUB_ENV
- name: Set env variables for Linux
if: ${{ runner.os == 'Linux' }}
shell: bash
run: |
echo "CCACHE_DIR=/root/.ccache" >> $GITHUB_ENV
echo "CONAN_HOME=/root/.conan2" >> $GITHUB_ENV
- name: Set CCACHE_DISABLE=1
if: ${{ inputs.disable_ccache == 'true' }}
shell: bash
run: |
echo "CCACHE_DISABLE=1" >> $GITHUB_ENV
- name: Create directories
shell: bash
run: |
mkdir -p "$CCACHE_DIR"
mkdir -p "$CONAN_HOME"

View File

@@ -27,10 +27,10 @@ runs:
steps:
- name: Find common commit
id: git_common_ancestor
uses: ./.github/actions/git_common_ancestor
uses: ./.github/actions/git-common-ancestor
- name: Restore ccache cache
uses: actions/cache/restore@v4
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
id: ccache_cache
if: ${{ env.CCACHE_DISABLE != '1' }}
with:

View File

@@ -28,11 +28,11 @@ runs:
steps:
- name: Find common commit
id: git_common_ancestor
uses: ./.github/actions/git_common_ancestor
uses: ./.github/actions/git-common-ancestor
- name: Save ccache cache
if: ${{ inputs.ccache_cache_hit != 'true' || inputs.ccache_cache_miss_rate == '100.0' }}
uses: actions/cache/save@v4
uses: actions/cache/save@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
with:
path: ${{ inputs.ccache_dir }}
key: clio-ccache-${{ runner.os }}-${{ inputs.build_type }}${{ inputs.code_coverage == 'true' && '-code_coverage' || '' }}-${{ inputs.conan_profile }}-develop-${{ steps.git_common_ancestor.outputs.commit }}

View File

@@ -14,7 +14,7 @@ updates:
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/build_clio/
directory: .github/actions/build-clio/
schedule:
interval: weekly
day: monday
@@ -27,7 +27,7 @@ updates:
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/build_docker_image/
directory: .github/actions/build-docker-image/
schedule:
interval: weekly
day: monday
@@ -40,7 +40,7 @@ updates:
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/code_coverage/
directory: .github/actions/cmake/
schedule:
interval: weekly
day: monday
@@ -53,7 +53,7 @@ updates:
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/create_issue/
directory: .github/actions/code-coverage/
schedule:
interval: weekly
day: monday
@@ -66,7 +66,7 @@ updates:
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/generate/
directory: .github/actions/conan/
schedule:
interval: weekly
day: monday
@@ -79,7 +79,7 @@ updates:
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/get_number_of_threads/
directory: .github/actions/create-issue/
schedule:
interval: weekly
day: monday
@@ -92,7 +92,7 @@ updates:
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/git_common_ancestor/
directory: .github/actions/git-common-ancestor/
schedule:
interval: weekly
day: monday
@@ -105,7 +105,7 @@ updates:
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/prepare_runner/
directory: .github/actions/restore-cache/
schedule:
interval: weekly
day: monday
@@ -118,20 +118,7 @@ updates:
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/restore_cache/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/save_cache/
directory: .github/actions/save-cache/
schedule:
interval: weekly
day: monday

View File

@@ -1,8 +0,0 @@
[settings]
arch={{detect_api.detect_arch()}}
build_type=Release
compiler=apple-clang
compiler.cppstd=20
compiler.libcxx=libc++
compiler.version=16
os=Macos

View File

@@ -3,7 +3,9 @@ import itertools
import json
LINUX_OS = ["heavy", "heavy-arm64"]
LINUX_CONTAINERS = ['{ "image": "ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5" }']
LINUX_CONTAINERS = [
'{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
]
LINUX_COMPILERS = ["gcc", "clang"]
MACOS_OS = ["macos15"]

View File

@@ -8,10 +8,11 @@ REPO_DIR="$(cd "$CURRENT_DIR/../../../" && pwd)"
CONAN_DIR="${CONAN_HOME:-$HOME/.conan2}"
PROFILES_DIR="$CONAN_DIR/profiles"
# When developers' compilers are updated, these profiles might be different
if [[ -z "$CI" ]]; then
APPLE_CLANG_PROFILE="$CURRENT_DIR/apple-clang-local.profile"
APPLE_CLANG_PROFILE="$CURRENT_DIR/apple-clang-17.profile"
else
APPLE_CLANG_PROFILE="$CURRENT_DIR/apple-clang-ci.profile"
APPLE_CLANG_PROFILE="$CURRENT_DIR/apple-clang-17.profile"
fi
GCC_PROFILE="$REPO_DIR/docker/ci/conan/gcc.profile"
@@ -21,7 +22,7 @@ SANITIZER_TEMPLATE_FILE="$REPO_DIR/docker/ci/conan/sanitizer_template.profile"
rm -rf "$CONAN_DIR"
conan remote add --index 0 ripple https://conan.ripplex.io
conan remote add --index 0 xrplf https://conan.ripplex.io
cp "$REPO_DIR/docker/ci/conan/global.conf" "$CONAN_DIR/global.conf"

View File

@@ -31,15 +31,16 @@ TESTS=$($TEST_BINARY --gtest_list_tests | awk '/^ / {print suite $1} !/^ / {su
OUTPUT_DIR="./.sanitizer-report"
mkdir -p "$OUTPUT_DIR"
export TSAN_OPTIONS="die_after_fork=0"
export MallocNanoZone='0' # for MacOSX
for TEST in $TESTS; do
OUTPUT_FILE="$OUTPUT_DIR/${TEST//\//_}"
export TSAN_OPTIONS="log_path=\"$OUTPUT_FILE\" die_after_fork=0"
export ASAN_OPTIONS="log_path=\"$OUTPUT_FILE\""
export UBSAN_OPTIONS="log_path=\"$OUTPUT_FILE\""
export MallocNanoZone='0' # for MacOSX
$TEST_BINARY --gtest_filter="$TEST" > /dev/null 2>&1
OUTPUT_FILE="$OUTPUT_DIR/${TEST//\//_}.log"
$TEST_BINARY --gtest_filter="$TEST" > "$OUTPUT_FILE" 2>&1
if [ $? -ne 0 ]; then
echo "'$TEST' failed a sanitizer check."
else
rm "$OUTPUT_FILE"
fi
done

View File

@@ -44,11 +44,11 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Download Clio binary from artifact
if: ${{ inputs.artifact_name != null }}
uses: actions/download-artifact@v5
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
name: ${{ inputs.artifact_name }}
path: ./docker/clio/artifact/
@@ -56,9 +56,12 @@ jobs:
- name: Download Clio binary from url
if: ${{ inputs.clio_server_binary_url != null }}
shell: bash
env:
BINARY_URL: ${{ inputs.clio_server_binary_url }}
BINARY_SHA256: ${{ inputs.binary_sha256 }}
run: |
wget "${{inputs.clio_server_binary_url}}" -P ./docker/clio/artifact/
if [ "$(sha256sum ./docker/clio/clio_server | awk '{print $1}')" != "${{inputs.binary_sha256}}" ]; then
wget "${BINARY_URL}" -P ./docker/clio/artifact/
if [ "$(sha256sum ./docker/clio/clio_server | awk '{print $1}')" != "${BINARY_SHA256}" ]; then
echo "Binary sha256 sum doesn't match"
exit 1
fi
@@ -83,19 +86,24 @@ jobs:
shell: bash
run: strip ./docker/clio/clio_server
- name: Set GHCR_REPO
id: set-ghcr-repo
run: |
echo "GHCR_REPO=$(echo ghcr.io/${{ github.repository_owner }} | tr '[:upper:]' '[:lower:]')" >> ${GITHUB_OUTPUT}
- name: Build Docker image
uses: ./.github/actions/build_docker_image
uses: ./.github/actions/build-docker-image
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_PW: ${{ secrets.DOCKERHUB_PW }}
with:
images: |
ghcr.io/xrplf/clio
rippleci/clio
ghcr.io/${{ steps.set-ghcr-repo.outputs.GHCR_REPO }}/clio
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio' || '' }}
push_image: ${{ inputs.publish_image }}
directory: docker/clio
tags: ${{ inputs.tags }}
platforms: linux/amd64
dockerhub_repo: rippleci/clio
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio' || '' }}
dockerhub_description: Clio is an XRP Ledger API server.

View File

@@ -8,14 +8,14 @@ on:
paths:
- .github/workflows/build.yml
- .github/workflows/build_and_test.yml
- .github/workflows/build_impl.yml
- .github/workflows/test_impl.yml
- .github/workflows/upload_coverage_report.yml
- .github/workflows/reusable-build-test.yml
- .github/workflows/reusable-build.yml
- .github/workflows/reusable-test.yml
- .github/workflows/reusable-upload-coverage-report.yml
- ".github/actions/**"
- "!.github/actions/build_docker_image/**"
- "!.github/actions/create_issue/**"
- "!.github/actions/build-docker-image/**"
- "!.github/actions/create-issue/**"
- CMakeLists.txt
- conanfile.py
@@ -45,7 +45,7 @@ jobs:
build_type: [Release, Debug]
container:
[
'{ "image": "ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5" }',
'{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }',
]
static: [true]
@@ -56,12 +56,14 @@ jobs:
container: ""
static: false
uses: ./.github/workflows/build_and_test.yml
uses: ./.github/workflows/reusable-build-test.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
conan_profile: ${{ matrix.conan_profile }}
build_type: ${{ matrix.build_type }}
download_ccache: true
upload_ccache: true
static: ${{ matrix.static }}
run_unit_tests: true
run_integration_tests: false
@@ -70,13 +72,14 @@ jobs:
code_coverage:
name: Run Code Coverage
uses: ./.github/workflows/build_impl.yml
uses: ./.github/workflows/reusable-build.yml
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5" }'
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
conan_profile: gcc
build_type: Debug
disable_cache: false
download_ccache: true
upload_ccache: false
code_coverage: true
static: true
upload_clio_server: false
@@ -85,17 +88,35 @@ jobs:
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
package:
name: Build packages
uses: ./.github/workflows/reusable-build.yml
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
conan_profile: gcc
build_type: Release
download_ccache: true
upload_ccache: false
code_coverage: false
static: true
upload_clio_server: false
package: true
targets: package
analyze_build_time: false
check_config:
name: Check Config Description
needs: build-and-test
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- uses: actions/download-artifact@v5
- uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
name: clio_server_Linux_Release_gcc

View File

@@ -17,42 +17,47 @@ jobs:
name: Build Clio / `libXRPL ${{ github.event.client_payload.version }}`
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
with:
fetch-depth: 0
- name: Prepare runner
uses: ./.github/actions/prepare_runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
- name: Update libXRPL version requirement
shell: bash
run: |
sed -i.bak -E "s|'xrpl/[a-zA-Z0-9\\.\\-]+'|'xrpl/${{ github.event.client_payload.version }}'|g" conanfile.py
sed -i.bak -E "s|'xrpl/[a-zA-Z0-9\\.\\-]+'|'xrpl/${{ github.event.client_payload.conan_ref }}'|g" conanfile.py
rm -f conanfile.py.bak
- name: Update conan lockfile
shell: bash
run: |
conan lock create . -o '&:tests=True' -o '&:benchmark=True' --profile:all ${{ env.CONAN_PROFILE }}
conan lock create . --profile:all ${{ env.CONAN_PROFILE }}
- name: Run conan and cmake
uses: ./.github/actions/generate
- name: Run conan
uses: ./.github/actions/conan
with:
conan_profile: ${{ env.CONAN_PROFILE }}
- name: Run CMake
uses: ./.github/actions/cmake
with:
conan_profile: ${{ env.CONAN_PROFILE }}
- name: Build Clio
uses: ./.github/actions/build_clio
uses: ./.github/actions/build-clio
- name: Strip tests
run: strip build/clio_tests
- name: Upload clio_tests
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: clio_tests_check_libxrpl
path: build/clio_tests
@@ -62,10 +67,10 @@ jobs:
needs: build
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
steps:
- uses: actions/download-artifact@v5
- uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
name: clio_tests_check_libxrpl
@@ -85,16 +90,17 @@ jobs:
issues: write
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Create an issue
uses: ./.github/actions/create_issue
uses: ./.github/actions/create-issue
env:
GH_TOKEN: ${{ github.token }}
with:
labels: "compatibility,bug"
title: "Proposed libXRPL check failed"
body: >
Clio build or tests failed against `libXRPL ${{ github.event.client_payload.version }}`.
Clio build or tests failed against `libXRPL ${{ github.event.client_payload.conan_ref }}`.
Workflow: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/
PR: ${{ github.event.client_payload.pr_url }}
Workflow run: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/

View File

@@ -10,8 +10,17 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: ytanikin/pr-conventional-commits@b72758283dcbee706975950e96bc4bf323a8d8c0 # v1.4.2
- uses: ytanikin/pr-conventional-commits@b72758283dcbee706975950e96bc4bf323a8d8c0 # 1.4.2
with:
task_types: '["build","feat","fix","docs","test","ci","style","refactor","perf","chore"]'
add_label: false
custom_labels: '{"build":"build", "feat":"enhancement", "fix":"bug", "docs":"documentation", "test":"testability", "ci":"ci", "style":"refactoring", "refactor":"refactoring", "perf":"performance", "chore":"tooling"}'
- name: Check if message starts with upper-case letter
env:
PR_TITLE: ${{ github.event.pull_request.title }}
run: |
if [[ ! "${PR_TITLE}" =~ ^[a-z]+:\ [\[A-Z] ]]; then
echo "Error: PR title must start with an upper-case letter."
exit 1
fi

View File

@@ -1,6 +1,8 @@
name: Clang-tidy check
on:
push:
branches: [develop]
schedule:
- cron: "0 9 * * 1-5"
workflow_dispatch:
@@ -22,9 +24,10 @@ env:
jobs:
clang_tidy:
if: github.event_name != 'push' || contains(github.event.head_commit.message, 'clang-tidy auto fixes')
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
permissions:
contents: write
@@ -32,37 +35,42 @@ jobs:
pull-requests: write
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
with:
fetch-depth: 0
- name: Prepare runner
uses: ./.github/actions/prepare_runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
- name: Restore cache
uses: ./.github/actions/restore_cache
uses: ./.github/actions/restore-cache
id: restore_cache
with:
conan_profile: ${{ env.CONAN_PROFILE }}
ccache_dir: ${{ env.CCACHE_DIR }}
- name: Run conan and cmake
uses: ./.github/actions/generate
- name: Run conan
uses: ./.github/actions/conan
with:
conan_profile: ${{ env.CONAN_PROFILE }}
- name: Get number of threads
uses: ./.github/actions/get_number_of_threads
id: number_of_threads
- name: Run CMake
uses: ./.github/actions/cmake
with:
conan_profile: ${{ env.CONAN_PROFILE }}
- name: Get number of processors
uses: XRPLF/actions/.github/actions/get-nproc@046b1620f6bfd6cd0985dc82c3df02786801fe0a
id: nproc
- name: Run clang-tidy
continue-on-error: true
shell: bash
id: run_clang_tidy
run: |
run-clang-tidy-${{ env.LLVM_TOOLS_VERSION }} -p build -j "${{ steps.number_of_threads.outputs.threads_number }}" -fix -quiet 1>output.txt
run-clang-tidy-${{ env.LLVM_TOOLS_VERSION }} -p build -j "${{ steps.nproc.outputs.nproc }}" -fix -quiet 1>output.txt
- name: Fix local includes and clang-format style
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
@@ -82,7 +90,7 @@ jobs:
- name: Create an issue
if: ${{ steps.run_clang_tidy.outcome != 'success' && github.event_name != 'pull_request' }}
id: create_issue
uses: ./.github/actions/create_issue
uses: ./.github/actions/create-issue
env:
GH_TOKEN: ${{ github.token }}
with:

View File

@@ -1,30 +0,0 @@
name: Restart clang-tidy workflow
on:
push:
branches: [develop]
workflow_dispatch:
jobs:
restart_clang_tidy:
runs-on: ubuntu-latest
permissions:
actions: write
steps:
- uses: actions/checkout@v4
- name: Check last commit matches clang-tidy auto fixes
id: check
shell: bash
run: |
passed=$(if [[ "$(git log -1 --pretty=format:%s | grep 'style: clang-tidy auto fixes')" ]]; then echo 'true' ; else echo 'false' ; fi)
echo "passed=\"$passed\"" >> $GITHUB_OUTPUT
- name: Run clang-tidy workflow
if: ${{ contains(steps.check.outputs.passed, 'true') }}
shell: bash
env:
GH_TOKEN: ${{ github.token }}
GH_REPO: ${{ github.repository }}
run: gh workflow run clang-tidy.yml

View File

@@ -14,16 +14,16 @@ jobs:
build:
runs-on: ubuntu-latest
container:
image: ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
steps:
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
with:
lfs: true
- name: Prepare runner
uses: ./.github/actions/prepare_runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
@@ -39,10 +39,10 @@ jobs:
run: cmake --build . --target docs
- name: Setup Pages
uses: actions/configure-pages@v5
uses: actions/configure-pages@983d7736d9b0ae728b81ab479565c72886d7745b # v5.0.0
- name: Upload artifact
uses: actions/upload-pages-artifact@v3
uses: actions/upload-pages-artifact@7b1f4a764d45c48632c6b24a0339c27f5614fb0b # v4.0.0
with:
path: build_docs/html
name: docs-develop
@@ -62,6 +62,6 @@ jobs:
steps:
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v4
uses: actions/deploy-pages@d6db90164ac5ed86f2b6aed7e0febac5b3c0c03e # v4.0.5
with:
artifact_name: docs-develop

View File

@@ -8,14 +8,14 @@ on:
paths:
- .github/workflows/nightly.yml
- .github/workflows/release_impl.yml
- .github/workflows/build_and_test.yml
- .github/workflows/build_impl.yml
- .github/workflows/test_impl.yml
- .github/workflows/build_clio_docker_image.yml
- .github/workflows/reusable-release.yml
- .github/workflows/reusable-build-test.yml
- .github/workflows/reusable-build.yml
- .github/workflows/reusable-test.yml
- .github/workflows/build-clio-docker-image.yml
- ".github/actions/**"
- "!.github/actions/code_coverage/**"
- "!.github/actions/code-coverage/**"
- .github/scripts/prepare-release-artifacts.sh
concurrency:
@@ -39,19 +39,19 @@ jobs:
conan_profile: gcc
build_type: Release
static: true
container: '{ "image": "ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5" }'
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
- os: heavy
conan_profile: gcc
build_type: Debug
static: true
container: '{ "image": "ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5" }'
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
- os: heavy
conan_profile: gcc.ubsan
build_type: Release
static: false
container: '{ "image": "ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5" }'
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
uses: ./.github/workflows/build_and_test.yml
uses: ./.github/workflows/reusable-build-test.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
@@ -61,7 +61,8 @@ jobs:
run_unit_tests: true
run_integration_tests: true
upload_clio_server: true
disable_cache: true
download_ccache: false
upload_ccache: false
analyze_build_time:
name: Analyze Build Time
@@ -72,19 +73,20 @@ jobs:
include:
- os: heavy
conan_profile: clang
container: '{ "image": "ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5" }'
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
static: true
- os: macos15
conan_profile: apple-clang
container: ""
static: false
uses: ./.github/workflows/build_impl.yml
uses: ./.github/workflows/reusable-build.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
conan_profile: ${{ matrix.conan_profile }}
build_type: Release
disable_cache: true
download_ccache: false
upload_ccache: false
code_coverage: false
static: ${{ matrix.static }}
upload_clio_server: false
@@ -93,7 +95,7 @@ jobs:
nightly_release:
needs: build-and-test
uses: ./.github/workflows/release_impl.yml
uses: ./.github/workflows/reusable-release.yml
with:
overwrite_release: true
prerelease: true
@@ -107,7 +109,7 @@ jobs:
draft: false
build_and_publish_docker_image:
uses: ./.github/workflows/build_clio_docker_image.yml
uses: ./.github/workflows/build-clio-docker-image.yml
needs: build-and-test
secrets: inherit
with:
@@ -128,10 +130,10 @@ jobs:
issues: write
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Create an issue
uses: ./.github/actions/create_issue
uses: ./.github/actions/create-issue
env:
GH_TOKEN: ${{ github.token }}
with:

View File

@@ -4,47 +4,19 @@ on:
# every first day of the month
schedule:
- cron: "0 0 1 * *"
# on demand
pull_request:
branches: [release/*, develop]
paths:
- ".pre-commit-config.yaml"
workflow_dispatch:
jobs:
auto-update:
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: 3.x
- run: pip install pre-commit
- run: pre-commit autoupdate --freeze
- run: pre-commit run --all-files || true
- uses: crazy-max/ghaction-import-gpg@e89d40939c28e39f97cf32126055eeae86ba74ec # v6.3.0
if: github.event_name != 'pull_request'
with:
gpg_private_key: ${{ secrets.ACTIONS_GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.ACTIONS_GPG_PASSPHRASE }}
git_user_signingkey: true
git_commit_gpgsign: true
- uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
if: always()
env:
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ github.token }}
with:
commit-message: "style: Update pre-commit hooks"
committer: Clio CI <skuznetsov@ripple.com>
branch: update/pre-commit-hooks
branch-suffix: timestamp
delete-branch: true
title: "style: Update pre-commit hooks"
body: Update versions of pre-commit hooks to latest version.
reviewers: "godexsoft,kuznetsss,PeterChen13579,mathbunnyru"
uses: XRPLF/actions/.github/workflows/pre-commit-autoupdate.yml@afbcbdafbe0ce5439492fb87eda6441371086386
with:
sign_commit: true
committer: "Clio CI <skuznetsov@ripple.com>"
reviewers: "godexsoft,kuznetsss,PeterChen13579,mathbunnyru"
secrets:
GPG_PRIVATE_KEY: ${{ secrets.ACTIONS_GPG_PRIVATE_KEY }}
GPG_PASSPHRASE: ${{ secrets.ACTIONS_GPG_PASSPHRASE }}

View File

@@ -8,20 +8,7 @@ on:
jobs:
run-hooks:
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5
steps:
- name: Checkout Repo ⚡️
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Prepare runner
uses: ./.github/actions/prepare_runner
with:
disable_ccache: true
- name: Run pre-commit ✅
run: pre-commit run --all-files
uses: XRPLF/actions/.github/workflows/pre-commit.yml@34790936fae4c6c751f62ec8c06696f9c1a5753a
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-pre-commit:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'

View File

@@ -29,9 +29,9 @@ jobs:
conan_profile: gcc
build_type: Release
static: true
container: '{ "image": "ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5" }'
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
uses: ./.github/workflows/build_and_test.yml
uses: ./.github/workflows/reusable-build-test.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
@@ -41,18 +41,19 @@ jobs:
run_unit_tests: true
run_integration_tests: true
upload_clio_server: true
disable_cache: true
download_ccache: false
upload_ccache: false
expected_version: ${{ github.event_name == 'push' && github.ref_name || '' }}
release:
needs: build-and-test
uses: ./.github/workflows/release_impl.yml
uses: ./.github/workflows/reusable-release.yml
with:
overwrite_release: false
prerelease: ${{ contains(github.ref_name, '-') }}
title: "${{ github.ref_name}}"
title: "${{ github.ref_name }}"
version: "${{ github.ref_name }}"
header: >
${{ contains(github.ref_name, '-') && '> **Note:** Please remember that this is a release candidate and it is not recommended for production use.' || '' }}
generate_changelog: ${{ !contains(github.ref_name, '-') }}
draft: true
draft: ${{ !contains(github.ref_name, '-') }}

View File

@@ -23,8 +23,14 @@ on:
required: true
type: string
disable_cache:
description: Whether ccache should be disabled
download_ccache:
description: Whether to download ccache from the cache
required: false
type: boolean
default: true
upload_ccache:
description: Whether to upload ccache to the cache
required: false
type: boolean
default: false
@@ -63,25 +69,33 @@ on:
type: string
default: ""
package:
description: Whether to generate Debian package
required: false
type: boolean
default: false
jobs:
build:
uses: ./.github/workflows/build_impl.yml
uses: ./.github/workflows/reusable-build.yml
with:
runs_on: ${{ inputs.runs_on }}
container: ${{ inputs.container }}
conan_profile: ${{ inputs.conan_profile }}
build_type: ${{ inputs.build_type }}
disable_cache: ${{ inputs.disable_cache }}
download_ccache: ${{ inputs.download_ccache }}
upload_ccache: ${{ inputs.upload_ccache }}
code_coverage: false
static: ${{ inputs.static }}
upload_clio_server: ${{ inputs.upload_clio_server }}
targets: ${{ inputs.targets }}
analyze_build_time: false
expected_version: ${{ inputs.expected_version }}
package: ${{ inputs.package }}
test:
needs: build
uses: ./.github/workflows/test_impl.yml
uses: ./.github/workflows/reusable-test.yml
with:
runs_on: ${{ inputs.runs_on }}
container: ${{ inputs.container }}

View File

@@ -23,10 +23,17 @@ on:
required: true
type: string
disable_cache:
description: Whether ccache should be disabled
download_ccache:
description: Whether to download ccache from the cache
required: false
type: boolean
default: true
upload_ccache:
description: Whether to upload ccache to the cache
required: false
type: boolean
default: false
code_coverage:
description: Whether to enable code coverage
@@ -59,6 +66,11 @@ on:
type: string
default: ""
package:
description: Whether to generate Debian package
required: false
type: boolean
secrets:
CODECOV_TOKEN:
required: false
@@ -70,11 +82,11 @@ jobs:
container: ${{ inputs.container != '' && fromJson(inputs.container) || null }}
steps:
- name: Clean workdir
- name: Cleanup workspace
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@80b9863b45562c148927c3d53621ef354e5ae7ce # v1.0
uses: XRPLF/actions/.github/actions/cleanup-workspace@ea9970b7c211b18f4c8bcdb28c29f5711752029f
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
with:
fetch-depth: 0
# We need to fetch tags to have correct version in the release
@@ -83,18 +95,18 @@ jobs:
ref: ${{ github.ref }}
- name: Prepare runner
uses: ./.github/actions/prepare_runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: ${{ inputs.disable_cache }}
disable_ccache: ${{ !inputs.download_ccache }}
- name: Setup conan on macOS
if: runner.os == 'macOS'
if: ${{ runner.os == 'macOS' }}
shell: bash
run: ./.github/scripts/conan/init.sh
- name: Restore cache
if: ${{ !inputs.disable_cache }}
uses: ./.github/actions/restore_cache
if: ${{ inputs.download_ccache }}
uses: ./.github/actions/restore-cache
id: restore_cache
with:
conan_profile: ${{ inputs.conan_profile }}
@@ -102,18 +114,24 @@ jobs:
build_type: ${{ inputs.build_type }}
code_coverage: ${{ inputs.code_coverage }}
- name: Run conan and cmake
uses: ./.github/actions/generate
- name: Run conan
uses: ./.github/actions/conan
with:
conan_profile: ${{ inputs.conan_profile }}
build_type: ${{ inputs.build_type }}
- name: Run CMake
uses: ./.github/actions/cmake
with:
conan_profile: ${{ inputs.conan_profile }}
build_type: ${{ inputs.build_type }}
code_coverage: ${{ inputs.code_coverage }}
static: ${{ inputs.static }}
time_trace: ${{ inputs.analyze_build_time }}
use_mold: ${{ runner.os != 'macOS' }}
package: ${{ inputs.package }}
- name: Build Clio
uses: ./.github/actions/build_clio
uses: ./.github/actions/build-clio
with:
targets: ${{ inputs.targets }}
@@ -127,13 +145,13 @@ jobs:
- name: Upload build time analyze report
if: ${{ inputs.analyze_build_time }}
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: build_time_report_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build_time_report.txt
- name: Show ccache's statistics
if: ${{ !inputs.disable_cache }}
if: ${{ inputs.download_ccache }}
shell: bash
id: ccache_stats
run: |
@@ -151,29 +169,36 @@ jobs:
run: strip build/clio_integration_tests
- name: Upload clio_server
if: inputs.upload_clio_server && !inputs.code_coverage && !inputs.analyze_build_time
uses: actions/upload-artifact@v4
if: ${{ inputs.upload_clio_server && !inputs.code_coverage && !inputs.analyze_build_time }}
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: clio_server_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/clio_server
- name: Upload clio_tests
if: ${{ !inputs.code_coverage && !inputs.analyze_build_time }}
uses: actions/upload-artifact@v4
if: ${{ !inputs.code_coverage && !inputs.analyze_build_time && !inputs.package }}
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: clio_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/clio_tests
- name: Upload clio_integration_tests
if: ${{ !inputs.code_coverage && !inputs.analyze_build_time }}
uses: actions/upload-artifact@v4
if: ${{ !inputs.code_coverage && !inputs.analyze_build_time && !inputs.package }}
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: clio_integration_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/clio_integration_tests
- name: Upload Clio Linux package
if: ${{ inputs.package }}
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: clio_deb_package_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/*.deb
- name: Save cache
if: ${{ !inputs.disable_cache && github.ref == 'refs/heads/develop' }}
uses: ./.github/actions/save_cache
if: ${{ inputs.upload_ccache && github.ref == 'refs/heads/develop' }}
uses: ./.github/actions/save-cache
with:
conan_profile: ${{ inputs.conan_profile }}
ccache_dir: ${{ env.CCACHE_DIR }}
@@ -191,17 +216,19 @@ jobs:
# It's all available in the build job, but not in the test job
- name: Run code coverage
if: ${{ inputs.code_coverage }}
uses: ./.github/actions/code_coverage
uses: ./.github/actions/code-coverage
- name: Verify expected version
if: ${{ inputs.expected_version != '' }}
shell: bash
env:
INPUT_EXPECTED_VERSION: ${{ inputs.expected_version }}
run: |
set -e
EXPECTED_VERSION="clio-${{ inputs.expected_version }}"
EXPECTED_VERSION="clio-${INPUT_EXPECTED_VERSION}"
actual_version=$(./build/clio_server --version)
if [[ "$actual_version" != "$EXPECTED_VERSION" ]]; then
echo "Expected version '$EXPECTED_VERSION', but got '$actual_version'"
echo "Expected version '${EXPECTED_VERSION}', but got '${actual_version}'"
exit 1
fi
@@ -213,6 +240,6 @@ jobs:
if: ${{ inputs.code_coverage }}
name: Codecov
needs: build
uses: ./.github/workflows/upload_coverage_report.yml
uses: ./.github/workflows/reusable-upload-coverage-report.yml
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

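The "Verify expected version" step above compares the binary's self-reported version against the pushed tag, using the new intermediate environment variable. A minimal Python sketch of the same check (the `verify_expected_version` helper name is hypothetical; the `./build/clio_server` path is the one shown in the workflow):

```python
import subprocess
import sys

def verify_expected_version(expected: str, binary: str = "./build/clio_server") -> None:
    # The workflow expects the binary to report exactly "clio-<expected_version>".
    expected_version = f"clio-{expected}"
    result = subprocess.run([binary, "--version"], capture_output=True, text=True, check=True)
    actual_version = result.stdout.strip()
    if actual_version != expected_version:
        print(f"Expected version '{expected_version}', but got '{actual_version}'")
        sys.exit(1)
```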
View File

@@ -42,7 +42,7 @@ jobs:
release:
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
env:
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ github.token }}
@@ -51,32 +51,34 @@ jobs:
contents: write
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
with:
fetch-depth: 0
- name: Prepare runner
uses: ./.github/actions/prepare_runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
- uses: actions/download-artifact@v5
- uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
path: release_artifacts
pattern: clio_server_*
- name: Create release notes
shell: bash
env:
RELEASE_HEADER: ${{ inputs.header }}
run: |
echo "# Release notes" > "${RUNNER_TEMP}/release_notes.md"
echo "" >> "${RUNNER_TEMP}/release_notes.md"
printf '%s\n' "${{ inputs.header }}" >> "${RUNNER_TEMP}/release_notes.md"
printf '%s\n' "${RELEASE_HEADER}" >> "${RUNNER_TEMP}/release_notes.md"
- name: Generate changelog
shell: bash
if: ${{ inputs.generate_changelog }}
run: |
LAST_TAG="$(gh release view --json tagName -q .tagName)"
LAST_TAG="$(gh release view --json tagName -q .tagName --repo XRPLF/clio)"
LAST_TAG_COMMIT="$(git rev-parse $LAST_TAG)"
BASE_COMMIT="$(git merge-base HEAD $LAST_TAG_COMMIT)"
git-cliff "${BASE_COMMIT}..HEAD" --ignore-tags "nightly|-b|-rc"
@@ -87,7 +89,7 @@ jobs:
run: .github/scripts/prepare-release-artifacts.sh release_artifacts
- name: Upload release notes
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: release_notes_${{ inputs.version }}
path: "${RUNNER_TEMP}/release_notes.md"
@@ -95,18 +97,25 @@ jobs:
- name: Remove current release and tag
if: ${{ github.event_name != 'pull_request' && inputs.overwrite_release }}
shell: bash
env:
RELEASE_VERSION: ${{ inputs.version }}
run: |
gh release delete ${{ inputs.version }} --yes || true
git push origin :${{ inputs.version }} || true
gh release delete "${RELEASE_VERSION}" --yes || true
git push origin :"${RELEASE_VERSION}" || true
- name: Publish release
if: ${{ github.event_name != 'pull_request' }}
shell: bash
env:
RELEASE_VERSION: ${{ inputs.version }}
PRERELEASE_OPTION: ${{ inputs.prerelease && '--prerelease' || '' }}
RELEASE_TITLE: ${{ inputs.title }}
DRAFT_OPTION: ${{ inputs.draft && '--draft' || '' }}
run: |
gh release create "${{ inputs.version }}" \
${{ inputs.prerelease && '--prerelease' || '' }} \
--title "${{ inputs.title }}" \
gh release create "${RELEASE_VERSION}" \
${PRERELEASE_OPTION} \
--title "${RELEASE_TITLE}" \
--target "${GITHUB_SHA}" \
${{ inputs.draft && '--draft' || '' }} \
${DRAFT_OPTION} \
--notes-file "${RUNNER_TEMP}/release_notes.md" \
./release_artifacts/clio_server*

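A recurring change in this diff is replacing `${{ inputs.* }}` interpolation inside `run:` scripts with intermediate environment variables, so user-controlled values are never spliced into shell text. A minimal Python sketch of the underlying idea (the `publish_release` helper is hypothetical and not part of the workflow):

```python
import glob
import os
import subprocess

def publish_release(version: str, title: str, prerelease: bool, draft: bool) -> None:
    # Each value travels as its own argv entry, so the shell never re-parses it.
    cmd = ["gh", "release", "create", version,
           "--title", title,
           "--target", os.environ["GITHUB_SHA"],
           "--notes-file", os.path.join(os.environ["RUNNER_TEMP"], "release_notes.md")]
    if prerelease:
        cmd.append("--prerelease")
    if draft:
        cmd.append("--draft")
    cmd.extend(glob.glob("./release_artifacts/clio_server*"))
    subprocess.run(cmd, check=True)
```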
View File

@@ -39,22 +39,22 @@ jobs:
runs-on: ${{ inputs.runs_on }}
container: ${{ inputs.container != '' && fromJson(inputs.container) || null }}
if: inputs.run_unit_tests
if: ${{ inputs.run_unit_tests }}
env:
# TODO: remove completely when we have fixed all currently existing issues with sanitizers
SANITIZER_IGNORE_ERRORS: ${{ endsWith(inputs.conan_profile, '.asan') || endsWith(inputs.conan_profile, '.tsan') }}
SANITIZER_IGNORE_ERRORS: ${{ endsWith(inputs.conan_profile, '.tsan') || (inputs.conan_profile == 'gcc.asan' && inputs.build_type == 'Release') }}
steps:
- name: Clean workdir
- name: Cleanup workspace
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@80b9863b45562c148927c3d53621ef354e5ae7ce # v1.0
uses: XRPLF/actions/.github/actions/cleanup-workspace@ea9970b7c211b18f4c8bcdb28c29f5711752029f
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
with:
fetch-depth: 0
- uses: actions/download-artifact@v5
- uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
name: clio_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
@@ -63,15 +63,15 @@ jobs:
run: chmod +x ./clio_tests
- name: Run clio_tests (regular)
if: env.SANITIZER_IGNORE_ERRORS == 'false'
if: ${{ env.SANITIZER_IGNORE_ERRORS == 'false' }}
run: ./clio_tests
- name: Run clio_tests (sanitizer errors ignored)
if: env.SANITIZER_IGNORE_ERRORS == 'true'
run: ./.github/scripts/execute-tests-under-sanitizer ./clio_tests
if: ${{ env.SANITIZER_IGNORE_ERRORS == 'true' }}
run: ./.github/scripts/execute-tests-under-sanitizer.sh ./clio_tests
- name: Check for sanitizer report
if: env.SANITIZER_IGNORE_ERRORS == 'true'
if: ${{ env.SANITIZER_IGNORE_ERRORS == 'true' }}
shell: bash
id: check_report
run: |
@@ -82,16 +82,16 @@ jobs:
fi
- name: Upload sanitizer report
if: env.SANITIZER_IGNORE_ERRORS == 'true' && steps.check_report.outputs.found_report == 'true'
uses: actions/upload-artifact@v4
if: ${{ env.SANITIZER_IGNORE_ERRORS == 'true' && steps.check_report.outputs.found_report == 'true' }}
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: sanitizer_report_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: .sanitizer-report/*
include-hidden-files: true
- name: Create an issue
if: false && env.SANITIZER_IGNORE_ERRORS == 'true' && steps.check_report.outputs.found_report == 'true'
uses: ./.github/actions/create_issue
if: ${{ false && env.SANITIZER_IGNORE_ERRORS == 'true' && steps.check_report.outputs.found_report == 'true' }}
uses: ./.github/actions/create-issue
env:
GH_TOKEN: ${{ github.token }}
with:
@@ -108,7 +108,7 @@ jobs:
runs-on: ${{ inputs.runs_on }}
container: ${{ inputs.container != '' && fromJson(inputs.container) || null }}
if: inputs.run_integration_tests
if: ${{ inputs.run_integration_tests }}
services:
scylladb:
@@ -120,9 +120,9 @@ jobs:
--health-retries 5
steps:
- name: Clean workdir
- name: Cleanup workspace
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@80b9863b45562c148927c3d53621ef354e5ae7ce # v1.0
uses: XRPLF/actions/.github/actions/cleanup-workspace@ea9970b7c211b18f4c8bcdb28c29f5711752029f
- name: Spin up scylladb
if: ${{ runner.os == 'macOS' }}
@@ -144,7 +144,7 @@ jobs:
sleep 5
done
- uses: actions/download-artifact@v5
- uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
name: clio_integration_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}

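The "Check for sanitizer report" step above gates the artifact upload on whether the report directory contains anything. A minimal Python sketch of that check (the `.sanitizer-report` path is taken from the upload step; the helper name is hypothetical):

```python
from pathlib import Path

def sanitizer_report_found(report_dir: str = ".sanitizer-report") -> bool:
    # A report is considered found when the directory exists and holds at least one entry.
    path = Path(report_dir)
    return path.is_dir() and any(path.iterdir())

# The real step writes this value to $GITHUB_OUTPUT; printed here for illustration.
print(f"found_report={'true' if sanitizer_report_found() else 'false'}")
```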
View File

@@ -1,7 +1,6 @@
name: Upload report
on:
workflow_dispatch:
workflow_call:
secrets:
CODECOV_TOKEN:
@@ -13,19 +12,19 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
with:
fetch-depth: 0
- name: Download report artifact
uses: actions/download-artifact@v5
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
name: coverage-report.xml
path: build
- name: Upload coverage report
if: ${{ hashFiles('build/coverage_report.xml') != '' }}
uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
with:
files: build/coverage_report.xml
fail_ci_if_error: true

View File

@@ -8,14 +8,14 @@ on:
paths:
- .github/workflows/sanitizers.yml
- .github/workflows/build_and_test.yml
- .github/workflows/build_impl.yml
- .github/workflows/test_impl.yml
- .github/workflows/reusable-build-test.yml
- .github/workflows/reusable-build.yml
- .github/workflows/reusable-test.yml
- ".github/actions/**"
- "!.github/actions/build_docker_image/**"
- "!.github/actions/create_issue/**"
- .github/scripts/execute-tests-under-sanitizer
- "!.github/actions/build-docker-image/**"
- "!.github/actions/create-issue/**"
- .github/scripts/execute-tests-under-sanitizer.sh
- CMakeLists.txt
- conanfile.py
@@ -41,11 +41,12 @@ jobs:
sanitizer_ext: [.asan, .tsan, .ubsan]
build_type: [Release, Debug]
uses: ./.github/workflows/build_and_test.yml
uses: ./.github/workflows/reusable-build-test.yml
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5" }'
disable_cache: true
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
download_ccache: false
upload_ccache: false
conan_profile: ${{ matrix.compiler }}${{ matrix.sanitizer_ext }}
build_type: ${{ matrix.build_type }}
static: false

View File

@@ -3,23 +3,23 @@ name: Update CI docker image
on:
pull_request:
paths:
- .github/workflows/update_docker_ci.yml
- .github/workflows/update-docker-ci.yml
- ".github/actions/build_docker_image/**"
- ".github/actions/build-docker-image/**"
- "docker/ci/**"
- "docker/compilers/**"
- "docker/tools/**"
- "docker/**"
- "!docker/clio/**"
- "!docker/develop/**"
push:
branches: [develop]
paths:
- .github/workflows/update_docker_ci.yml
- .github/workflows/update-docker-ci.yml
- ".github/actions/build_docker_image/**"
- ".github/actions/build-docker-image/**"
- "docker/ci/**"
- "docker/compilers/**"
- "docker/tools/**"
- "docker/**"
- "!docker/clio/**"
- "!docker/develop/**"
workflow_dispatch:
concurrency:
@@ -30,8 +30,8 @@ concurrency:
env:
CLANG_MAJOR_VERSION: 19
GCC_MAJOR_VERSION: 14
GCC_VERSION: 14.3.0
GCC_MAJOR_VERSION: 15
GCC_VERSION: 15.2.0
jobs:
repo:
@@ -52,16 +52,16 @@ jobs:
needs: repo
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/compilers/gcc/**"
- uses: ./.github/actions/build_docker_image
if: steps.changed-files.outputs.any_changed == 'true'
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
@@ -69,7 +69,7 @@ jobs:
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-gcc
rippleci/clio_gcc
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_gcc' || '' }}
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/compilers/gcc
tags: |
@@ -81,7 +81,7 @@ jobs:
build_args: |
GCC_MAJOR_VERSION=${{ env.GCC_MAJOR_VERSION }}
GCC_VERSION=${{ env.GCC_VERSION }}
dockerhub_repo: rippleci/clio_gcc
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_gcc' || '' }}
dockerhub_description: GCC compiler for XRPLF/clio.
gcc-arm64:
@@ -90,7 +90,7 @@ jobs:
needs: repo
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Get changed files
id: changed-files
@@ -98,8 +98,8 @@ jobs:
with:
files: "docker/compilers/gcc/**"
- uses: ./.github/actions/build_docker_image
if: steps.changed-files.outputs.any_changed == 'true'
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
@@ -107,7 +107,7 @@ jobs:
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-gcc
rippleci/clio_gcc
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_gcc' || '' }}
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/compilers/gcc
tags: |
@@ -119,7 +119,7 @@ jobs:
build_args: |
GCC_MAJOR_VERSION=${{ env.GCC_MAJOR_VERSION }}
GCC_VERSION=${{ env.GCC_VERSION }}
dockerhub_repo: rippleci/clio_gcc
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_gcc' || '' }}
dockerhub_description: GCC compiler for XRPLF/clio.
gcc-merge:
@@ -128,44 +128,50 @@ jobs:
needs: [repo, gcc-amd64, gcc-arm64]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/compilers/gcc/**"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Login to GitHub Container Registry
if: github.event_name != 'pull_request'
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
if: ${{ github.event_name != 'pull_request' }}
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Login to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' }}
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USER }}
password: ${{ secrets.DOCKERHUB_PW }}
- name: Create and push multi-arch manifest
if: github.event_name != 'pull_request' && steps.changed-files.outputs.any_changed == 'true'
if: ${{ github.event_name != 'pull_request' && steps.changed-files.outputs.any_changed == 'true' }}
run: |
for image in ${{ needs.repo.outputs.GHCR_REPO }}/clio-gcc rippleci/clio_gcc; do
push_image() {
image=$1
docker buildx imagetools create \
-t $image:latest \
-t $image:${{ env.GCC_MAJOR_VERSION }} \
-t $image:${{ env.GCC_VERSION }} \
-t $image:${{ github.sha }} \
$image:arm64-latest \
$image:amd64-latest
done
-t $image:latest \
-t $image:${{ env.GCC_MAJOR_VERSION }} \
-t $image:${{ env.GCC_VERSION }} \
-t $image:${{ github.sha }} \
$image:arm64-latest \
$image:amd64-latest
}
push_image ${{ needs.repo.outputs.GHCR_REPO }}/clio-gcc
if [[ ${{ github.repository_owner }} == 'XRPLF' ]]; then
push_image rippleci/clio_gcc
fi
clang:
name: Build and push Clang docker image
@@ -173,16 +179,16 @@ jobs:
needs: repo
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/compilers/clang/**"
- uses: ./.github/actions/build_docker_image
if: steps.changed-files.outputs.any_changed == 'true'
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
@@ -190,7 +196,7 @@ jobs:
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-clang
rippleci/clio_clang
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_clang' || '' }}
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/compilers/clang
tags: |
@@ -200,7 +206,7 @@ jobs:
platforms: linux/amd64,linux/arm64
build_args: |
CLANG_MAJOR_VERSION=${{ env.CLANG_MAJOR_VERSION }}
dockerhub_repo: rippleci/clio_clang
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_clang' || '' }}
dockerhub_description: Clang compiler for XRPLF/clio.
tools-amd64:
@@ -209,16 +215,16 @@ jobs:
needs: [repo, gcc-merge]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/tools/**"
- uses: ./.github/actions/build_docker_image
if: steps.changed-files.outputs.any_changed == 'true'
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
@@ -240,7 +246,7 @@ jobs:
needs: [repo, gcc-merge]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Get changed files
id: changed-files
@@ -248,8 +254,8 @@ jobs:
with:
files: "docker/tools/**"
- uses: ./.github/actions/build_docker_image
if: steps.changed-files.outputs.any_changed == 'true'
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
@@ -271,27 +277,27 @@ jobs:
needs: [repo, tools-amd64, tools-arm64]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/tools/**"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Login to GitHub Container Registry
if: github.event_name != 'pull_request'
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1 # v3.5.0
if: ${{ github.event_name != 'pull_request' }}
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Create and push multi-arch manifest
if: github.event_name != 'pull_request' && steps.changed-files.outputs.any_changed == 'true'
if: ${{ github.event_name != 'pull_request' && steps.changed-files.outputs.any_changed == 'true' }}
run: |
image=${{ needs.repo.outputs.GHCR_REPO }}/clio-tools
docker buildx imagetools create \
@@ -300,14 +306,36 @@ jobs:
$image:arm64-latest \
$image:amd64-latest
pre-commit:
name: Build and push pre-commit docker image
runs-on: heavy
needs: [repo, tools-merge]
steps:
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- uses: ./.github/actions/build-docker-image
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-pre-commit
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/pre-commit
tags: |
type=raw,value=latest
type=raw,value=${{ github.sha }}
platforms: linux/amd64,linux/arm64
build_args: |
GHCR_REPO=${{ needs.repo.outputs.GHCR_REPO }}
ci:
name: Build and push CI docker image
runs-on: heavy
needs: [repo, gcc-merge, clang, tools-merge]
steps:
- uses: actions/checkout@v4
- uses: ./.github/actions/build_docker_image
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- uses: ./.github/actions/build-docker-image
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
@@ -315,7 +343,7 @@ jobs:
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-ci
rippleci/clio_ci
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_ci' || '' }}
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/ci
tags: |
@@ -328,5 +356,5 @@ jobs:
CLANG_MAJOR_VERSION=${{ env.CLANG_MAJOR_VERSION }}
GCC_MAJOR_VERSION=${{ env.GCC_MAJOR_VERSION }}
GCC_VERSION=${{ env.GCC_VERSION }}
dockerhub_repo: rippleci/clio_ci
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_ci' || '' }}
dockerhub_description: CI image for XRPLF/clio.

View File

@@ -18,24 +18,20 @@ on:
pull_request:
branches: [develop]
paths:
- .github/workflows/upload_conan_deps.yml
- .github/workflows/upload-conan-deps.yml
- .github/actions/generate/action.yml
- .github/actions/prepare_runner/action.yml
- .github/actions/conan/action.yml
- ".github/scripts/conan/**"
- "!.github/scripts/conan/apple-clang-local.profile"
- conanfile.py
- conan.lock
push:
branches: [develop]
paths:
- .github/workflows/upload_conan_deps.yml
- .github/workflows/upload-conan-deps.yml
- .github/actions/generate/action.yml
- .github/actions/prepare_runner/action.yml
- .github/actions/conan/action.yml
- ".github/scripts/conan/**"
- "!.github/scripts/conan/apple-clang-local.profile"
- conanfile.py
- conan.lock
@@ -50,7 +46,7 @@ jobs:
outputs:
matrix: ${{ steps.set-matrix.outputs.matrix }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Calculate conan matrix
id: set-matrix
@@ -64,6 +60,7 @@ jobs:
strategy:
fail-fast: false
matrix: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
max-parallel: 10
runs-on: ${{ matrix.os }}
container: ${{ matrix.container != '' && fromJson(matrix.container) || null }}
@@ -72,23 +69,23 @@ jobs:
CONAN_PROFILE: ${{ matrix.compiler }}${{ matrix.sanitizer_ext }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Prepare runner
uses: ./.github/actions/prepare_runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
- name: Setup conan on macOS
if: runner.os == 'macOS'
if: ${{ runner.os == 'macOS' }}
shell: bash
run: ./.github/scripts/conan/init.sh
- name: Show conan profile
run: conan profile show --profile:all ${{ env.CONAN_PROFILE }}
- name: Run conan and cmake
uses: ./.github/actions/generate
- name: Run conan
uses: ./.github/actions/conan
with:
conan_profile: ${{ env.CONAN_PROFILE }}
# We check that everything builds fine from source on scheduled runs
@@ -97,9 +94,11 @@ jobs:
build_type: ${{ matrix.build_type }}
- name: Login to Conan
if: github.event_name != 'pull_request'
run: conan remote login -p ${{ secrets.CONAN_PASSWORD }} ripple ${{ secrets.CONAN_USERNAME }}
if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' }}
run: conan remote login -p ${{ secrets.CONAN_PASSWORD }} xrplf ${{ secrets.CONAN_USERNAME }}
- name: Upload Conan packages
if: github.event_name != 'pull_request' && github.event_name != 'schedule'
run: conan upload "*" -r=ripple --confirm ${{ github.event.inputs.force_upload == 'true' && '--force' || '' }}
if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' && github.event_name != 'schedule' }}
env:
FORCE_OPTION: ${{ github.event.inputs.force_upload == 'true' && '--force' || '' }}
run: conan upload "*" -r=xrplf --confirm ${FORCE_OPTION}

View File

@@ -16,7 +16,7 @@ exclude: ^(docs/doxygen-awesome-theme/|conan\.lock$)
repos:
# `pre-commit sample-config` default hooks
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: cef0300fd0fc4d2a87a85fa2093c6b283ea36f4b # frozen: v5.0.0
rev: 3e8a8703264a2f4a69428a0aa4dcb512790b2c8c # frozen: v6.0.0
hooks:
- id: check-added-large-files
- id: check-executables-have-shebangs
@@ -37,13 +37,13 @@ repos:
exclude: LICENSE.md
- repo: https://github.com/hadolint/hadolint
rev: c3dc18df7a501f02a560a2cc7ba3c69a85ca01d3 # frozen: v2.13.1-beta
rev: 4e697ba704fd23b2409b947a319c19c3ee54d24f # frozen: v2.14.0
hooks:
- id: hadolint-docker
# hadolint-docker is a special hook that runs hadolint in a Docker container
# Docker is not installed in the environment where pre-commit is run
stages: [manual]
entry: hadolint/hadolint:v2.12.1-beta hadolint
entry: hadolint/hadolint:v2.14.0 hadolint
- repo: https://github.com/codespell-project/codespell
rev: 63c8f8312b7559622c0d82815639671ae42132ac # frozen: v2.4.1
@@ -55,12 +55,6 @@ repos:
--ignore-words=pre-commit-hooks/codespell_ignore.txt,
]
- repo: https://github.com/trufflesecurity/trufflehog
rev: a05cf0859455b5b16317ee22d809887a4043cdf0 # frozen: v3.90.2
hooks:
- id: trufflehog
entry: trufflehog git file://. --since-commit HEAD --max-depth=1 --no-verification --fail
# Running some C++ hooks before clang-format
# to ensure that the style is consistent.
- repo: local
@@ -86,7 +80,7 @@ repos:
language: script
- repo: https://github.com/pre-commit/mirrors-clang-format
rev: 182152eb8c5ce1cf5299b956b04392c86bd8a126 # frozen: v20.1.8
rev: 719856d56a62953b8d2839fb9e851f25c3cfeef8 # frozen: v21.1.2
hooks:
- id: clang-format
args: [--style=file]

View File

@@ -11,12 +11,11 @@ option(integration_tests "Build integration tests" FALSE)
option(benchmark "Build benchmarks" FALSE)
option(docs "Generate doxygen docs" FALSE)
option(coverage "Build test coverage report" FALSE)
option(packaging "Create distribution packages" FALSE)
option(package "Create distribution packages" FALSE)
option(lint "Run clang-tidy checks during compilation" FALSE)
option(static "Statically linked Clio" FALSE)
option(snapshot "Build snapshot tool" FALSE)
option(time_trace "Build using -ftime-trace to create compiler trace reports" FALSE)
option(use_mold "Use mold linker" FALSE)
# ========================================================================== #
set(san "" CACHE STRING "Add sanitizer instrumentation")
@@ -40,11 +39,6 @@ if (verbose)
set(CMAKE_VERBOSE_MAKEFILE TRUE)
endif ()
if (packaging)
add_definitions(-DPKG=1)
target_compile_definitions(clio_options INTERFACE PKG=1)
endif ()
# Clio tweaks and checks
include(CheckCompiler)
include(Settings)
@@ -58,6 +52,7 @@ include(deps/Threads)
include(deps/libfmt)
include(deps/cassandra)
include(deps/libbacktrace)
include(deps/spdlog)
add_subdirectory(src)
add_subdirectory(tests)
@@ -93,8 +88,8 @@ if (docs)
endif ()
include(install/install)
if (packaging)
include(cmake/packaging.cmake) # This file exists only in build runner
if (package)
include(ClioPackage)
endif ()
if (snapshot)

View File

@@ -34,7 +34,6 @@ Below are some useful docs to learn more about Clio.
- [How to configure Clio and rippled](./docs/configure-clio.md)
- [How to run Clio](./docs/run-clio.md)
- [Logging](./docs/logging.md)
- [Troubleshooting guide](./docs/trouble_shooting.md)
**General reference material:**

View File

@@ -14,5 +14,5 @@ target_sources(
include(deps/gbench)
target_include_directories(clio_benchmark PRIVATE .)
target_link_libraries(clio_benchmark PUBLIC clio_util benchmark::benchmark_main)
target_link_libraries(clio_benchmark PUBLIC clio_util benchmark::benchmark_main spdlog::spdlog)
set_target_properties(clio_benchmark PROPERTIES RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR})

View File

@@ -22,26 +22,36 @@
#include "util/prometheus/Prometheus.hpp"
#include <benchmark/benchmark.h>
#include <boost/log/core/core.hpp>
#include <boost/log/utility/setup/common_attributes.hpp>
#include <fmt/format.h>
#include <spdlog/async.h>
#include <spdlog/async_logger.h>
#include <spdlog/spdlog.h>
#include <barrier>
#include <chrono>
#include <cstddef>
#include <filesystem>
#include <memory>
#include <string>
#include <thread>
#include <utility>
#include <vector>
using namespace util;
struct BenchmarkLoggingInitializer {
static constexpr auto kLOG_FORMAT = "%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%";
static constexpr auto kLOG_FORMAT = "%Y-%m-%d %H:%M:%S.%f %^%3!l:%n%$ - %v";
static void
initFileLogging(LogService::FileLoggingParams const& params)
struct BenchmarkLoggingInitializer {
[[nodiscard]] static std::shared_ptr<spdlog::sinks::sink>
createFileSink(LogService::FileLoggingParams const& params)
{
LogService::initFileLogging(params, kLOG_FORMAT);
return LogService::createFileSink(params, kLOG_FORMAT);
}
static Logger
getLogger(std::shared_ptr<spdlog::logger> logger)
{
return Logger(std::move(logger));
}
};
@@ -53,7 +63,7 @@ uniqueLogDir()
auto const epochTime = std::chrono::high_resolution_clock::now().time_since_epoch();
auto const tmpDir = std::filesystem::temp_directory_path();
std::string const dirName =
"logs_" + std::to_string(std::chrono::duration_cast<std::chrono::microseconds>(epochTime).count());
fmt::format("logs_{}", std::chrono::duration_cast<std::chrono::microseconds>(epochTime).count());
return tmpDir / "clio_benchmark" / dirName;
}
@@ -62,8 +72,6 @@ uniqueLogDir()
static void
benchmarkConcurrentFileLogging(benchmark::State& state)
{
boost::log::add_common_attributes();
auto const numThreads = static_cast<size_t>(state.range(0));
auto const messagesPerThread = static_cast<size_t>(state.range(1));
@@ -74,12 +82,14 @@ benchmarkConcurrentFileLogging(benchmark::State& state)
state.PauseTiming();
std::filesystem::create_directories(logDir);
static constexpr size_t kQUEUE_SIZE = 8192;
static constexpr size_t kTHREAD_COUNT = 1;
spdlog::init_thread_pool(kQUEUE_SIZE, kTHREAD_COUNT);
BenchmarkLoggingInitializer::initFileLogging({
auto fileSink = BenchmarkLoggingInitializer::createFileSink({
.logDir = logDir,
.rotationSizeMB = 5,
.dirMaxSizeMB = 125,
.rotationHours = 24,
.dirMaxFiles = 25,
});
std::vector<std::thread> threads;
@@ -92,10 +102,15 @@ benchmarkConcurrentFileLogging(benchmark::State& state)
});
for (size_t threadNum = 0; threadNum < numThreads; ++threadNum) {
threads.emplace_back([threadNum, messagesPerThread, &barrier]() {
barrier.arrive_and_wait();
threads.emplace_back([threadNum, messagesPerThread, fileSink, &barrier]() {
std::string const channel = fmt::format("Thread_{}", threadNum);
auto logger = std::make_shared<spdlog::async_logger>(
channel, fileSink, spdlog::thread_pool(), spdlog::async_overflow_policy::block
);
spdlog::register_logger(logger);
Logger const threadLogger = BenchmarkLoggingInitializer::getLogger(std::move(logger));
Logger const threadLogger("Thread_" + std::to_string(threadNum));
barrier.arrive_and_wait();
for (size_t messageNum = 0; messageNum < messagesPerThread; ++messageNum) {
LOG(threadLogger.info()) << "Test log message #" << messageNum;
@@ -106,10 +121,9 @@ benchmarkConcurrentFileLogging(benchmark::State& state)
for (auto& thread : threads) {
thread.join();
}
boost::log::core::get()->flush();
spdlog::shutdown();
auto const end = std::chrono::high_resolution_clock::now();
state.SetIterationTime(std::chrono::duration_cast<std::chrono::duration<double>>(end - start).count());
std::filesystem::remove_all(logDir);
@@ -129,7 +143,7 @@ BENCHMARK(benchmarkConcurrentFileLogging)
// Number of threads
{1, 2, 4, 8},
// Messages per thread
{10'000, 100'000, 500'000},
{10'000, 100'000, 500'000, 1'000'000, 10'000'000},
})
->UseManualTime()
->Unit(benchmark::kMillisecond);

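The benchmark keeps its barrier-synchronized, manually timed structure while moving from Boost.Log to per-thread async spdlog loggers that share one file sink. A language-agnostic sketch of that harness shape in Python, purely illustrative (the real code is the C++ above):

```python
import threading
import time

def run_concurrent_logging_benchmark(num_threads: int, messages_per_thread: int, log_fn) -> float:
    # All workers block on a barrier so the logging burst starts at the same moment.
    barrier = threading.Barrier(num_threads + 1)

    def worker(thread_num: int) -> None:
        barrier.wait()
        for message_num in range(messages_per_thread):
            log_fn(f"Thread_{thread_num}: Test log message #{message_num}")

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(num_threads)]
    for t in threads:
        t.start()
    start = time.perf_counter()
    barrier.wait()  # release every worker at once
    for t in threads:
        t.join()
    return time.perf_counter() - start  # manual timing, as with UseManualTime() above

# e.g. run_concurrent_logging_benchmark(4, 10_000, print)
```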
8
cmake/ClioPackage.cmake Normal file
View File

@@ -0,0 +1,8 @@
include("${CMAKE_CURRENT_LIST_DIR}/ClioVersion.cmake")
set(CPACK_PACKAGING_INSTALL_PREFIX "/opt/clio")
set(CPACK_PACKAGE_VERSION "${CLIO_VERSION}")
set(CPACK_STRIP_FILES TRUE)
include(pkg/deb)
include(CPack)

View File

@@ -6,15 +6,17 @@ execute_process(
WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}
OUTPUT_VARIABLE TAG
RESULT_VARIABLE RC
OUTPUT_STRIP_TRAILING_WHITESPACE
ERROR_VARIABLE ERR
OUTPUT_STRIP_TRAILING_WHITESPACE ERROR_STRIP_TRAILING_WHITESPACE
)
if (RC EQUAL 0)
# if we are on a tag, use the tag name
message(STATUS "Found tag '${TAG}' in git. Will use it as Clio version")
set(CLIO_VERSION "${TAG}")
set(DOC_CLIO_VERSION "${TAG}")
else ()
# if not, use YYYYMMDDHMS-<branch>-<git-rev>
message(STATUS "Error finding tag in git: ${ERR}")
message(STATUS "Will use 'YYYYMMDDHMS-<branch>-<git-rev>' as Clio version")
set(GIT_COMMAND show -s --date=format:%Y%m%d%H%M%S --format=%cd)
execute_process(

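When no tag is found, the version falls back to a timestamp-branch-revision string; the commit timestamp comes from the `git show` format shown above. A rough Python sketch of that fallback (the branch and revision commands are assumptions, since the rest of the CMake hunk is not shown here):

```python
import subprocess

def git_output(*args: str) -> str:
    return subprocess.run(["git", *args], capture_output=True, text=True, check=True).stdout.strip()

def fallback_clio_version() -> str:
    # Mirrors the documented 'YYYYMMDDHMS-<branch>-<git-rev>' fallback format.
    date = git_output("show", "-s", "--date=format:%Y%m%d%H%M%S", "--format=%cd")
    branch = git_output("rev-parse", "--abbrev-ref", "HEAD")  # assumption
    rev = git_output("rev-parse", "--short", "HEAD")          # assumption
    return f"{date}-{branch}-{rev}"
```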
View File

@@ -1,8 +1,11 @@
if (use_mold)
if (CMAKE_SYSTEM_NAME STREQUAL "Linux")
message(STATUS "Using Mold linker")
set(CMAKE_LINKER_TYPE MOLD)
else ()
message(FATAL_ERROR "Mold linker is only supported on Linux.")
endif ()
if (DEFINED CMAKE_LINKER_TYPE)
message(STATUS "Custom linker is already set: ${CMAKE_LINKER_TYPE}")
return()
endif ()
find_program(MOLD_PATH mold)
if (MOLD_PATH AND CMAKE_SYSTEM_NAME STREQUAL "Linux")
message(STATUS "Using Mold linker: ${MOLD_PATH}")
set(CMAKE_LINKER_TYPE MOLD)
endif ()

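The rewritten Mold.cmake no longer fails when mold is unavailable: it respects an already-configured linker and otherwise opts into mold only when the binary is found on Linux. The same decision, sketched in Python for illustration only:

```python
import platform
import shutil

def choose_linker(preset_linker: str | None) -> str | None:
    # Keep any linker type that was already configured explicitly.
    if preset_linker:
        return preset_linker
    # Otherwise use mold only if it is installed and the host is Linux.
    if shutil.which("mold") and platform.system() == "Linux":
        return "MOLD"
    return None
```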
5
cmake/deps/spdlog.cmake Normal file
View File

@@ -0,0 +1,5 @@
find_package(spdlog REQUIRED)
if (NOT TARGET spdlog::spdlog)
message(FATAL_ERROR "spdlog::spdlog target not found")
endif ()

View File

@@ -1,17 +0,0 @@
[Unit]
Description=Clio XRPL API server
Documentation=https://github.com/XRPLF/clio.git
After=network-online.target
Wants=network-online.target
[Service]
Type=simple
ExecStart=@CLIO_INSTALL_DIR@/bin/clio_server @CLIO_INSTALL_DIR@/etc/config.json
Restart=on-failure
User=clio
Group=clio
LimitNOFILE=65536
[Install]
WantedBy=multi-user.target

View File

@@ -1,13 +1,13 @@
set(CLIO_INSTALL_DIR "/opt/clio")
set(CMAKE_INSTALL_PREFIX ${CLIO_INSTALL_DIR})
set(CMAKE_INSTALL_PREFIX "${CLIO_INSTALL_DIR}" CACHE PATH "Install prefix" FORCE)
install(TARGETS clio_server DESTINATION bin)
set(CPACK_PACKAGING_INSTALL_PREFIX "${CMAKE_INSTALL_PREFIX}")
include(GNUInstallDirs)
install(TARGETS clio_server DESTINATION "${CMAKE_INSTALL_BINDIR}")
file(READ docs/examples/config/example-config.json config)
string(REGEX REPLACE "./clio_log" "/var/log/clio/" config "${config}")
file(WRITE ${CMAKE_BINARY_DIR}/install-config.json "${config}")
install(FILES ${CMAKE_BINARY_DIR}/install-config.json DESTINATION etc RENAME config.json)
configure_file("${CMAKE_SOURCE_DIR}/cmake/install/clio.service.in" "${CMAKE_BINARY_DIR}/clio.service")
install(FILES "${CMAKE_BINARY_DIR}/clio.service" DESTINATION /lib/systemd/system)

12
cmake/pkg/deb.cmake Normal file
View File

@@ -0,0 +1,12 @@
set(CPACK_GENERATOR "DEB")
set(CPACK_DEBIAN_PACKAGE_HOMEPAGE "https://github.com/XRPLF/clio")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "Ripple Labs Inc. <support@ripple.com>")
set(CPACK_DEBIAN_FILE_NAME DEB-DEFAULT)
set(CPACK_DEBIAN_PACKAGE_SHLIBDEPS ON)
set(CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA ${CMAKE_SOURCE_DIR}/cmake/pkg/postinst)
# We must replace "-" with "~" otherwise dpkg will sort "X.Y.Z-b1" as greater than "X.Y.Z"
string(REPLACE "-" "~" git "${CPACK_PACKAGE_VERSION}")

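The `-` to `~` substitution matters because dpkg orders `X.Y.Z-b1` after `X.Y.Z`, while `X.Y.Z~b1` sorts before it, which is what a pre-release should do. A quick way to confirm this, sketched with example version strings (shells out to `dpkg --compare-versions`, so it needs dpkg installed):

```python
import subprocess

def dpkg_less_than(a: str, b: str) -> bool:
    # dpkg exits with status 0 when the comparison holds.
    return subprocess.run(["dpkg", "--compare-versions", a, "lt", b]).returncode == 0

assert dpkg_less_than("2.5.0~b1", "2.5.0")      # tilde: pre-release sorts before the release
assert not dpkg_less_than("2.5.0-b1", "2.5.0")  # hyphen: would sort after the release
```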
46
cmake/pkg/postinst Executable file
View File

@@ -0,0 +1,46 @@
#!/bin/sh
set -e
USER_NAME=clio
GROUP_NAME="${USER_NAME}"
CLIO_EXECUTABLE="clio_server"
CLIO_PREFIX="/opt/clio"
CLIO_BIN="$CLIO_PREFIX/bin/${CLIO_EXECUTABLE}"
CLIO_CONFIG="$CLIO_PREFIX/etc/config.json"
case "$1" in
configure)
if ! id -u "$USER_NAME" >/dev/null 2>&1; then
# Users who should not have a home directory should have their home directory set to /nonexistent
# https://www.debian.org/doc/debian-policy/ch-opersys.html#non-existent-home-directories
useradd \
--system \
--home-dir /nonexistent \
--no-create-home \
--shell /usr/sbin/nologin \
--comment "system user for ${CLIO_EXECUTABLE}" \
--user-group \
${USER_NAME}
fi
install -d -o "$USER_NAME" -g "$GROUP_NAME" /var/log/clio
if [ -f "$CLIO_CONFIG" ]; then
chown "$USER_NAME:$GROUP_NAME" "$CLIO_CONFIG"
fi
chown -R "$USER_NAME:$GROUP_NAME" "$CLIO_PREFIX"
ln -sf "$CLIO_BIN" "/usr/bin/${CLIO_EXECUTABLE}"
;;
abort-upgrade|abort-remove|abort-deconfigure)
;;
*)
echo "postinst called with unknown argument \`$1'" >&2
exit 1
;;
esac
exit 0

View File

@@ -1,40 +1,41 @@
{
"version": "0.5",
"requires": [
"zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1754412218.488",
"xxhash/0.8.2#7856c968c985b2981b707ee8f2413b2b%1754325010.01",
"xrpl/2.5.0@clio/boost-odr#f68e48da1490c0a583052e4f068ada55%1754325014.392",
"sqlite3/3.47.0#7a0904fd061f5f8a2366c294f9387830%1754325009.708",
"soci/4.0.3#a9f8d773cd33e356b5879a4b0564f287%1754412158.144",
"re2/20230301#dfd6e2bf050eb90ddd8729cfb4c844a4%1754412148.014",
"zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1756234269.497",
"xxhash/0.8.3#681d36a0a6111fc56e5e45ea182c19cc%1756234289.683",
"xrpl/2.6.1#973af2bf9631f239941dd9f5a100bb84%1759275059.342",
"sqlite3/3.49.1#8631739a4c9b93bd3d6b753bac548a63%1756234266.869",
"spdlog/1.15.3#3ca0e9e6b83af4d0151e26541d140c86%1754401846.61",
"soci/4.0.3#a9f8d773cd33e356b5879a4b0564f287%1756234262.318",
"re2/20230301#dfd6e2bf050eb90ddd8729cfb4c844a4%1756234257.976",
"rapidjson/cci.20220822#1b9d8c2256876a154172dc5cfbe447c6%1754325007.656",
"protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1754412120.055",
"openssl/1.1.1v#216374e4fb5b2e0f5ab1fb6f27b5b434%1754325006.553",
"nudb/2.0.8#63990d3e517038e04bf529eb8167f69f%1754325004.398",
"protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1756234251.614",
"openssl/1.1.1w#a8f0792d7c5121b954578a7149d23e03%1756223730.729",
"nudb/2.0.9#c62cfd501e57055a7e0d8ee3d5e5427d%1756234237.107",
"minizip/1.2.13#9e87d57804bd372d6d1e32b1871517a3%1754325004.374",
"lz4/1.10.0#59fc63cac7f10fbe8e05c7e62c2f3504%1754412069.24",
"lz4/1.10.0#59fc63cac7f10fbe8e05c7e62c2f3504%1756234228.999",
"libuv/1.46.0#dc28c1f653fa197f00db5b577a6f6011%1754325003.592",
"libiconv/1.17#1e65319e945f2d31941a9d28cc13c058%1754325002.385",
"libbacktrace/cci.20210118#a7691bfccd8caaf66309df196790a5a1%1754410401.723",
"libarchive/3.7.6#7e902bb4ac1d20504fb330c0dd040192%1754325001.673",
"libiconv/1.17#1e65319e945f2d31941a9d28cc13c058%1756223727.64",
"libbacktrace/cci.20210118#a7691bfccd8caaf66309df196790a5a1%1756230911.03",
"libarchive/3.8.1#5cf685686322e906cb42706ab7e099a8%1756234256.696",
"http_parser/2.9.4#98d91690d6fd021e9e624218a85d9d97%1754325001.385",
"gtest/1.14.0#f8f0757a574a8dd747d16af62d6eb1b7%1754325000.842",
"grpc/1.50.1#02291451d1e17200293a409410d1c4e1%1754412166.358",
"grpc/1.50.1#02291451d1e17200293a409410d1c4e1%1756234248.958",
"fmt/11.2.0#579bb2cdf4a7607621beea4eb4651e0f%1754324999.086",
"date/3.0.3#cf28fe9c0aab99fe12da08aa42df65e1%1754324996.727",
"doctest/2.4.11#a4211dfc329a16ba9f280f9574025659%1756234220.819",
"date/3.0.4#f74bbba5a08fa388256688743136cb6f%1756234217.493",
"cassandra-cpp-driver/2.17.0#e50919efac8418c26be6671fd702540a%1754324997.363",
"c-ares/1.34.5#b78b91e7cfb1f11ce777a285bbf169c6%1754412042.679",
"bzip2/1.0.8#00b4a4658791c1f06914e087f0e792f5%1754412214.559",
"c-ares/1.34.5#b78b91e7cfb1f11ce777a285bbf169c6%1756234217.915",
"bzip2/1.0.8#00b4a4658791c1f06914e087f0e792f5%1756234261.716",
"boost/1.83.0#5d975011d65b51abb2d2f6eb8386b368%1754325043.336",
"benchmark/1.9.4#ce4403f7a24d3e1f907cd9da4b678be4%1749892625.885",
"abseil/20230802.1#f0f91485b111dc9837a68972cb19ca7b%1754412054.336"
"benchmark/1.9.4#ce4403f7a24d3e1f907cd9da4b678be4%1754578869.672",
"abseil/20230802.1#f0f91485b111dc9837a68972cb19ca7b%1756234220.907"
],
"build_requires": [
"zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1754412218.488",
"protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1754412120.055",
"protobuf/3.21.9#64ce20e1d9ea24f3d6c504015d5f6fa8%1754325048.831",
"cmake/3.31.8#dde3bde00bb843687e55aea5afa0e220%1754412060.968",
"b2/5.3.3#107c15377719889654eb9a162a673975%1754412065.321"
"zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1756234269.497",
"protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1756234251.614",
"cmake/3.31.8#dde3bde00bb843687e55aea5afa0e220%1756234232.89",
"b2/5.3.3#107c15377719889654eb9a162a673975%1756234226.28"
],
"python_requires": [],
"overrides": {
@@ -42,7 +43,7 @@
null,
"boost/1.83.0#5d975011d65b51abb2d2f6eb8386b368"
],
"protobuf/3.21.9": [
"protobuf/3.21.12": [
null,
"protobuf/3.21.12"
],
@@ -50,7 +51,7 @@
"lz4/1.10.0"
],
"sqlite3/3.44.2": [
"sqlite3/3.47.0"
"sqlite3/3.49.1"
]
},
"config_requires": []

View File

@@ -9,20 +9,7 @@ class ClioConan(ConanFile):
url = 'https://github.com/xrplf/clio'
description = 'Clio RPC server'
settings = 'os', 'compiler', 'build_type', 'arch'
options = {
'static': [True, False], # static linkage
'verbose': [True, False],
'tests': [True, False], # build unit tests; create `clio_tests` binary
'integration_tests': [True, False], # build integration tests; create `clio_integration_tests` binary
'benchmark': [True, False], # build benchmarks; create `clio_benchmarks` binary
'docs': [True, False], # doxygen API docs; create custom target 'docs'
'packaging': [True, False], # create distribution packages
'coverage': [True, False], # build for test coverage report; create custom target `clio_tests-ccov`
'lint': [True, False], # run clang-tidy checks during compilation
'snapshot': [True, False], # build export/import snapshot tool
'time_trace': [True, False], # build using -ftime-trace to create compiler trace reports
'use_mold': [True, False], # use mold linker for faster linking
}
options = {}
requires = [
'boost/1.83.0',
@@ -30,26 +17,14 @@ class ClioConan(ConanFile):
'fmt/11.2.0',
'protobuf/3.21.12',
'grpc/1.50.1',
'openssl/1.1.1v',
'xrpl/2.5.0@clio/boost-odr',
'openssl/1.1.1w',
'xrpl/2.6.1',
'zlib/1.3.1',
'libbacktrace/cci.20210118'
'libbacktrace/cci.20210118',
'spdlog/1.15.3',
]
default_options = {
'static': False,
'verbose': False,
'tests': False,
'integration_tests': False,
'benchmark': False,
'packaging': False,
'coverage': False,
'lint': False,
'docs': False,
'snapshot': False,
'time_trace': False,
'use_mold': False,
'xrpl/*:tests': False,
'xrpl/*:rocksdb': False,
'cassandra-cpp-driver/*:shared': False,
@@ -70,10 +45,8 @@ class ClioConan(ConanFile):
)
def requirements(self):
if self.options.tests or self.options.integration_tests:
self.requires('gtest/1.14.0')
if self.options.benchmark:
self.requires('benchmark/1.9.4')
self.requires('gtest/1.14.0')
self.requires('benchmark/1.9.4')
def configure(self):
if self.settings.compiler == 'apple-clang':
@@ -89,8 +62,6 @@ class ClioConan(ConanFile):
def generate(self):
tc = CMakeToolchain(self)
for option_name, option_value in self.options.items():
tc.variables[option_name] = option_value
tc.generate()
def build(self):

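The recipe no longer mirrors every CMake option: `options` is empty, the option-to-CMake-variable loop in `generate()` is gone, and gtest/benchmark are required unconditionally. A minimal sketch of the resulting shape (illustrative only, not the full recipe):

```python
from conan import ConanFile
from conan.tools.cmake import CMakeToolchain

class ClioConanSketch(ConanFile):
    settings = "os", "compiler", "build_type", "arch"
    options = {}  # build features are now toggled via CMake -D options instead

    def requirements(self):
        # Test and benchmark packages are pulled unconditionally.
        self.requires("gtest/1.14.0")
        self.requires("benchmark/1.9.4")

    def generate(self):
        CMakeToolchain(self).generate()  # no per-option variables are forwarded anymore
```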
View File

@@ -20,43 +20,54 @@ SHELL ["/bin/bash", "-o", "pipefail", "-c"]
USER root
WORKDIR /root
ARG LLVM_TOOLS_VERSION=20
# Add repositories
# Install common tools and dependencies
RUN apt-get update \
&& apt-get install -y --no-install-recommends --no-install-suggests \
curl \
dpkg-dev \
file \
git \
git-lfs \
gnupg \
graphviz \
jq \
# libgmp, libmpfr and libncurses are gdb dependencies
libgmp-dev \
libmpfr-dev \
libncurses-dev \
make \
ninja-build \
wget \
zip \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& echo "deb http://apt.llvm.org/focal/ llvm-toolchain-focal-${LLVM_TOOLS_VERSION} main" >> /etc/apt/sources.list \
&& rm -rf /var/lib/apt/lists/*
# Install Python tools
RUN apt-get update \
&& apt-get install -y --no-install-recommends --no-install-suggests \
python3 \
python3-pip \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
RUN pip install -q --no-cache-dir \
# TODO: Remove this once we switch to newer Ubuntu base image
# lxml 6.0.0 is not compatible with our image
'lxml<6.0.0' \
cmake \
conan==2.20.1 \
gcovr
# Install LLVM tools
ARG LLVM_TOOLS_VERSION=20
RUN echo "deb http://apt.llvm.org/focal/ llvm-toolchain-focal-${LLVM_TOOLS_VERSION} main" >> /etc/apt/sources.list \
&& wget --progress=dot:giga -O - https://apt.llvm.org/llvm-snapshot.gpg.key | apt-key add -
# Install packages
RUN apt-get update \
&& apt-get install -y --no-install-recommends --no-install-suggests \
clang-tidy-${LLVM_TOOLS_VERSION} \
clang-tools-${LLVM_TOOLS_VERSION} \
git \
git-lfs \
graphviz \
jq \
make \
ninja-build \
python3 \
python3-pip \
zip \
&& pip3 install -q --upgrade --no-cache-dir pip \
&& pip3 install -q --no-cache-dir \
# TODO: Remove this once we switch to newer Ubuntu base image
# lxml 6.0.0 is not compatible with our image
'lxml<6.0.0' \
\
cmake \
conan==2.17.0 \
gcovr \
pre-commit \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
@@ -92,12 +103,13 @@ COPY --from=clio-tools \
/usr/local/bin/ClangBuildAnalyzer \
/usr/local/bin/git-cliff \
/usr/local/bin/gh \
/usr/local/bin/gdb \
/usr/local/bin/
WORKDIR /root
# Setup conan
RUN conan remote add --index 0 ripple https://conan.ripplex.io
RUN conan remote add --index 0 xrplf https://conan.ripplex.io
WORKDIR /root/.conan2
COPY conan/global.conf ./global.conf

View File

@@ -8,12 +8,14 @@ The image is based on Ubuntu 20.04 and contains:
- ccache 4.11.3
- Clang 19
- ClangBuildAnalyzer 1.6.0
- Conan 2.17.0
- Doxygen 1.12
- GCC 14.3.0
- Conan 2.20.1
- Doxygen 1.14
- GCC 15.2.0
- GDB 16.3
- gh 2.74
- git-cliff 2.9.1
- mold 2.40.1
- Python 3.13
- and some other useful tools
Conan is set up to build Clio without any additional steps.

View File

@@ -4,8 +4,8 @@ build_type=Release
compiler=gcc
compiler.cppstd=20
compiler.libcxx=libstdc++11
compiler.version=14
compiler.version=15
os=Linux
[conf]
tools.build:compiler_executables={"c": "/usr/bin/gcc-14", "cpp": "/usr/bin/g++-14"}
tools.build:compiler_executables={"c": "/usr/bin/gcc-15", "cpp": "/usr/bin/g++-15"}

View File

@@ -13,8 +13,8 @@ Clio repository provides an [example](https://github.com/XRPLF/clio/blob/develop
Config file recommendations:
- Set `log_to_console` to `false` if you want to avoid logs being written to `stdout`.
- Set `log_directory` to `/opt/clio/log` to store logs in a volume.
- Set `log.enable_console` to `false` if you want to avoid logs being written to `stdout`.
- Set `log.directory` to `/opt/clio/log` to store logs in a volume.
## Usage

View File

@@ -18,7 +18,7 @@ RUN apt-get update \
ARG CLANG_MAJOR_VERSION=invalid
# Bump this version to force rebuild of the image
ARG BUILD_VERSION=0
ARG BUILD_VERSION=1
RUN wget --progress=dot:giga https://apt.llvm.org/llvm.sh \
&& chmod +x llvm.sh \

View File

@@ -8,7 +8,7 @@ ARG UBUNTU_VERSION
ARG GCC_MAJOR_VERSION
ARG BUILD_VERSION=0
ARG BUILD_VERSION=1
ARG DEBIAN_FRONTEND=noninteractive
ARG TARGETARCH

View File

@@ -1,4 +1,4 @@
Package: gcc-14-ubuntu-UBUNTUVERSION
Package: gcc-15-ubuntu-UBUNTUVERSION
Version: VERSION
Architecture: TARGETARCH
Maintainer: Alex Kremer <akremer@ripple.com>

View File

@@ -1,6 +1,6 @@
services:
clio_develop:
image: ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
volumes:
- clio_develop_conan_data:/root/.conan2/p
- clio_develop_ccache:/root/.ccache

View File

@@ -0,0 +1,38 @@
ARG GHCR_REPO=invalid
FROM ${GHCR_REPO}/clio-tools:latest AS clio-tools
# We're using Ubuntu 24.04 to have a more recent version of Python
FROM ubuntu:24.04
ARG DEBIAN_FRONTEND=noninteractive
SHELL ["/bin/bash", "-o", "pipefail", "-c"]
# hadolint ignore=DL3002
USER root
WORKDIR /root
# Install common tools and dependencies
RUN apt-get update \
&& apt-get install -y --no-install-recommends --no-install-suggests \
curl \
git \
libatomic1 \
software-properties-common \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
# Install Python tools
RUN apt-get update \
&& apt-get install -y --no-install-recommends --no-install-suggests \
python3 \
python3-pip \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
RUN pip install -q --no-cache-dir --break-system-packages \
pre-commit
COPY --from=clio-tools \
/usr/local/bin/doxygen \
/usr/local/bin/

View File

@@ -8,12 +8,10 @@ ARG TARGETARCH
SHELL ["/bin/bash", "-o", "pipefail", "-c"]
ARG BUILD_VERSION=1
ARG BUILD_VERSION=2
RUN apt-get update \
&& apt-get install -y --no-install-recommends --no-install-suggests \
bison \
flex \
ninja-build \
python3 \
python3-pip \
@@ -46,7 +44,14 @@ RUN wget --progress=dot:giga "https://github.com/ccache/ccache/releases/download
&& ninja install \
&& rm -rf /tmp/* /var/tmp/*
ARG DOXYGEN_VERSION=1.12.0
RUN apt-get update \
&& apt-get install -y --no-install-recommends --no-install-suggests \
bison \
flex \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
ARG DOXYGEN_VERSION=1.14.0
RUN wget --progress=dot:giga "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.src.tar.gz" \
&& tar xf "doxygen-${DOXYGEN_VERSION}.src.tar.gz" \
&& cd "doxygen-${DOXYGEN_VERSION}" \
@@ -78,4 +83,22 @@ RUN wget --progress=dot:giga "https://github.com/cli/cli/releases/download/v${GH
&& mv gh_${GH_VERSION}_linux_${TARGETARCH}/bin/gh /usr/local/bin/gh \
&& rm -rf /tmp/* /var/tmp/*
RUN apt-get update \
&& apt-get install -y --no-install-recommends --no-install-suggests \
libgmp-dev \
libmpfr-dev \
libncurses-dev \
make \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
ARG GDB_VERSION=16.3
RUN wget --progress=dot:giga "https://sourceware.org/pub/gdb/releases/gdb-${GDB_VERSION}.tar.gz" \
&& tar xf "gdb-${GDB_VERSION}.tar.gz" \
&& cd "gdb-${GDB_VERSION}" \
&& ./configure --prefix=/usr/local \
&& make -j "$(nproc)" \
&& make install-gdb \
&& rm -rf /tmp/* /var/tmp/*
WORKDIR /root

View File

@@ -3,6 +3,8 @@ PROJECT_LOGO = ${SOURCE}/docs/img/xrpl-logo.svg
PROJECT_NUMBER = ${DOC_CLIO_VERSION}
PROJECT_BRIEF = The XRP Ledger API server.
DOT_GRAPH_MAX_NODES = 100
EXTRACT_ALL = NO
EXTRACT_PRIVATE = NO
EXTRACT_PACKAGE = YES
@@ -13,6 +15,7 @@ EXTRACT_ANON_NSPACES = NO
SORT_MEMBERS_CTORS_1ST = YES
INPUT = ${SOURCE}/src
USE_MDFILE_AS_MAINPAGE = ${SOURCE}/src/README.md
EXCLUDE_SYMBOLS = ${EXCLUDES}
RECURSIVE = YES
HAVE_DOT = ${USE_DOT}
@@ -32,12 +35,18 @@ SORT_MEMBERS_CTORS_1ST = YES
GENERATE_TREEVIEW = YES
DISABLE_INDEX = NO
FULL_SIDEBAR = NO
HTML_HEADER = ${SOURCE}/docs/doxygen-awesome-theme/header.html
HTML_HEADER = ${SOURCE}/docs/doxygen-awesome-theme/doxygen-custom/header.html
HTML_EXTRA_STYLESHEET = ${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome.css \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-sidebar-only.css \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-sidebar-only-darkmode-toggle.css
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-sidebar-only-darkmode-toggle.css \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-custom/custom.css \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-custom/theme-robot.css \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-custom/theme-round.css \
${SOURCE}/docs/github-corner-disable.css
HTML_EXTRA_FILES = ${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-darkmode-toggle.js \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-interactive-toc.js
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-fragment-copy-button.js \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-interactive-toc.js \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-custom/toggle-alternative-theme.js
HTML_COLORSTYLE = LIGHT
HTML_COLORSTYLE_HUE = 209

View File

@@ -6,16 +6,22 @@
## Minimum Requirements
- [Python 3.7](https://www.python.org/downloads/)
- [Conan 2.17.0](https://conan.io/downloads.html)
- [Conan 2.20.1](https://conan.io/downloads.html)
- [CMake 3.20](https://cmake.org/download/)
- [**Optional**] [GCovr](https://gcc.gnu.org/onlinedocs/gcc/Gcov.html): needed for code coverage generation
- [**Optional**] [CCache](https://ccache.dev/): speeds up compilation if you are going to compile Clio often
We use our Docker image `ghcr.io/XRPLF/clio-ci` to build `Clio`, see [Building Clio with Docker](#building-clio-with-docker).
You can find information about exact compiler versions and tools in the [image's README](https://github.com/XRPLF/clio/blob/develop/docker/ci/README.md).
The following compiler versions are guaranteed to work.
Any compiler with a lower version may not be able to build Clio:
| Compiler | Version |
| ----------- | ------- |
| GCC | 12.3 |
| Clang | 16 |
| Apple Clang | 15 |
| GCC | 15.2 |
| Clang | 19 |
| Apple Clang | 17 |
### Conan Configuration
@@ -84,7 +90,7 @@ core.upload:parallel={{os.cpu_count()}}
Make sure Artifactory is set up with Conan.
```sh
conan remote add --index 0 ripple https://conan.ripplex.io
conan remote add --index 0 xrplf https://conan.ripplex.io
```
Now you should be able to download the prebuilt dependencies (including `xrpl` package) on supported platforms.
@@ -98,79 +104,100 @@ It is implicitly used when running `conan` commands, you don't need to specify i
You have to update this file every time you add a new dependency or change a revision or version of an existing dependency.
To do that, run the following command in the repository root:
> [!NOTE]
> Conan uses the local cache by default when creating a lockfile.
>
> To ensure that lockfile creation works the same way on all developer machines, you should clear the local cache before creating a new lockfile.
To create a new lockfile, run the following commands in the repository root:
```bash
conan lock create . -o '&:tests=True' -o '&:benchmark=True'
conan remove '*' --confirm
rm conan.lock
# This ensures that the xrplf remote is consulted first
conan remote add --force --index 0 xrplf https://conan.ripplex.io
conan lock create .
```
> [!NOTE]
> If some dependencies are exclusive to a particular OS, you may need to re-run the last command for them, adding `--profile:all <PROFILE>`.
## Building Clio
Navigate to Clio's root directory and run:
1. Navigate to Clio's root directory and run:
```sh
mkdir build && cd build
# You can also specify profile explicitly by adding `--profile:all <PROFILE_NAME>`
conan install .. --output-folder . --build missing --settings build_type=Release -o '&:tests=True'
# You can also add -GNinja to use Ninja build system instead of Make
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
cmake --build . --parallel 8 # or without the number if you feel extra adventurous
```
```sh
mkdir build && cd build
```
> [!TIP]
> You can omit the `-o '&:tests=True'` if you don't want to build `clio_tests`.
2. Install dependencies through conan.
If successful, `conan install` will find the required packages and `cmake` will do the rest. You should see `clio_server` and `clio_tests` in the `build` directory (the current directory).
```sh
conan install .. --output-folder . --build missing --settings build_type=Release
```
> [!TIP]
> To generate a Code Coverage report, include `-o '&:coverage=True'` in the `conan install` command above, along with `-o '&:tests=True'` to enable tests.
> After running the `cmake` commands, execute `make clio_tests-ccov`.
> The coverage report will be found at `clio_tests-llvm-cov/index.html`.
> You can add `--profile:all <PROFILE_NAME>` to choose a specific conan profile.
<!-- markdownlint-disable-line MD028 -->
3. Configure and generate build files with CMake.
```sh
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
```
> You can add `-GNinja` to use the Ninja build system (instead of Make).
4. Now, you can build all targets or specific ones:
```sh
# builds all targets
cmake --build . --parallel 8
# builds only clio_server target
cmake --build . --parallel 8 --target clio_server
```
You should see `clio_server` and `clio_tests` in the current directory.
> [!NOTE]
> If you've built Clio before and the build is now failing, it's likely due to updated dependencies. Try deleting the build folder and then rerunning the Conan and CMake commands mentioned above.
### CMake options
There are several CMake options you can use to customize the build:
| CMake Option | Default | CMake Target | Description |
| --------------------- | ------- | -------------------------------------------------------- | ------------------------------------- |
| `-Dcoverage` | OFF | `clio_tests-ccov` | Enables code coverage generation |
| `-Dtests` | OFF | `clio_tests` | Enables unit tests |
| `-Dintegration_tests` | OFF | `clio_integration_tests` | Enables integration tests |
| `-Dbenchmark` | OFF | `clio_benchmark` | Enables benchmark executable |
| `-Ddocs` | OFF | `docs` | Enables API documentation generation |
| `-Dlint` | OFF | See [#clang-tidy](#using-clang-tidy-for-static-analysis) | Enables `clang-tidy` static analysis |
| `-Dsan` | N/A | N/A | Enables Sanitizer (asan, tsan, ubsan) |
| `-Dpackage` | OFF | N/A | Creates a debian package |
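For illustration, here is a sketch of a configure-and-build invocation that enables unit tests and coverage and then builds the coverage target from the table above. This assumes the options can be passed directly to CMake as listed in the table (the tips earlier show the equivalent Conan options); adjust it to your needs.

```sh
# Sketch only: enable unit tests and coverage at configure time,
# then build the clio_tests-ccov target listed in the table above.
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
      -DCMAKE_BUILD_TYPE=Release -Dtests=ON -Dcoverage=ON ..
cmake --build . --parallel 8 --target clio_tests-ccov
```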
### Generating API docs for Clio
The API documentation for Clio is generated by [Doxygen](https://www.doxygen.nl/index.html). If you want to generate the API documentation when building Clio, make sure to install Doxygen 1.12.0 on your system.
The API documentation for Clio is generated by [Doxygen](https://www.doxygen.nl/index.html). If you want to generate the API documentation when building Clio, make sure to install Doxygen 1.14.0 on your system.
To generate the API docs:
To generate the API docs, please use the CMake option `-Ddocs=ON` as described above and build the `docs` target.
1. First, include `-o '&:docs=True'` in the conan install command. For example:
To view the generated files, go to `build/docs/html`.
Open the `index.html` file in your browser to see the documentation pages.
```sh
mkdir build && cd build
conan install .. --output-folder . --build missing --settings build_type=Release -o '&:tests=True' -o '&:docs=True'
```
2. Once that has completed successfully, run the `cmake` command and add the `--target docs` option:
```sh
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
cmake --build . --parallel 8 --target docs
```
3. Go to `build/docs/html` to view the generated files.
Open the `index.html` file in your browser to see the documentation pages.
![API index page](./img/doxygen-docs-output.png "API index page")
![API index page](./img/doxygen-docs-output.png "API index page")
## Building Clio with Docker
It is also possible to build Clio using [Docker](https://www.docker.com/) if you don't want to install all the dependencies on your machine.
```sh
docker run -it ghcr.io/xrplf/clio-ci:a446d85297b3006e6d2c4dc7640368f096afecf5
docker run -it ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
git clone https://github.com/XRPLF/clio
mkdir build && cd build
conan install .. --output-folder . --build missing --settings build_type=Release -o '&:tests=True'
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
cmake --build . --parallel 8 # or without the number if you feel extra adventurous
cd clio
```
Follow the same steps in the [Building Clio](#building-clio) section. You can use `--profile:all gcc` or `--profile:all clang` with the `conan install` command to choose the desired compiler.
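For example, inside the container the dependency installation step with an explicit compiler profile could look like this (a sketch that simply combines the commands shown above):

```sh
# Inside the container, after `git clone` and `cd clio`:
mkdir build && cd build
# Pick gcc or clang as desired
conan install .. --output-folder . --build missing --settings build_type=Release --profile:all clang
```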
## Developing against `rippled` in standalone mode
If you wish to develop against a `rippled` instance running in standalone mode, there are a few quirks of both Clio and `rippled` that you need to keep in mind. You must:
@@ -223,10 +250,10 @@ Sometimes, during development, you need to build against a custom version of `li
## Using `clang-tidy` for static analysis
Clang-tidy can be run by CMake when building the project.
To achieve this, you just need to provide the option `-o '&:lint=True'` for the `conan install` command:
To achieve this, you just need to provide the option `-Dlint=ON` when generating CMake files:
```sh
conan install .. --output-folder . --build missing --settings build_type=Release -o '&:tests=True' -o '&:lint=True' --profile:all clang
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release -Dlint=ON ..
```
By default, CMake will try to find `clang-tidy` automatically on your system.
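If the binary CMake picks up is not the one you want, one generic CMake mechanism is to point `CMAKE_CXX_CLANG_TIDY` at an explicit path. This is a sketch of a standard CMake feature, not a Clio-specific option documented above, and the path shown is hypothetical:

```sh
# Hypothetical binary path; CMAKE_CXX_CLANG_TIDY is a generic CMake
# variable, not a Clio-specific option.
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
      -DCMAKE_BUILD_TYPE=Release -Dlint=ON \
      -DCMAKE_CXX_CLANG_TIDY=/usr/bin/clang-tidy-19 ..
```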

View File

@@ -3,7 +3,9 @@
This document provides a list of all available Clio configuration properties in detail.
> [!NOTE]
> Dot notation in configuration key names represents nested fields. For example, **database.scylladb** refers to the _scylladb_ field inside the _database_ object. If a key name includes "[]", it indicates that the nested field is an array (e.g., etl_sources.[]).
> Dot notation in configuration key names represents nested fields.
> For example, **database.scylladb** refers to the _scylladb_ field inside the _database_ object.
> If a key name includes "[]", it indicates that the nested field is an array (e.g., etl_sources.[]).
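For example, the keys **database.cassandra.provider** and **log.channels.[].channel** described below correspond to JSON like the following sketch (the values shown are placeholders):

```json
{
  "database": {
    "cassandra": {
      "provider": "cassandra"
    }
  },
  "log": {
    "channels": [
      { "channel": "RPC", "level": "debug" }
    ]
  }
}
```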
## Configuration Details
@@ -87,6 +89,14 @@ This document provides a list of all available Clio configuration properties in
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: Represents the number of threads that will be used for database operations.
### database.cassandra.provider
- **Required**: True
- **Type**: string
- **Default value**: `cassandra`
- **Constraints**: The value must be one of the following: `cassandra`, `aws_keyspace`.
- **Description**: The specific database backend provider we are using.
### database.cassandra.core_connections_per_host
- **Required**: True
@@ -327,6 +337,22 @@ This document provides a list of all available Clio configuration properties in
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: Maximum queue size for sending subscription data to clients. This queue buffers data when a client is slow to receive it, ensuring delivery once the client is ready.
### server.proxy.ips.[]
- **Required**: True
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: List of proxy IP addresses. When Clio receives a request from a proxy, it uses the `Forwarded` value (if any) as the client IP. When this option is used together with `server.proxy.tokens`, Clio identifies a proxy by IP or by token.
### server.proxy.tokens.[]
- **Required**: True
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: List of tokens used to identify a request as coming from a proxy. The token should be provided in the `X-Proxy-Token` header, e.g. `X-Proxy-Token: <very_secret_token>`. When Clio receives a request from a proxy, it uses the `Forwarded` value (if any) to get the client IP. When this option is used together with `server.proxy.ips`, Clio identifies a proxy by IP or by token.
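Putting the two options together, a minimal sketch (placeholder values; the nesting is inferred from the key names) might look like:

```json
"server": {
  "proxy": {
    "ips": ["203.0.113.5"],
    "tokens": ["very_secret_token"]
  }
}
```

A request is then treated as coming from a proxy if it either originates from one of the listed IPs or carries a matching `X-Proxy-Token` header.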
### prometheus.enabled
- **Required**: True
@@ -415,7 +441,7 @@ This document provides a list of all available Clio configuration properties in
- **Constraints**: The value must be one of the following: `sync`, `async`, `none`.
- **Description**: The strategy used for Cache loading.
### log_channels.[].channel
### log.channels.[].channel
- **Required**: False
- **Type**: string
@@ -423,7 +449,7 @@ This document provides a list of all available Clio configuration properties in
- **Constraints**: The value must be one of the following: `General`, `WebServer`, `Backend`, `RPC`, `ETL`, `Subscriptions`, `Performance`, `Migration`.
- **Description**: The name of the log channel.
### log_channels.[].log_level
### log.channels.[].level
- **Required**: False
- **Type**: string
@@ -431,7 +457,7 @@ This document provides a list of all available Clio configuration properties in
- **Constraints**: The value must be one of the following: `trace`, `debug`, `info`, `warning`, `error`, `fatal`.
- **Description**: The log level for the specific log channel.
### log_level
### log.level
- **Required**: True
- **Type**: string
@@ -439,15 +465,39 @@ This document provides a list of all available Clio configuration properties in
- **Constraints**: The value must be one of the following: `trace`, `debug`, `info`, `warning`, `error`, `fatal`.
- **Description**: The general logging level of Clio. This level is applied to all log channels that do not have an explicitly defined logging level.
### log_format
### log.format
- **Required**: True
- **Type**: string
- **Default value**: `%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%`
- **Default value**: `%Y-%m-%d %H:%M:%S.%f %^%3!l:%n%$ - %v`
- **Constraints**: None
- **Description**: The format string for log messages. The format is described here: <https://www.boost.org/doc/libs/1_83_0/libs/log/doc/html/log/tutorial/formatters.html>.
- **Description**: The format string for log messages using spdlog format patterns.
### log_to_console
Each of the variables expands like so:
- `%Y-%m-%d %H:%M:%S.%f`: The full date and time of the log entry with microsecond precision
- `%^`: Start color range
- `%3!l`: The severity (aka log level) the entry was sent at, truncated to 3 characters
- `%n`: The logger name (channel) that this log entry was sent to
- `%$`: End color range
- `%v`: The actual log message
Some additional variables that might be useful:
- `%@`: A partial path to the C++ file and the line number in that file (`src/file/path:linenumber`)
- `%t`: The ID of the thread the log entry is written from
Documentation can be found at: <https://github.com/gabime/spdlog/wiki/Custom-formatting>.
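As an illustrative sketch (the extra placeholders are taken from the list above), a format string that also includes the thread ID and source location could be configured like this:

```json
"log": {
  "format": "%Y-%m-%d %H:%M:%S.%f [%t] %^%3!l:%n%$ (%@) - %v"
}
```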
### log.is_async
- **Required**: True
- **Type**: boolean
- **Default value**: `True`
- **Constraints**: None
- **Description**: Whether spdlog is asynchronous or not.
### log.enable_console
- **Required**: True
- **Type**: boolean
@@ -455,7 +505,7 @@ This document provides a list of all available Clio configuration properties in
- **Constraints**: None
- **Description**: Enables or disables logging to the console.
### log_directory
### log.directory
- **Required**: False
- **Type**: string
@@ -463,7 +513,7 @@ This document provides a list of all available Clio configuration properties in
- **Constraints**: None
- **Description**: The directory path for the log files.
### log_rotation_size
### log.rotation_size
- **Required**: True
- **Type**: int
@@ -471,23 +521,15 @@ This document provides a list of all available Clio configuration properties in
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The log rotation size in megabytes. When the log file reaches this particular size, a new log file starts.
### log_directory_max_size
### log.directory_max_files
- **Required**: True
- **Type**: int
- **Default value**: `51200`
- **Default value**: `25`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The maximum size of the log directory in megabytes.
- **Description**: The maximum number of log files in the directory.
### log_rotation_hour_interval
- **Required**: True
- **Type**: int
- **Default value**: `12`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: Represents the interval (in hours) for log rotation. If the current log file reaches this value in logging, a new log file starts.
### log_tag_style
### log.tag_style
- **Required**: True
- **Type**: string

View File

@@ -88,13 +88,15 @@ Exactly equal password gains admin rights for the request or a websocket connect
Clio can cache requests to ETL sources to reduce the load on the ETL source.
Only the following commands are cached: `server_info`, `server_state`, `server_definitions`, `fee`, `ledger_closed`.
By default the forwarding cache is off.
To enable the caching for a source, `forwarding_cache_timeout` value should be added to the configuration file, e.g.:
To enable caching for a source, the `forwarding.cache_timeout` value should be added to the configuration file, e.g.:
```json
"forwarding_cache_timeout": 0.250,
"forwarding": {
"cache_timeout": 0.250,
}
```
`forwarding_cache_timeout` defines for how long (in seconds) a cache entry will be valid after being placed into the cache.
`forwarding.cache_timeout` defines for how long (in seconds) a cache entry will be valid after being placed into the cache.
A zero value turns off the cache feature.
## Graceful shutdown (not fully implemented yet)

View File

@@ -1,29 +1,10 @@
// SPDX-License-Identifier: MIT
/**
Doxygen Awesome
https://github.com/jothepro/doxygen-awesome-css
MIT License
Copyright (c) 2021 - 2023 jothepro
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Copyright (c) 2021 - 2025 jothepro
*/

View File

@@ -0,0 +1,66 @@
// SPDX-License-Identifier: MIT
/**
Doxygen Awesome
https://github.com/jothepro/doxygen-awesome-css
Copyright (c) 2022 - 2025 jothepro
*/
class DoxygenAwesomeFragmentCopyButton extends HTMLElement {
constructor() {
super();
this.onclick=this.copyContent
}
static title = "Copy to clipboard"
static copyIcon = `<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" width="24" height="24"><path d="M0 0h24v24H0V0z" fill="none"/><path d="M23.04,10.322c0,-2.582 -2.096,-4.678 -4.678,-4.678l-6.918,-0c-2.582,-0 -4.678,2.096 -4.678,4.678c0,-0 0,8.04 0,8.04c0,2.582 2.096,4.678 4.678,4.678c0,-0 6.918,-0 6.918,-0c2.582,-0 4.678,-2.096 4.678,-4.678c0,-0 0,-8.04 0,-8.04Zm-2.438,-0l-0,8.04c-0,1.236 -1.004,2.24 -2.24,2.24l-6.918,-0c-1.236,-0 -2.239,-1.004 -2.239,-2.24l-0,-8.04c-0,-1.236 1.003,-2.24 2.239,-2.24c0,0 6.918,0 6.918,0c1.236,0 2.24,1.004 2.24,2.24Z"/><path d="M5.327,16.748c-0,0.358 -0.291,0.648 -0.649,0.648c0,0 0,0 0,0c-2.582,0 -4.678,-2.096 -4.678,-4.678c0,0 0,-8.04 0,-8.04c0,-2.582 2.096,-4.678 4.678,-4.678l6.918,0c2.168,0 3.994,1.478 4.523,3.481c0.038,0.149 0.005,0.306 -0.09,0.428c-0.094,0.121 -0.239,0.191 -0.392,0.191c-0.451,0.005 -1.057,0.005 -1.457,0.005c-0.238,0 -0.455,-0.14 -0.553,-0.357c-0.348,-0.773 -1.128,-1.31 -2.031,-1.31c-0,0 -6.918,0 -6.918,0c-1.236,0 -2.24,1.004 -2.24,2.24l0,8.04c0,1.236 1.004,2.24 2.24,2.24l0,-0c0.358,-0 0.649,0.29 0.649,0.648c-0,0.353 -0,0.789 -0,1.142Z" style="fill-opacity:0.6;"/></svg>`
static successIcon = `<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" width="24" height="24"><path d="M8.084,16.111c-0.09,0.09 -0.212,0.141 -0.34,0.141c-0.127,-0 -0.249,-0.051 -0.339,-0.141c-0.746,-0.746 -2.538,-2.538 -3.525,-3.525c-0.375,-0.375 -0.983,-0.375 -1.357,0c-0.178,0.178 -0.369,0.369 -0.547,0.547c-0.375,0.375 -0.375,0.982 -0,1.357c1.135,1.135 3.422,3.422 4.75,4.751c0.27,0.27 0.637,0.421 1.018,0.421c0.382,0 0.749,-0.151 1.019,-0.421c2.731,-2.732 10.166,-10.167 12.454,-12.455c0.375,-0.375 0.375,-0.982 -0,-1.357c-0.178,-0.178 -0.369,-0.369 -0.547,-0.547c-0.375,-0.375 -0.982,-0.375 -1.357,0c-2.273,2.273 -9.567,9.567 -11.229,11.229Z"/></svg>`
static successDuration = 980
static init() {
$(function() {
$(document).ready(function() {
if(navigator.clipboard) {
const fragments = document.getElementsByClassName("fragment")
for(const fragment of fragments) {
const fragmentWrapper = document.createElement("div")
fragmentWrapper.className = "doxygen-awesome-fragment-wrapper"
const fragmentCopyButton = document.createElement("doxygen-awesome-fragment-copy-button")
fragmentCopyButton.innerHTML = DoxygenAwesomeFragmentCopyButton.copyIcon
fragmentCopyButton.title = DoxygenAwesomeFragmentCopyButton.title
fragment.parentNode.replaceChild(fragmentWrapper, fragment)
fragmentWrapper.appendChild(fragment)
fragmentWrapper.appendChild(fragmentCopyButton)
}
}
})
})
}
copyContent() {
const content = this.previousSibling.cloneNode(true)
// filter out line number from file listings
content.querySelectorAll(".lineno, .ttc").forEach((node) => {
node.remove()
})
let textContent = content.textContent
// remove trailing newlines that appear in file listings
let numberOfTrailingNewlines = 0
while(textContent.charAt(textContent.length - (numberOfTrailingNewlines + 1)) == '\n') {
numberOfTrailingNewlines++;
}
textContent = textContent.substring(0, textContent.length - numberOfTrailingNewlines)
navigator.clipboard.writeText(textContent);
this.classList.add("success")
this.innerHTML = DoxygenAwesomeFragmentCopyButton.successIcon
window.setTimeout(() => {
this.classList.remove("success")
this.innerHTML = DoxygenAwesomeFragmentCopyButton.copyIcon
}, DoxygenAwesomeFragmentCopyButton.successDuration);
}
}
customElements.define("doxygen-awesome-fragment-copy-button", DoxygenAwesomeFragmentCopyButton)

View File

@@ -1,29 +1,10 @@
// SPDX-License-Identifier: MIT
/**
Doxygen Awesome
https://github.com/jothepro/doxygen-awesome-css
MIT License
Copyright (c) 2022 - 2023 jothepro
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Copyright (c) 2022 - 2025 jothepro
*/
@@ -55,9 +36,7 @@ class DoxygenAwesomeInteractiveToc {
headerNode: document.getElementById(id)
})
document.getElementById("doc-content")?.addEventListener("scroll", () => {
DoxygenAwesomeInteractiveToc.update()
})
document.getElementById("doc-content")?.addEventListener("scroll",this.throttle(DoxygenAwesomeInteractiveToc.update, 100))
})
DoxygenAwesomeInteractiveToc.update()
}
@@ -78,4 +57,16 @@ class DoxygenAwesomeInteractiveToc {
active?.classList.add("active")
active?.classList.remove("aboveActive")
}
}
static throttle(func, delay) {
let lastCall = 0;
return function (...args) {
const now = new Date().getTime();
if (now - lastCall < delay) {
return;
}
lastCall = now;
return setTimeout(() => {func(...args)}, delay);
};
}
}

View File

@@ -1,30 +1,10 @@
/* SPDX-License-Identifier: MIT */
/**
Doxygen Awesome
https://github.com/jothepro/doxygen-awesome-css
MIT License
Copyright (c) 2021 - 2023 jothepro
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Copyright (c) 2021 - 2025 jothepro
*/

View File

@@ -1,29 +1,10 @@
/* SPDX-License-Identifier: MIT */
/**
Doxygen Awesome
https://github.com/jothepro/doxygen-awesome-css
MIT License
Copyright (c) 2021 - 2023 jothepro
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Copyright (c) 2021 - 2025 jothepro
*/
@@ -60,10 +41,6 @@ html {
height: calc(100vh - var(--top-height)) !important;
}
#nav-tree {
padding: 0;
}
#top {
display: block;
border-bottom: none;
@@ -73,22 +50,24 @@ html {
overflow: hidden;
background: var(--side-nav-background);
}
#main-nav {
float: left;
padding-right: 0;
}
.ui-resizable-handle {
cursor: default;
width: 1px !important;
background: var(--separator-color);
box-shadow: 0 calc(-2 * var(--top-height)) 0 0 var(--separator-color);
display: none;
}
.ui-resizable-e {
width: 0;
}
#nav-path {
position: fixed;
right: 0;
left: var(--side-nav-fixed-width);
left: calc(var(--side-nav-fixed-width) + 1px);
bottom: 0;
width: auto;
}
@@ -113,4 +92,14 @@ html {
left: var(--spacing-medium) !important;
right: auto;
}
#nav-sync {
bottom: 4px;
right: auto;
left: 300px;
width: 35px;
top: auto !important;
user-select: none;
position: fixed
}
}

View File

@@ -1,29 +1,10 @@
/* SPDX-License-Identifier: MIT */
/**
Doxygen Awesome
https://github.com/jothepro/doxygen-awesome-css
MIT License
Copyright (c) 2021 - 2023 jothepro
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Copyright (c) 2021 - 2025 jothepro
*/
@@ -32,6 +13,9 @@ html {
--primary-color: #1779c4;
--primary-dark-color: #335c80;
--primary-light-color: #70b1e9;
--on-primary-color: #ffffff;
--link-color: var(--primary-color);
/* page base colors */
--page-background-color: #ffffff;
@@ -42,14 +26,15 @@ html {
--separator-color: #dedede;
/* border radius for all rounded components. Will affect many components, like dropdowns, memitems, codeblocks, ... */
--border-radius-large: 8px;
--border-radius-small: 4px;
--border-radius-medium: 6px;
--border-radius-large: 10px;
--border-radius-small: 5px;
--border-radius-medium: 8px;
/* default spacings. Most components reference these values for spacing, to provide uniform spacing on the page. */
--spacing-small: 5px;
--spacing-medium: 10px;
--spacing-large: 16px;
--spacing-xlarge: 20px;
/* default box shadow used for raising an element above the normal content. Used in dropdowns, search result, ... */
--box-shadow: 0 2px 8px 0 rgba(0,0,0,.075);
@@ -113,7 +98,7 @@ html {
*/
--menu-display: block;
--menu-focus-foreground: var(--page-background-color);
--menu-focus-foreground: var(--on-primary-color);
--menu-focus-background: var(--primary-color);
--menu-selected-background: rgba(0,0,0,.05);
@@ -310,10 +295,11 @@ body {
font-size: var(--page-font-size);
}
body, table, div, p, dl, #nav-tree .label, .title,
body, table, div, p, dl, #nav-tree .label, #nav-tree a, .title,
.sm-dox a, .sm-dox a:hover, .sm-dox a:focus, #projectname,
.SelectItem, #MSearchField, .navpath li.navelem a,
.navpath li.navelem a:hover, p.reference, p.definition, div.toc li, div.toc h3 {
.navpath li.navelem a:hover, p.reference, p.definition, div.toc li, div.toc h3,
#page-nav ul.page-outline li a {
font-family: var(--font-family);
}
@@ -332,8 +318,13 @@ p.reference, p.definition {
}
a:link, a:visited, a:hover, a:focus, a:active {
color: var(--primary-color) !important;
color: var(--link-color) !important;
font-weight: 500;
background: none;
}
a:hover {
text-decoration: underline;
}
a.anchor {
@@ -348,6 +339,8 @@ a.anchor {
#top {
background: var(--header-background);
border-bottom: 1px solid var(--separator-color);
position: relative;
z-index: 99;
}
@media screen and (min-width: 768px) {
@@ -362,6 +355,7 @@ a.anchor {
#main-nav {
flex-grow: 5;
padding: var(--spacing-small) var(--spacing-medium);
border-bottom: 0;
}
#titlearea {
@@ -441,19 +435,36 @@ a.anchor {
}
.sm-dox a span.sub-arrow {
border-color: var(--header-foreground) transparent transparent transparent;
top: 15px;
right: 10px;
box-sizing: content-box;
padding: 0;
margin: 0;
display: inline-block;
width: 5px;
height: 5px;
transform: rotate(45deg);
border-width: 0;
border-right: 2px solid var(--header-foreground);
border-bottom: 2px solid var(--header-foreground);
background: none;
}
.sm-dox a:hover span.sub-arrow {
border-color: var(--menu-focus-foreground) transparent transparent transparent;
border-color: var(--menu-focus-foreground);
background: none;
}
.sm-dox ul a span.sub-arrow {
border-color: transparent transparent transparent var(--page-foreground-color);
transform: rotate(-45deg);
border-width: 0;
border-right: 2px solid var(--header-foreground);
border-bottom: 2px solid var(--header-foreground);
}
.sm-dox ul a:hover span.sub-arrow {
border-color: transparent transparent transparent var(--menu-focus-foreground);
border-color: var(--menu-focus-foreground);
background: none;
}
}
@@ -480,7 +491,7 @@ a.anchor {
.sm-dox ul a {
color: var(--page-foreground-color) !important;
background: var(--page-background-color);
background: none;
font-size: var(--navigation-font-size);
}
@@ -552,6 +563,13 @@ a.anchor {
box-shadow: none;
display: block;
margin-top: 0;
margin-right: 0;
}
@media (min-width: 768px) {
.sm-dox li {
padding: 0;
}
}
/* until Doxygen 1.9.4 */
@@ -573,6 +591,17 @@ a.anchor {
padding-left: 0
}
/* Doxygen 1.14.0 */
.search-icon::before {
background: none;
top: 5px;
}
.search-icon::after {
background: none;
top: 12px;
}
.SelectionMark {
user-select: none;
}
@@ -776,12 +805,15 @@ html.dark-mode iframe#MSearchResults {
*/
#side-nav {
padding: 0 !important;
background: var(--side-nav-background);
min-width: 8px;
max-width: 50vw;
}
#nav-tree, #top {
border-right: 1px solid var(--separator-color);
}
@media screen and (max-width: 767px) {
#side-nav {
display: none;
@@ -790,34 +822,95 @@ html.dark-mode iframe#MSearchResults {
#doc-content {
margin-left: 0 !important;
}
#top {
border-right: none;
}
}
#nav-tree {
background: transparent;
margin-right: 1px;
background: var(--side-nav-background);
margin-right: -1px;
padding: 0;
}
#nav-tree .label {
font-size: var(--navigation-font-size);
line-height: var(--tree-item-height);
}
#nav-tree span.label a:hover {
background: none;
}
#nav-tree .item {
height: var(--tree-item-height);
line-height: var(--tree-item-height);
overflow: hidden;
text-overflow: ellipsis;
margin: 0;
padding: 0;
}
#nav-tree-contents {
margin: 0;
}
#main-menu > li:last-child {
height: auto;
}
#nav-tree .item > a:focus {
outline: none;
}
#nav-sync {
bottom: 12px;
right: 12px;
bottom: var(--spacing-medium);
right: var(--spacing-medium) !important;
top: auto !important;
user-select: none;
}
div.nav-sync-icon {
border: 1px solid var(--separator-color);
border-radius: var(--border-radius-medium);
background: var(--page-background-color);
width: 30px;
height: 20px;
}
div.nav-sync-icon:hover {
background: var(--page-background-color);
}
span.sync-icon-left, div.nav-sync-icon:hover span.sync-icon-left {
border-left: 2px solid var(--primary-color);
border-top: 2px solid var(--primary-color);
top: 5px;
left: 6px;
}
span.sync-icon-right, div.nav-sync-icon:hover span.sync-icon-right {
border-right: 2px solid var(--primary-color);
border-bottom: 2px solid var(--primary-color);
top: 5px;
left: initial;
right: 6px;
}
div.nav-sync-icon.active::after, div.nav-sync-icon.active:hover::after {
border-top: 2px solid var(--primary-color);
top: 9px;
left: 6px;
width: 19px;
}
#nav-tree .selected {
text-shadow: none;
background-image: none;
background-color: transparent;
position: relative;
color: var(--primary-color) !important;
font-weight: 500;
}
#nav-tree .selected::after {
@@ -843,9 +936,27 @@ html.dark-mode iframe#MSearchResults {
#nav-tree .arrow {
opacity: var(--side-nav-arrow-opacity);
background: none;
}
.arrow {
#nav-tree span.arrowhead {
margin: 0 0 1px 2px;
}
span.arrowhead {
border-color: var(--primary-light-color);
}
.selected span.arrowhead {
border-color: var(--primary-color);
}
#nav-tree-contents > ul > li:first-child > div > a {
opacity: 0;
pointer-events: none;
}
.contents .arrow {
color: inherit;
cursor: pointer;
font-size: 45%;
@@ -853,7 +964,7 @@ html.dark-mode iframe#MSearchResults {
margin-right: 2px;
font-family: serif;
height: auto;
text-align: right;
padding-bottom: 4px;
}
#nav-tree div.item:hover .arrow, #nav-tree a:focus .arrow {
@@ -867,9 +978,11 @@ html.dark-mode iframe#MSearchResults {
}
.ui-resizable-e {
width: 4px;
background: transparent;
box-shadow: inset -1px 0 0 0 var(--separator-color);
background: none;
}
.ui-resizable-e:hover {
background: var(--separator-color);
}
/*
@@ -878,7 +991,7 @@ html.dark-mode iframe#MSearchResults {
div.header {
border-bottom: 1px solid var(--separator-color);
background-color: var(--page-background-color);
background: none;
background-image: none;
}
@@ -917,7 +1030,7 @@ div.headertitle {
div.header .title {
font-weight: 600;
font-size: 225%;
padding: var(--spacing-medium) var(--spacing-large);
padding: var(--spacing-medium) var(--spacing-xlarge);
word-break: break-word;
}
@@ -934,9 +1047,10 @@ td.memSeparator {
span.mlabel {
background: var(--primary-color);
color: var(--on-primary-color);
border: none;
padding: 4px 9px;
border-radius: 12px;
border-radius: var(--border-radius-large);
margin-right: var(--spacing-medium);
}
@@ -945,7 +1059,7 @@ span.mlabel:last-of-type {
}
div.contents {
padding: 0 var(--spacing-large);
padding: 0 var(--spacing-xlarge);
}
div.contents p, div.contents li {
@@ -956,6 +1070,16 @@ div.contents div.dyncontent {
margin: var(--spacing-medium) 0;
}
@media screen and (max-width: 767px) {
div.contents {
padding: 0 var(--spacing-large);
}
div.header .title {
padding: var(--spacing-medium) var(--spacing-large);
}
}
@media (prefers-color-scheme: dark) {
html:not(.light-mode) div.contents div.dyncontent img,
html:not(.light-mode) div.contents center img,
@@ -979,7 +1103,7 @@ html.dark-mode div.contents .dotgraph iframe
filter: brightness(89%) hue-rotate(180deg) invert();
}
h2.groupheader {
td h2.groupheader, h2.groupheader {
border-bottom: 0px;
color: var(--page-foreground-color);
box-shadow:
@@ -1040,7 +1164,7 @@ blockquote::after {
blockquote p {
margin: var(--spacing-small) 0 var(--spacing-medium) 0;
}
.paramname {
.paramname, .paramname em {
font-weight: 600;
color: var(--primary-dark-color);
}
@@ -1090,7 +1214,7 @@ div.contents .toc {
border: 0;
border-left: 1px solid var(--separator-color);
border-radius: 0;
background-color: transparent;
background-color: var(--page-background-color);
box-shadow: none;
position: sticky;
top: var(--toc-sticky-top);
@@ -1198,24 +1322,115 @@ div.toc li a.aboveActive {
}
}
/*
Page Outline (Doxygen >= 1.14.0)
*/
#page-nav {
background: var(--page-background-color);
border-left: 1px solid var(--separator-color);
}
#page-nav #page-nav-resize-handle {
background: var(--separator-color);
}
#page-nav #page-nav-resize-handle::after {
border-left: 1px solid var(--primary-color);
border-right: 1px solid var(--primary-color);
}
#page-nav #page-nav-tree #page-nav-contents {
top: var(--spacing-large);
}
#page-nav ul.page-outline {
margin: 0;
padding: 0;
}
#page-nav ul.page-outline li a {
font-size: var(--toc-font-size) !important;
color: var(--page-secondary-foreground-color) !important;
display: inline-block;
line-height: calc(2 * var(--toc-font-size));
}
#page-nav ul.page-outline li a a.anchorlink {
display: none;
}
#page-nav ul.page-outline li.vis ~ * a {
color: var(--page-foreground-color) !important;
}
#page-nav ul.page-outline li.vis:not(.vis ~ .vis) a, #page-nav ul.page-outline li a:hover {
color: var(--primary-color) !important;
}
#page-nav ul.page-outline .vis {
background: var(--page-background-color);
position: relative;
}
#page-nav ul.page-outline .vis::after {
content: "";
position: absolute;
top: 0;
bottom: 0;
left: 0;
width: 4px;
background: var(--page-secondary-foreground-color);
}
#page-nav ul.page-outline .vis:not(.vis ~ .vis)::after {
top: 1px;
border-top-right-radius: var(--border-radius-small);
}
#page-nav ul.page-outline .vis:not(:has(~ .vis))::after {
bottom: 1px;
border-bottom-right-radius: var(--border-radius-small);
}
#page-nav ul.page-outline .arrow {
display: inline-block;
}
#page-nav ul.page-outline .arrow span {
display: none;
}
@media screen and (max-width: 767px) {
#container {
grid-template-columns: initial !important;
}
#page-nav {
display: none;
}
}
/*
Code & Fragments
*/
code, div.fragment, pre.fragment {
border-radius: var(--border-radius-small);
code, div.fragment, pre.fragment, span.tt {
border: 1px solid var(--separator-color);
overflow: hidden;
}
code {
code, span.tt {
display: inline;
background: var(--code-background);
color: var(--code-foreground);
padding: 2px 6px;
border-radius: var(--border-radius-small);
}
div.fragment, pre.fragment {
border-radius: var(--border-radius-medium);
margin: var(--spacing-medium) 0;
padding: calc(var(--spacing-large) - (var(--spacing-large) / 6)) var(--spacing-large);
background: var(--fragment-background);
@@ -1273,7 +1488,7 @@ div.fragment, pre.fragment {
}
}
code, code a, pre.fragment, div.fragment, div.fragment .line, div.fragment span, div.fragment .line a, div.fragment .line span {
code, code a, pre.fragment, div.fragment, div.fragment .line, div.fragment span, div.fragment .line a, div.fragment .line span, span.tt {
font-family: var(--font-family-monospace);
font-size: var(--code-font-size) !important;
}
@@ -1347,6 +1562,10 @@ div.line.glow {
dl warning, attention, note, deprecated, bug, ...
*/
dl {
line-height: calc(1.65 * var(--page-font-size));
}
dl.bug dt a, dl.deprecated dt a, dl.todo dt a {
font-weight: bold !important;
}
@@ -1512,6 +1731,7 @@ div.memitem {
border-top-right-radius: var(--border-radius-medium);
border-bottom-right-radius: var(--border-radius-medium);
border-bottom-left-radius: var(--border-radius-medium);
border-top-left-radius: 0;
overflow: hidden;
display: block !important;
}
@@ -1743,7 +1963,7 @@ table.fieldtable th {
color: var(--tablehead-foreground);
}
table.fieldtable td.fieldtype, .fieldtable td.fieldname, .fieldtable td.fielddoc, .fieldtable th {
table.fieldtable td.fieldtype, .fieldtable td.fieldname, .fieldtable td.fieldinit, .fieldtable td.fielddoc, .fieldtable th {
border-bottom: 1px solid var(--separator-color);
border-right: 1px solid var(--separator-color);
}
@@ -1778,8 +1998,10 @@ table.memberdecls tr[class^='memitem'] .memTemplParams {
white-space: normal;
}
table.memberdecls .memItemLeft,
table.memberdecls .memItemRight,
table.memberdecls tr.heading + tr[class^='memitem'] td.memItemLeft,
table.memberdecls tr.heading + tr[class^='memitem'] td.memItemRight,
table.memberdecls td.memItemLeft,
table.memberdecls td.memItemRight,
table.memberdecls .memTemplItemLeft,
table.memberdecls .memTemplItemRight,
table.memberdecls .memTemplParams {
@@ -1791,8 +2013,34 @@ table.memberdecls .memTemplParams {
background-color: var(--fragment-background);
}
@media screen and (min-width: 768px) {
tr.heading + tr[class^='memitem'] td.memItemRight, tr.groupHeader + tr[class^='memitem'] td.memItemRight, tr.inherit_header + tr[class^='memitem'] td.memItemRight {
border-top-right-radius: var(--border-radius-small);
}
table.memberdecls tr:last-child td.memItemRight, table.memberdecls tr:last-child td.mdescRight, table.memberdecls tr[class^='memitem']:has(+ tr.groupHeader) td.memItemRight, table.memberdecls tr[class^='memitem']:has(+ tr.inherit_header) td.memItemRight, table.memberdecls tr[class^='memdesc']:has(+ tr.groupHeader) td.mdescRight, table.memberdecls tr[class^='memdesc']:has(+ tr.inherit_header) td.mdescRight {
border-bottom-right-radius: var(--border-radius-small);
}
table.memberdecls tr:last-child td.memItemLeft, table.memberdecls tr:last-child td.mdescLeft, table.memberdecls tr[class^='memitem']:has(+ tr.groupHeader) td.memItemLeft, table.memberdecls tr[class^='memitem']:has(+ tr.inherit_header) td.memItemLeft, table.memberdecls tr[class^='memdesc']:has(+ tr.groupHeader) td.mdescLeft, table.memberdecls tr[class^='memdesc']:has(+ tr.inherit_header) td.mdescLeft {
border-bottom-left-radius: var(--border-radius-small);
}
tr.heading + tr[class^='memitem'] td.memItemLeft, tr.groupHeader + tr[class^='memitem'] td.memItemLeft, tr.inherit_header + tr[class^='memitem'] td.memItemLeft {
border-top-left-radius: var(--border-radius-small);
}
}
table.memname td.memname {
font-size: var(--memname-font-size);
}
table.memberdecls .memTemplItemLeft,
table.memberdecls .memTemplItemRight {
table.memberdecls .template .memItemLeft,
table.memberdecls .memTemplItemRight,
table.memberdecls .template .memItemRight {
padding-top: 2px;
}
@@ -1804,13 +2052,13 @@ table.memberdecls .memTemplParams {
padding-bottom: var(--spacing-small);
}
table.memberdecls .memTemplItemLeft {
table.memberdecls .memTemplItemLeft, table.memberdecls .template .memItemLeft {
border-radius: 0 0 0 var(--border-radius-small);
border-left: 1px solid var(--separator-color);
border-top: 0;
}
table.memberdecls .memTemplItemRight {
table.memberdecls .memTemplItemRight, table.memberdecls .template .memItemRight {
border-radius: 0 0 var(--border-radius-small) 0;
border-right: 1px solid var(--separator-color);
padding-left: 0;
@@ -1836,8 +2084,14 @@ table.memberdecls .mdescLeft, table.memberdecls .mdescRight {
background: none;
color: var(--page-foreground-color);
padding: var(--spacing-small) 0;
border: 0;
}
table.memberdecls [class^="memdesc"] {
box-shadow: none;
}
table.memberdecls .memItemLeft,
table.memberdecls .memTemplItemLeft {
padding-right: var(--spacing-medium);
@@ -1860,6 +2114,10 @@ table.memberdecls .inherit_header td {
color: var(--page-secondary-foreground-color);
}
table.memberdecls span.dynarrow {
left: 10px;
}
table.memberdecls img[src="closed.png"],
table.memberdecls img[src="open.png"],
div.dynheader img[src="open.png"],
@@ -1876,6 +2134,10 @@ div.dynheader img[src="closed.png"] {
transition: transform var(--animation-duration) ease-out;
}
tr.heading + tr[class^='memitem'] td.memItemLeft, tr.groupHeader + tr[class^='memitem'] td.memItemLeft, tr.inherit_header + tr[class^='memitem'] td.memItemLeft, tr.heading + tr[class^='memitem'] td.memItemRight, tr.groupHeader + tr[class^='memitem'] td.memItemRight, tr.inherit_header + tr[class^='memitem'] td.memItemRight {
border-top: 1px solid var(--separator-color);
}
table.memberdecls img {
margin-right: 10px;
}
@@ -1900,7 +2162,10 @@ div.dynheader img[src="closed.png"] {
table.memberdecls .mdescRight,
table.memberdecls .memTemplItemLeft,
table.memberdecls .memTemplItemRight,
table.memberdecls .memTemplParams {
table.memberdecls .memTemplParams,
table.memberdecls .template .memItemLeft,
table.memberdecls .template .memItemRight,
table.memberdecls .template .memParams {
display: block;
text-align: left;
padding-left: var(--spacing-large);
@@ -1913,12 +2178,14 @@ div.dynheader img[src="closed.png"] {
table.memberdecls .memItemLeft,
table.memberdecls .mdescLeft,
table.memberdecls .memTemplItemLeft {
border-bottom: 0;
padding-bottom: 0;
table.memberdecls .memTemplItemLeft,
table.memberdecls .template .memItemLeft {
border-bottom: 0 !important;
padding-bottom: 0 !important;
}
table.memberdecls .memTemplItemLeft {
table.memberdecls .memTemplItemLeft,
table.memberdecls .template .memItemLeft {
padding-top: 0;
}
@@ -1928,10 +2195,12 @@ div.dynheader img[src="closed.png"] {
table.memberdecls .memItemRight,
table.memberdecls .mdescRight,
table.memberdecls .memTemplItemRight {
border-top: 0;
padding-top: 0;
table.memberdecls .memTemplItemRight,
table.memberdecls .template .memItemRight {
border-top: 0 !important;
padding-top: 0 !important;
padding-right: var(--spacing-large);
padding-bottom: var(--spacing-medium);
overflow-x: auto;
}
@@ -1966,6 +2235,22 @@ div.dynheader img[src="closed.png"] {
max-height: 200px;
}
}
tr.heading + tr[class^='memitem'] td.memItemRight, tr.groupHeader + tr[class^='memitem'] td.memItemRight, tr.inherit_header + tr[class^='memitem'] td.memItemRight {
border-top-right-radius: 0;
}
table.memberdecls tr:last-child td.memItemRight, table.memberdecls tr:last-child td.mdescRight, table.memberdecls tr[class^='memitem']:has(+ tr.groupHeader) td.memItemRight, table.memberdecls tr[class^='memitem']:has(+ tr.inherit_header) td.memItemRight, table.memberdecls tr[class^='memdesc']:has(+ tr.groupHeader) td.mdescRight, table.memberdecls tr[class^='memdesc']:has(+ tr.inherit_header) td.mdescRight {
border-bottom-right-radius: 0;
}
table.memberdecls tr:last-child td.memItemLeft, table.memberdecls tr:last-child td.mdescLeft, table.memberdecls tr[class^='memitem']:has(+ tr.groupHeader) td.memItemLeft, table.memberdecls tr[class^='memitem']:has(+ tr.inherit_header) td.memItemLeft, table.memberdecls tr[class^='memdesc']:has(+ tr.groupHeader) td.mdescLeft, table.memberdecls tr[class^='memdesc']:has(+ tr.inherit_header) td.mdescLeft {
border-bottom-left-radius: 0;
}
tr.heading + tr[class^='memitem'] td.memItemLeft, tr.groupHeader + tr[class^='memitem'] td.memItemLeft, tr.inherit_header + tr[class^='memitem'] td.memItemLeft {
border-top-left-radius: 0;
}
}
@@ -1982,14 +2267,16 @@ hr {
}
.contents hr {
box-shadow: 100px 0 0 var(--separator-color),
-100px 0 0 var(--separator-color),
500px 0 0 var(--separator-color),
-500px 0 0 var(--separator-color),
1500px 0 0 var(--separator-color),
-1500px 0 0 var(--separator-color),
2000px 0 0 var(--separator-color),
-2000px 0 0 var(--separator-color);
box-shadow: 100px 0 var(--separator-color),
-100px 0 var(--separator-color),
500px 0 var(--separator-color),
-500px 0 var(--separator-color),
900px 0 var(--separator-color),
-900px 0 var(--separator-color),
1400px 0 var(--separator-color),
-1400px 0 var(--separator-color),
1900px 0 var(--separator-color),
-1900px 0 var(--separator-color);
}
.contents img, .contents .center, .contents center, .contents div.image object {
@@ -2152,9 +2439,7 @@ div.qindex {
background: var(--page-background-color);
border: none;
border-top: 1px solid var(--separator-color);
border-bottom: 1px solid var(--separator-color);
border-bottom: 0;
box-shadow: 0 0.75px 0 var(--separator-color);
font-size: var(--navigation-font-size);
}
@@ -2183,6 +2468,10 @@ address.footer {
color: var(--primary-color) !important;
}
.navpath li.navelem a:hover {
text-shadow: none;
}
.navpath li.navelem b {
color: var(--primary-dark-color);
font-weight: 500;
@@ -2201,7 +2490,11 @@ li.navelem:first-child:before {
display: none;
}
#nav-path li.navelem:after {
#nav-path ul {
padding-left: 0;
}
#nav-path li.navelem:has(.el):after {
content: '';
border: 5px solid var(--page-background-color);
border-bottom-color: transparent;
@@ -2212,7 +2505,21 @@ li.navelem:first-child:before {
margin-left: 6px;
}
#nav-path li.navelem:before {
#nav-path li.navelem:not(:has(.el)):after {
background: var(--page-background-color);
box-shadow: 1px -1px 0 1px var(--separator-color);
border-radius: 0 var(--border-radius-medium) 0 50px;
}
#nav-path li.navelem:not(:has(.el)) {
margin-left: 0;
}
#nav-path li.navelem:not(:has(.el)):hover, #nav-path li.navelem:not(:has(.el)):hover:after {
background-color: var(--separator-color);
}
#nav-path li.navelem:has(.el):before {
content: '';
border: 5px solid var(--separator-color);
border-bottom-color: transparent;
@@ -2338,7 +2645,7 @@ doxygen-awesome-dark-mode-toggle {
height: var(--searchbar-height);
background: none;
border: none;
border-radius: var(--searchbar-height);
border-radius: var(--searchbar-border-radius);
vertical-align: middle;
text-align: center;
line-height: var(--searchbar-height);
@@ -2523,6 +2830,7 @@ h2:hover a.anchorlink, h1:hover a.anchorlink, h3:hover a.anchorlink, h4:hover a.
float: left;
white-space: nowrap;
font-weight: normal;
font-family: var(--font-family);
padding: calc(var(--spacing-large) / 2) var(--spacing-large);
border-radius: var(--border-radius-medium);
transition: background-color var(--animation-duration) ease-in-out, font-weight var(--animation-duration) ease-in-out;
@@ -2667,3 +2975,46 @@ h2:hover a.anchorlink, h1:hover a.anchorlink, h3:hover a.anchorlink, h4:hover a.
border-radius: 0 var(--border-radius-medium) var(--border-radius-medium) 0;
}
}
/*
Bordered image
*/
html.dark-mode .darkmode_inverted_image img, /* < doxygen 1.9.3 */
html.dark-mode .darkmode_inverted_image object[type="image/svg+xml"] /* doxygen 1.9.3 */ {
filter: brightness(89%) hue-rotate(180deg) invert();
}
.bordered_image {
border-radius: var(--border-radius-small);
border: 1px solid var(--separator-color);
display: inline-block;
overflow: hidden;
}
.bordered_image:empty {
border: none;
}
html.dark-mode .bordered_image img, /* < doxygen 1.9.3 */
html.dark-mode .bordered_image object[type="image/svg+xml"] /* doxygen 1.9.3 */ {
border-radius: var(--border-radius-small);
}
/*
Button
*/
.primary-button {
display: inline-block;
cursor: pointer;
background: var(--primary-color);
color: var(--page-background-color) !important;
border-radius: var(--border-radius-medium);
padding: var(--spacing-small) var(--spacing-medium);
text-decoration: none;
}
.primary-button:hover {
background: var(--primary-dark-color);
}

View File

@@ -0,0 +1,129 @@
.github-corner svg {
fill: var(--primary-light-color);
color: var(--page-background-color);
width: 72px;
height: 72px;
}
@media screen and (max-width: 767px) {
.github-corner svg {
width: 50px;
height: 50px;
}
#projectnumber {
margin-right: 22px;
}
}
.title_screenshot {
filter: drop-shadow(0px 3px 10px rgba(0,0,0,0.22));
max-width: 500px;
margin: var(--spacing-large) 0;
}
.title_screenshot .caption {
display: none;
}
#theme-selection {
position: fixed;
bottom: 0;
left: 0;
background: var(--side-nav-background);
padding: 5px 2px 5px 8px;
box-shadow: 0 -4px 4px -2px var(--side-nav-background);
display: flex;
}
#theme-selection label {
border: 1px solid var(--separator-color);
border-right: 0;
color: var(--page-foreground-color);
font-size: var(--toc-font-size);
padding: 0 8px;
display: inline-block;
height: 22px;
box-sizing: border-box;
border-radius: var(--border-radius-medium) 0 0 var(--border-radius-medium);
line-height: 20px;
background: var(--page-background-color);
opacity: 0.7;
}
@media (prefers-color-scheme: dark) {
html:not(.light-mode) #theme-select {
background: url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' width='100' height='100' fill='%23aaaaaa'><polygon points='0,0 100,0 50,50'/></svg>") no-repeat;
background-size: 8px;
background-position: calc(100% - 6px) 65%;
background-color: var(--page-background-color);
}
}
html.dark-mode #theme-select {
background: url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' width='100' height='100' fill='%23aaaaaa'><polygon points='0,0 100,0 50,50'/></svg>") no-repeat;
background-size: 8px;
background-position: calc(100% - 6px) 65%;
background-color: var(--page-background-color);
}
#theme-select {
border: 1px solid var(--separator-color);
border-radius: 0 var(--border-radius-medium) var(--border-radius-medium) 0;
padding: 0;
height: 22px;
font-size: var(--toc-font-size);
font-family: var(--font-family);
width: 215px;
color: var(--primary-color);
border-left: 0;
display: inline-block;
opacity: 0.7;
outline: none;
-webkit-appearance: none;
-moz-appearance: none;
appearance: none;
background: url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' width='100' height='100' fill='%23888888'><polygon points='0,0 100,0 50,50'/></svg>") no-repeat;
background-size: 8px;
background-position: calc(100% - 6px) 65%;
background-repeat: no-repeat;
background-color: var(--page-background-color);
}
#theme-selection:hover #theme-select, #theme-selection:hover label {
opacity: 1;
}
#nav-tree-contents {
margin-bottom: 30px;
}
@media screen and (max-width: 767px) {
#theme-selection {
box-shadow: none;
background: none;
height: 20px;
}
#theme-select {
width: 80px;
opacity: 1;
}
#theme-selection label {
opacity: 1;
}
#nav-path ul li.navelem:first-child {
margin-left: 160px;
}
ul li.footer:not(:first-child) {
display: none;
}
#nav-path {
position: fixed;
bottom: 0;
background: var(--page-background-color);
}
}

View File

@@ -0,0 +1,98 @@
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<meta http-equiv="X-UA-Compatible" content="IE=9"/>
<meta name="generator" content="Doxygen $doxygenversion"/>
<meta name="viewport" content="width=device-width, initial-scale=1"/>
<!-- BEGIN opengraph metadata -->
<meta property="og:title" content="Doxygen Awesome" />
<meta property="og:image" content="https://repository-images.githubusercontent.com/348492097/4f16df80-88fb-11eb-9d31-4015ff22c452" />
<meta property="og:description" content="Custom CSS theme for doxygen html-documentation with lots of customization parameters." />
<meta property="og:url" content="https://jothepro.github.io/doxygen-awesome-css/" />
<!-- END opengraph metadata -->
<!-- BEGIN twitter metadata -->
<meta name="twitter:image:src" content="https://repository-images.githubusercontent.com/348492097/4f16df80-88fb-11eb-9d31-4015ff22c452" />
<meta name="twitter:title" content="Doxygen Awesome" />
<meta name="twitter:description" content="Custom CSS theme for doxygen html-documentation with lots of customization parameters." />
<!-- END twitter metadata -->
<!--BEGIN PROJECT_NAME--><title>$projectname: $title</title><!--END PROJECT_NAME-->
<!--BEGIN !PROJECT_NAME--><title>$title</title><!--END !PROJECT_NAME-->
<link href="$relpath^tabs.css" rel="stylesheet" type="text/css"/>
<link rel="icon" type="image/svg+xml" href="logo.drawio.svg"/>
<script type="text/javascript" src="$relpath^jquery.js"></script>
<script type="text/javascript" src="$relpath^dynsections.js"></script>
<script type="text/javascript" src="$relpath^doxygen-awesome-darkmode-toggle.js"></script>
<script type="text/javascript" src="$relpath^doxygen-awesome-fragment-copy-button.js"></script>
<script type="text/javascript" src="$relpath^doxygen-awesome-paragraph-link.js"></script>
<script type="text/javascript" src="$relpath^doxygen-awesome-interactive-toc.js"></script>
<script type="text/javascript" src="$relpath^doxygen-awesome-tabs.js"></script>
<script type="text/javascript" src="$relpath^toggle-alternative-theme.js"></script>
<script type="text/javascript">
DoxygenAwesomeFragmentCopyButton.init()
DoxygenAwesomeDarkModeToggle.init()
DoxygenAwesomeParagraphLink.init()
DoxygenAwesomeInteractiveToc.init()
DoxygenAwesomeTabs.init()
</script>
$treeview
$search
$mathjax
<link href="$relpath^$stylesheet" rel="stylesheet" type="text/css" />
$extrastylesheet
</head>
<body>
<!-- https://tholman.com/github-corners/ -->
<a href="https://github.com/jothepro/doxygen-awesome-css" class="github-corner" title="View source on GitHub" target="_blank" rel="noopener noreferrer">
<svg viewBox="0 0 250 250" width="40" height="40" style="position: absolute; top: 0; border: 0; right: 0; z-index: 99;" aria-hidden="true">
<path d="M0,0 L115,115 L130,115 L142,142 L250,250 L250,0 Z"></path><path d="M128.3,109.0 C113.8,99.7 119.0,89.6 119.0,89.6 C122.0,82.7 120.5,78.6 120.5,78.6 C119.2,72.0 123.4,76.3 123.4,76.3 C127.3,80.9 125.5,87.3 125.5,87.3 C122.9,97.6 130.6,101.9 134.4,103.2" fill="currentColor" style="transform-origin: 130px 106px;" class="octo-arm"></path><path d="M115.0,115.0 C114.9,115.1 118.7,116.5 119.8,115.4 L133.7,101.6 C136.9,99.2 139.9,98.4 142.2,98.6 C133.8,88.0 127.5,74.4 143.8,58.0 C148.5,53.4 154.0,51.2 159.7,51.0 C160.3,49.4 163.2,43.6 171.4,40.1 C171.4,40.1 176.1,42.5 178.8,56.2 C183.1,58.6 187.2,61.8 190.9,65.4 C194.5,69.0 197.7,73.2 200.1,77.6 C213.8,80.2 216.3,84.9 216.3,84.9 C212.7,93.1 206.9,96.0 205.4,96.6 C205.1,102.4 203.0,107.8 198.3,112.5 C181.9,128.9 168.3,122.5 157.7,114.1 C157.9,116.9 156.7,120.9 152.7,124.9 L141.0,136.5 C139.8,137.7 141.6,141.9 141.8,141.8 Z" fill="currentColor" class="octo-body"></path></svg></a><style>.github-corner:hover .octo-arm{animation:octocat-wave 560ms ease-in-out}@keyframes octocat-wave{0%,100%{transform:rotate(0)}20%,60%{transform:rotate(-25deg)}40%,80%{transform:rotate(10deg)}}@media (max-width:500px){.github-corner:hover .octo-arm{animation:none}.github-corner .octo-arm{animation:octocat-wave 560ms ease-in-out}}</style>
<div id="top"><!-- do not remove this div, it is closed by doxygen! -->
<!--BEGIN TITLEAREA-->
<div id="titlearea">
<table cellspacing="0" cellpadding="0">
<tbody>
<tr style="height: 56px;">
<!--BEGIN PROJECT_LOGO-->
<td id="projectlogo"><img alt="Logo" src="$relpath^$projectlogo"/></td>
<!--END PROJECT_LOGO-->
<!--BEGIN PROJECT_NAME-->
<td id="projectalign" style="padding-left: 0.5em;">
<div id="projectname">$projectname
<!--BEGIN PROJECT_NUMBER-->&#160;<span id="projectnumber">$projectnumber</span><!--END PROJECT_NUMBER-->
</div>
<!--BEGIN PROJECT_BRIEF--><div id="projectbrief">$projectbrief</div><!--END PROJECT_BRIEF-->
</td>
<!--END PROJECT_NAME-->
<!--BEGIN !PROJECT_NAME-->
<!--BEGIN PROJECT_BRIEF-->
<td style="padding-left: 0.5em;">
<div id="projectbrief">$projectbrief</div>
</td>
<!--END PROJECT_BRIEF-->
<!--END !PROJECT_NAME-->
<!--BEGIN DISABLE_INDEX-->
<!--BEGIN SEARCHENGINE-->
<td>$searchbox</td>
<!--END SEARCHENGINE-->
<!--END DISABLE_INDEX-->
</tr>
</tbody>
</table>
<div id="theme-selection">
<label for="theme-select">Theme:</label>
<select id="theme-select">
<option value="theme-default">Default</option>
<option value="theme-round">Round</option>
<option value="theme-robot">Robot</option>
</select>
</div>
</div>
<!--END TITLEAREA-->
<!-- end header part -->


@@ -0,0 +1,62 @@
html.theme-robot {
/* primary theme color. This will affect the entire website's color scheme: links, arrows, labels, ... */
--primary-color: #1c89a4;
--primary-dark-color: #1a6f84;
--primary-light-color: #5abcd4;
--primary-lighter-color: #cae1f1;
--primary-lightest-color: #e9f1f8;
--fragment-background: #ececec;
--code-background: #ececec;
/* page base colors */
--page-background-color: white;
--page-foreground-color: #2c3e50;
--page-secondary-foreground-color: #67727e;
--border-radius-large: 0px;
--border-radius-small: 0px;
--border-radius-medium: 0px;
--spacing-small: 3px;
--spacing-medium: 6px;
--spacing-large: 12px;
--top-height: 125px;
--side-nav-background: var(--page-background-color);
--side-nav-foreground: var(--page-foreground-color);
--header-foreground: var(--side-nav-foreground);
--searchbar-border-radius: var(--border-radius-medium);
--header-background: var(--side-nav-background);
--header-foreground: var(--side-nav-foreground);
--toc-background: rgb(243, 240, 252);
--toc-foreground: var(--page-foreground-color);
--font-family: ui-monospace,SFMono-Regular,SF Mono,Menlo,Consolas,Liberation Mono,monospace;
--page-font-size: 14px;
--box-shadow: none;
--separator-color: #cdcdcd;
}
html.theme-robot.dark-mode {
color-scheme: dark;
--primary-color: #49cad3;
--primary-dark-color: #8ed2d7;
--primary-light-color: #377479;
--primary-lighter-color: #191e21;
--primary-lightest-color: #191a1c;
--fragment-background: #000000;
--code-background: #000000;
--page-background-color: #161616;
--page-foreground-color: #d2dbde;
--page-secondary-foreground-color: #555555;
--separator-color: #545454;
--toc-background: #20142C;
}


@@ -0,0 +1,55 @@
html.theme-round {
/* primary theme color. This will affect the entire website's color scheme: links, arrows, labels, ... */
--primary-color: #AF7FE4;
--primary-dark-color: #9270E4;
--primary-light-color: #d2b7ef;
--primary-lighter-color: #cae1f1;
--primary-lightest-color: #e9f1f8;
/* page base colors */
--page-background-color: white;
--page-foreground-color: #2c3e50;
--page-secondary-foreground-color: #67727e;
--border-radius-large: 22px;
--border-radius-small: 9px;
--border-radius-medium: 14px;
--spacing-small: 8px;
--spacing-medium: 14px;
--spacing-large: 19px;
--spacing-xlarge: 21px;
--top-height: 125px;
--side-nav-background: #324067;
--side-nav-foreground: #F1FDFF;
--header-foreground: var(--side-nav-foreground);
--searchbar-background: var(--side-nav-foreground);
--searchbar-border-radius: var(--border-radius-medium);
--header-background: var(--side-nav-background);
--header-foreground: var(--side-nav-foreground);
--toc-background: rgb(243, 240, 252);
--toc-foreground: var(--page-foreground-color);
}
html.theme-round.dark-mode {
color-scheme: dark;
--primary-color: #AF7FE4;
--primary-dark-color: #715292;
--primary-light-color: #ae97c7;
--primary-lighter-color: #191e21;
--primary-lightest-color: #191a1c;
--page-background-color: #1C1D1F;
--page-foreground-color: #d2dbde;
--page-secondary-foreground-color: #859399;
--separator-color: #3a3246;
--side-nav-background: #171D32;
--side-nav-foreground: #F1FDFF;
--toc-background: #20142C;
--searchbar-background: var(--page-background-color);
}


@@ -0,0 +1,52 @@
// Toggle between three theme states and persist the choice in localStorage
const THEME_CLASSES = ['theme-default', 'theme-round', 'theme-robot'];
// Allow switching via a button/function (e.g. for an onclick handler in the HTML)
function toggleThemeVariant() {
let idx = getCurrentThemeIndex();
idx = (idx + 1) % THEME_CLASSES.length;
applyThemeClass(idx);
}
// Make the function globally available
window.toggleThemeVariant = toggleThemeVariant;
function getCurrentThemeIndex() {
const stored = localStorage.getItem('theme-variant');
if (stored === null) return 0;
const idx = THEME_CLASSES.indexOf(stored);
return idx === -1 ? 0 : idx;
}
function applyThemeClass(idx) {
document.documentElement.classList.remove(...THEME_CLASSES);
if (THEME_CLASSES[idx] && THEME_CLASSES[idx] !== 'theme-default') {
document.documentElement.classList.add(THEME_CLASSES[idx]);
}
localStorage.setItem('theme-variant', THEME_CLASSES[idx] || 'theme-default');
// Keep the select element in sync, if present
const select = document.getElementById('theme-select');
if (select) select.value = THEME_CLASSES[idx];
}
function setThemeByName(themeName) {
const idx = THEME_CLASSES.indexOf(themeName);
applyThemeClass(idx === -1 ? 0 : idx);
}
document.addEventListener('DOMContentLoaded', () => {
const select = document.getElementById('theme-select');
if (select) {
// Initialize the selection from localStorage
const idx = getCurrentThemeIndex();
select.value = THEME_CLASSES[idx];
applyThemeClass(idx);
// Change the theme when the selection changes
select.addEventListener('change', e => {
setThemeByName(e.target.value);
});
} else {
// Fallback: apply the theme anyway
applyThemeClass(getCurrentThemeIndex());
}
});


@@ -1,82 +0,0 @@
<!-- HTML header for doxygen 1.9.7-->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "https://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" lang="$langISO">
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<meta http-equiv="X-UA-Compatible" content="IE=11"/>
<meta name="generator" content="Doxygen $doxygenversion"/>
<meta name="viewport" content="width=device-width, initial-scale=1"/>
<!--BEGIN PROJECT_NAME--><title>$projectname: $title</title><!--END PROJECT_NAME-->
<!--BEGIN !PROJECT_NAME--><title>$title</title><!--END !PROJECT_NAME-->
<link href="$relpath^tabs.css" rel="stylesheet" type="text/css"/>
<!--BEGIN DISABLE_INDEX-->
<!--BEGIN FULL_SIDEBAR-->
<script type="text/javascript">var page_layout=1;</script>
<!--END FULL_SIDEBAR-->
<!--END DISABLE_INDEX-->
<script type="text/javascript" src="$relpath^jquery.js"></script>
<script type="text/javascript" src="$relpath^dynsections.js"></script>
$treeview
$search
$mathjax
$darkmode
<link href="$relpath^$stylesheet" rel="stylesheet" type="text/css" />
$extrastylesheet
<script type="text/javascript" src="$relpath^doxygen-awesome-darkmode-toggle.js"></script>
<script type="text/javascript">
DoxygenAwesomeDarkModeToggle.init()
</script>
<script type="text/javascript" src="$relpath^doxygen-awesome-interactive-toc.js"></script>
<script type="text/javascript">
DoxygenAwesomeInteractiveToc.init()
</script>
</head>
<body>
<!--BEGIN DISABLE_INDEX-->
<!--BEGIN FULL_SIDEBAR-->
<div id="side-nav" class="ui-resizable side-nav-resizable"><!-- do not remove this div, it is closed by doxygen! -->
<!--END FULL_SIDEBAR-->
<!--END DISABLE_INDEX-->
<div id="top"><!-- do not remove this div, it is closed by doxygen! -->
<!--BEGIN TITLEAREA-->
<div id="titlearea">
<table cellspacing="0" cellpadding="0">
<tbody>
<tr id="projectrow">
<!--BEGIN PROJECT_LOGO-->
<td id="projectlogo"><img alt="Logo" src="$relpath^$projectlogo"/></td>
<!--END PROJECT_LOGO-->
<!--BEGIN PROJECT_NAME-->
<td id="projectalign">
<div id="projectname">$projectname<!--BEGIN PROJECT_NUMBER--><span id="projectnumber">&#160;$projectnumber</span><!--END PROJECT_NUMBER-->
</div>
<!--BEGIN PROJECT_BRIEF--><div id="projectbrief">$projectbrief</div><!--END PROJECT_BRIEF-->
</td>
<!--END PROJECT_NAME-->
<!--BEGIN !PROJECT_NAME-->
<!--BEGIN PROJECT_BRIEF-->
<td>
<div id="projectbrief">$projectbrief</div>
</td>
<!--END PROJECT_BRIEF-->
<!--END !PROJECT_NAME-->
<!--BEGIN DISABLE_INDEX-->
<!--BEGIN SEARCHENGINE-->
<!--BEGIN !FULL_SIDEBAR-->
<td>$searchbox</td>
<!--END !FULL_SIDEBAR-->
<!--END SEARCHENGINE-->
<!--END DISABLE_INDEX-->
</tr>
<!--BEGIN SEARCHENGINE-->
<!--BEGIN FULL_SIDEBAR-->
<tr><td colspan="2">$searchbox</td></tr>
<!--END FULL_SIDEBAR-->
<!--END SEARCHENGINE-->
</tbody>
</table>
</div>
<!--END TITLEAREA-->
<!-- end header part -->


@@ -76,38 +76,60 @@
"parallel_requests_limit": 10, // Optional parameter, used only if "processing_strategy" is "parallel". It limits the number of requests for one client connection processed in parallel. Infinite if not specified.
// Max number of responses to queue up before they are sent successfully. If a client's waiting queue is too long, the server will close the connection.
"ws_max_sending_queue_size": 1500,
"__ng_web_server": false // Use ng web server. This is a temporary setting which will be deleted after switching to ng web server
"__ng_web_server": false, // Use ng web server. This is a temporary setting which will be deleted after switching to ng web server
"proxy": {
"ips": [],
"tokens": []
}
},
// Time in seconds for graceful shutdown. Defaults to 10 seconds. Not fully implemented yet.
"graceful_period": 10.0,
// Overrides log level on a per logging channel.
// Defaults to global "log_level" for each unspecified channel.
"log_channels": [
{
"channel": "Backend",
"log_level": "fatal"
},
{
"channel": "WebServer",
"log_level": "info"
},
{
"channel": "Subscriptions",
"log_level": "info"
},
{
"channel": "RPC",
"log_level": "error"
},
{
"channel": "ETL",
"log_level": "debug"
},
{
"channel": "Performance",
"log_level": "trace"
}
],
"log": {
// Overrides log level on a per logging channel.
// Defaults to global "log.level" for each unspecified channel.
"channels": [
{
"channel": "Backend",
"level": "fatal"
},
{
"channel": "WebServer",
"level": "info"
},
{
"channel": "Subscriptions",
"level": "info"
},
{
"channel": "RPC",
"level": "error"
},
{
"channel": "ETL",
"level": "debug"
},
{
"channel": "Performance",
"level": "trace"
}
],
// The general logging level of Clio. This level is applied to all log channels that do not have an explicitly defined logging level.
"level": "info",
// Log format using spdlog format patterns (this is the default format)
"format": "%Y-%m-%d %H:%M:%S.%f %^%3!l:%n%$ - %v",
// Whether spdlog is asynchronous or not.
"is_async": true,
// Enables or disables logging to the console.
"enable_console": true,
// Clio logs to file in the specified directory only if "log.directory" is set
// "directory": "./clio_log",
// The log rotation size in megabytes. When the log file reaches this particular size, a new log file starts.
"rotation_size": 2048,
// The maximum number of log files in the directory.
"directory_max_files": 25,
// Log tags style to use
"tag_style": "uint"
},
"cache": {
// Configure this to use either "num_diffs", "num_cursors_from_diff", or "num_cursors_from_account". By default, Clio uses "num_diffs".
"num_diffs": 32, // Generate the cursors from the latest ledger diff, then use the cursors to partition the ledger to load concurrently. The cursors number is affected by the busyness of the network.
@@ -121,16 +143,6 @@
"enabled": true,
"compress_reply": true
},
"log_level": "info",
// Log format (this is the default format)
"log_format": "%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%",
"log_to_console": true,
// Clio logs to file in the specified directory only if "log_directory" is set
// "log_directory": "./clio_log",
"log_rotation_size": 2048,
"log_directory_max_size": 51200,
"log_rotation_hour_interval": 12,
"log_tag_style": "uint",
"extractor_threads": 8,
"read_only": false,
// "start_sequence": [integer] the ledger index to start from,


@@ -0,0 +1,3 @@
.github-corner {
display: none !important;
}


@@ -1,76 +0,0 @@
# Logging
Clio provides several logging options, all of which are configurable via the config file. These are detailed in the following sections.
## `log_level`
The minimum level of severity at which the log message will be outputted by default. Severity options are `trace`, `debug`, `info`, `warning`, `error`, `fatal`. Defaults to `info`.
## `log_format`
The format of log lines produced by Clio. Defaults to `"%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%"`.
Each of the variables expands like so:
- `TimeStamp`: The full date and time of the log entry
- `SourceLocation`: A partial path to the c++ file and the line number in said file (`source/file/path:linenumber`)
- `ThreadID`: The ID of the thread the log entry is written from
- `Channel`: The channel that this log entry was sent to
- `Severity`: The severity (aka log level) the entry was sent at
- `Message`: The actual log message
## `log_channels`
An array of JSON objects, each overriding properties for a logging `channel`.
> [!IMPORTANT]
> At the time of writing, only `log_level` can be overridden using this mechanism.
Each object is of this format:
```json
{
"channel": "Backend",
"log_level": "fatal"
}
```
If no override is present for a given channel, that channel will log at the severity specified by the global `log_level`.
The log channels that can be overridden are: `Backend`, `WebServer`, `Subscriptions`, `RPC`, `ETL` and `Performance`.
> [!NOTE]
> See [example-config.json](../docs/examples/config/example-config.json) for more details.
## `log_to_console`
Enable or disable log output to console. Options are `true`/`false`. This option defaults to `true`.
## `log_directory`
Path to the directory where log files are stored. If such a directory doesn't exist, Clio will create it.
If the option is not specified, the logs are not written to a file.
## `log_rotation_size`
The max size of the log file in **megabytes** before it will rotate into a smaller file. Defaults to 2GB.
## `log_directory_max_size`
The max size of the log directory in **megabytes** before old log files will be deleted to free up space. Defaults to 50GB.
## `log_rotation_hour_interval`
The time interval in **hours** after the last log rotation to automatically rotate the current log file. Defaults to 12 hours.
> [!NOTE]
> Log rotation based on time occurs in conjunction with size-based log rotation. For example, if a size-based log rotation occurs, the timer for the time-based rotation will reset.
## `log_tag_style`
Tag implementation to use. Must be one of:
- `uint`: Lock free and threadsafe but outputs just a simple unsigned integer
- `uuid`: Threadsafe and outputs a UUID tag
- `none`: Doesn't use tagging at all


@@ -77,7 +77,7 @@ It's possible to configure `minimum`, `maximum` and `default` version like so:
All of the above are optional.
Clio will fallback to hardcoded defaults when these values are not specified in the config file, or if the configured values are outside of the minimum and maximum supported versions hardcoded in [src/rpc/common/APIVersion.h](../src/rpc/common/APIVersion.hpp).
Clio will fallback to hardcoded defaults when these values are not specified in the config file, or if the configured values are outside of the minimum and maximum supported versions hardcoded in [src/rpc/common/APIVersion.hpp](../src/rpc/common/APIVersion.hpp).
> [!TIP]
> See the [example-config.json](../docs/examples/config/example-config.json) for more details.


@@ -36,19 +36,19 @@ EOF
exit 0
fi
# Check version of doxygen is at least 1.12
# Check version of doxygen is at least 1.14
version=$($DOXYGEN --version | grep -o '[0-9\.]*')
if [[ "1.12.0" > "$version" ]]; then
if [[ "1.14.0" > "$version" ]]; then
# No hard error if doxygen version is not the one we want - let CI deal with it
cat <<EOF
ERROR
-----------------------------------------------------------------------------
A minimum of version 1.12 of `which doxygen` is required.
Your version is $version. Please upgrade it for next time.
A minimum of version 1.14 of `which doxygen` is required.
Your version is $version. Please upgrade it.
Your changes may fail to pass CI once pushed.
Your changes may fail CI checks.
-----------------------------------------------------------------------------
EOF


@@ -5,11 +5,14 @@ import re
from pathlib import Path
PATTERN = r'R"JSON\((.*?)\)JSON"'
def use_uppercase(cpp_content: str) -> str:
return cpp_content.replace('R"json(', 'R"JSON(').replace(')json"', ')JSON"')
def fix_json_style(cpp_content: str) -> str:
cpp_content = cpp_content.replace('R"json(', 'R"JSON(').replace(')json"', ')JSON"')
pattern = r'R"JSON\((.*?)\)JSON"'
def replace_json(match):
raw_json = match.group(1)
@@ -29,12 +32,51 @@ def fix_json_style(cpp_content: str) -> str:
raw_json = raw_json.replace(f'":{digit}', f'": {digit}')
return f'R"JSON({raw_json})JSON"'
return re.sub(pattern, replace_json, cpp_content, flags=re.DOTALL)
return re.sub(PATTERN, replace_json, cpp_content, flags=re.DOTALL)
def fix_colon_spacing(cpp_content: str) -> str:
def replace_json(match):
raw_json = match.group(1)
raw_json = re.sub(r'":\n\s*(\[|\{)', r'": \1', raw_json)
return f'R"JSON({raw_json})JSON"'
return re.sub(PATTERN, replace_json, cpp_content, flags=re.DOTALL)
def fix_indentation(cpp_content: str) -> str:
lines = cpp_content.splitlines()
def find_indentation(line: str) -> int:
return len(line) - len(line.lstrip())
for (line_num, (line, next_line)) in enumerate(zip(lines[:-1], lines[1:])):
if "JSON(" in line and ")JSON" not in line:
indent = find_indentation(line)
next_indent = find_indentation(next_line)
by_how_much = next_indent - (indent + 4)
if by_how_much != 0:
print(
f"Indentation error at line: {line_num + 2}: expected {indent + 4} spaces, found {next_indent} spaces"
)
for i in range(line_num + 1, len(lines)):
if ")JSON" in lines[i]:
lines[i] = " " * indent + lines[i].lstrip()
break
lines[i] = lines[i][by_how_much:] if by_how_much > 0 else " " * (-by_how_much) + lines[i]
return "\n".join(lines) + "\n"
def process_file(file_path: Path, dry_run: bool) -> bool:
content = file_path.read_text(encoding="utf-8")
new_content = fix_json_style(content)
new_content = content
new_content = use_uppercase(new_content)
new_content = fix_json_style(new_content)
new_content = fix_colon_spacing(new_content)
new_content = fix_indentation(new_content)
if new_content != content:
print(f"Processing file: {file_path}")

src/README.md (new file)

@@ -0,0 +1,20 @@
# Clio API server
## Introduction
Clio is an XRP Ledger API server optimized for RPC calls over WebSocket or JSON-RPC.
It stores validated historical ledger and transaction data in a more space-efficient format, and uses up to 4 times
less space than [rippled](https://github.com/XRPLF/rippled).
Clio can be configured to store data in [Apache Cassandra](https://cassandra.apache.org/_/index.html) or
[ScyllaDB](https://www.scylladb.com/), enabling scalable read throughput. Multiple Clio nodes can share
access to the same dataset, which allows for a highly available cluster of Clio nodes without the need for redundant
data storage or computation.
## Develop
As you prepare to develop code for Clio, please be sure you are aware of our current
[Contribution guidelines](https://github.com/XRPLF/clio/blob/develop/CONTRIBUTING.md).
Read @ref "rpc" carefully to learn more about writing your own handlers for Clio.


@@ -23,6 +23,7 @@
#include "util/build/Build.hpp"
#include "util/config/ConfigDescription.hpp"
#include <boost/program_options/errors.hpp>
#include <boost/program_options/options_description.hpp>
#include <boost/program_options/parsers.hpp>
#include <boost/program_options/positional_options.hpp>
@@ -56,12 +57,22 @@ CliArgs::parse(int argc, char const* argv[])
po::positional_options_description positional;
positional.add("conf", 1);
auto const printHelp = [&description]() {
std::cout << "Clio server " << util::build::getClioFullVersionString() << "\n\n" << description;
};
po::variables_map parsed;
po::store(po::command_line_parser(argc, argv).options(description).positional(positional).run(), parsed);
po::notify(parsed);
try {
po::store(po::command_line_parser(argc, argv).options(description).positional(positional).run(), parsed);
po::notify(parsed);
} catch (po::error const& e) {
std::cerr << "Error: " << e.what() << std::endl << std::endl;
printHelp();
return Action{Action::Exit{EXIT_FAILURE}};
}
if (parsed.contains("help")) {
std::cout << "Clio server " << util::build::getClioFullVersionString() << "\n\n" << description;
printHelp();
return Action{Action::Exit{EXIT_SUCCESS}};
}
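Below is a hedged, standalone sketch of the same boost::program_options pattern as in the hunk above: store/notify wrapped in a try/catch so that a malformed command line prints usage and exits non-zero instead of letting the exception escape. The option set and messages here are illustrative, not clio's actual CLI surface.

```cpp
#include <boost/program_options.hpp>

#include <cstdlib>
#include <iostream>
#include <string>

namespace po = boost::program_options;

int
main(int argc, char const* argv[])
{
    po::options_description description("Options");
    description.add_options()                                     //
        ("help,h", "print help message and exit")                 //
        ("conf", po::value<std::string>(), "configuration file");

    po::positional_options_description positional;
    positional.add("conf", 1);

    po::variables_map parsed;
    try {
        po::store(po::command_line_parser(argc, argv).options(description).positional(positional).run(), parsed);
        po::notify(parsed);
    } catch (po::error const& e) {
        // e.g. an unrecognised option or a missing option value
        std::cerr << "Error: " << e.what() << "\n\n" << description;
        return EXIT_FAILURE;
    }

    if (parsed.contains("help")) {
        std::cout << description;
        return EXIT_SUCCESS;
    }

    // ... continue with the parsed options ...
    return EXIT_SUCCESS;
}
```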


@@ -178,7 +178,9 @@ ClioApplication::run(bool const useNgWebServer)
}
auto const adminVerifier = std::move(expectedAdminVerifier).value();
auto httpServer = web::ng::makeServer(config_, OnConnectCheck{dosGuard}, DisconnectHook{dosGuard}, ioc);
auto httpServer = web::ng::makeServer(
config_, OnConnectCheck{dosGuard}, IpChangeHook{dosGuard}, DisconnectHook{dosGuard}, ioc
);
if (not httpServer.has_value()) {
LOG(util::LogService::error()) << "Error creating web server: " << httpServer.error();
@@ -187,6 +189,7 @@ ClioApplication::run(bool const useNgWebServer)
httpServer->onGet("/metrics", MetricsHandler{adminVerifier});
httpServer->onGet("/health", HealthCheckHandler{});
httpServer->onGet("/cache_state", CacheStateHandler{cache});
auto requestHandler = RequestHandler{adminVerifier, handler};
httpServer->onPost("/", requestHandler);
httpServer->onWs(std::move(requestHandler));
@@ -212,7 +215,7 @@ ClioApplication::run(bool const useNgWebServer)
// Init the web server
auto handler = std::make_shared<web::RPCServerHandler<RPCEngineType>>(config_, backend, rpcEngine, etl, dosGuard);
auto const httpServer = web::makeHttpServer(config_, ioc, dosGuard, handler);
auto const httpServer = web::makeHttpServer(config_, ioc, dosGuard, handler, cache);
// Blocks until stopped.
// When stopped, shared_ptrs fall out of scope


@@ -108,6 +108,8 @@ public:
ioc.stop();
LOG(util::LogService::info()) << "io_context stopped";
util::LogService::shutdown();
};
}
};


@@ -33,6 +33,7 @@
#include <memory>
#include <optional>
#include <string>
#include <utility>
namespace app {
@@ -54,6 +55,17 @@ OnConnectCheck::operator()(web::ng::Connection const& connection)
return {};
}
IpChangeHook::IpChangeHook(web::dosguard::DOSGuardInterface& dosguard) : dosguard_(dosguard)
{
}
void
IpChangeHook::operator()(std::string const& oldIp, std::string const& newIp)
{
dosguard_.get().decrement(oldIp);
dosguard_.get().increment(newIp);
}
DisconnectHook::DisconnectHook(web::dosguard::DOSGuardInterface& dosguard) : dosguard_{dosguard}
{
}
@@ -108,4 +120,34 @@ HealthCheckHandler::operator()(
return web::ng::Response{boost::beast::http::status::ok, kHEALTH_CHECK_HTML, request};
}
web::ng::Response
CacheStateHandler::operator()(
web::ng::Request const& request,
web::ng::ConnectionMetadata&,
web::SubscriptionContextPtr,
boost::asio::yield_context
)
{
static constexpr auto kCACHE_CHECK_LOADED_HTML = R"html(
<!DOCTYPE html>
<html>
<head><title>Cache state</title></head>
<body><h1>Cache state</h1><p>Cache is fully loaded</p></body>
</html>
)html";
static constexpr auto kCACHE_CHECK_NOT_LOADED_HTML = R"html(
<!DOCTYPE html>
<html>
<head><title>Cache state</title></head>
<body><h1>Cache state</h1><p>Cache is not yet loaded</p></body>
</html>
)html";
if (cache_.get().isFull())
return web::ng::Response{boost::beast::http::status::ok, kCACHE_CHECK_LOADED_HTML, request};
return web::ng::Response{boost::beast::http::status::service_unavailable, kCACHE_CHECK_NOT_LOADED_HTML, request};
}
} // namespace app
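A minimal, self-contained sketch of the idea behind IpChangeHook, using a fake DOS guard instead of clio's web::dosguard::DOSGuardInterface (the fake type and its counter map are invented for illustration): when a proxy reveals the real client IP, the per-IP accounting moves from the old address to the new one.

```cpp
#include <functional>
#include <iostream>
#include <map>
#include <string>

// Stand-in for the DOS guard: it only tracks how many connections each IP owns.
struct FakeDosGuard {
    std::map<std::string, int> connectionsPerIp;

    void increment(std::string const& ip) { ++connectionsPerIp[ip]; }
    void decrement(std::string const& ip) { --connectionsPerIp[ip]; }
};

// Same shape as the hook above: hold the guard by reference_wrapper and move
// the accounting from the old IP to the new one when called.
class IpChangeHookSketch {
    std::reference_wrapper<FakeDosGuard> guard_;

public:
    explicit IpChangeHookSketch(FakeDosGuard& guard) : guard_(guard) {}

    void
    operator()(std::string const& oldIp, std::string const& newIp)
    {
        guard_.get().decrement(oldIp);  // the TCP peer no longer owns this connection
        guard_.get().increment(newIp);  // the proxied client IP takes it over
    }
};

int
main()
{
    FakeDosGuard guard;
    guard.increment("10.0.0.1");      // connection initially accounted to the TCP peer
    IpChangeHookSketch hook{guard};
    hook("10.0.0.1", "203.0.113.7");  // proxy header reveals the real client IP
    std::cout << guard.connectionsPerIp["10.0.0.1"] << " " << guard.connectionsPerIp["203.0.113.7"] << "\n";  // 0 1
}
```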


@@ -19,6 +19,7 @@
#pragma once
#include "data/LedgerCacheInterface.hpp"
#include "rpc/Errors.hpp"
#include "util/log/Logger.hpp"
#include "web/AdminVerificationStrategy.hpp"
@@ -36,6 +37,7 @@
#include <exception>
#include <functional>
#include <memory>
#include <string>
#include <utility>
namespace app {
@@ -64,6 +66,31 @@ public:
operator()(web::ng::Connection const& connection);
};
/**
* @brief A function object that is called when the IP of a connection changes (usually when a proxy is detected).
* This is used to update the DOS guard.
*/
class IpChangeHook {
std::reference_wrapper<web::dosguard::DOSGuardInterface> dosguard_;
public:
/**
* @brief Construct a new IpChangeHook object.
*
* @param dosguard The DOS guard to use.
*/
IpChangeHook(web::dosguard::DOSGuardInterface& dosguard);
/**
* @brief The call of the function object.
*
* @param oldIp The old IP of the connection.
* @param newIp The new IP of the connection.
*/
void
operator()(std::string const& oldIp, std::string const& newIp);
};
/**
* @brief A function object to be called when a connection is disconnected.
*/
@@ -137,6 +164,37 @@ public:
);
};
/**
* @brief A function object that handles the cache state check endpoint.
*/
class CacheStateHandler {
std::reference_wrapper<data::LedgerCacheInterface const> cache_;
public:
/**
* @brief Construct a new CacheStateHandler object.
*
* @param cache The ledger cache to use.
*/
CacheStateHandler(data::LedgerCacheInterface const& cache) : cache_{cache}
{
}
/**
* @brief The call of the function object.
*
* @param request The request to handle.
* @return The response to the request
*/
web::ng::Response
operator()(
web::ng::Request const& request,
web::ng::ConnectionMetadata&,
web::SubscriptionContextPtr,
boost::asio::yield_context
);
};
/**
* @brief A function object that handles the websocket endpoint.
*


@@ -21,11 +21,16 @@
#include "cluster/ClioNode.hpp"
#include "data/BackendInterface.hpp"
#include "util/Assert.hpp"
#include "util/Spawn.hpp"
#include "util/log/Logger.hpp"
#include <boost/asio/bind_cancellation_slot.hpp>
#include <boost/asio/cancellation_type.hpp>
#include <boost/asio/error.hpp>
#include <boost/asio/spawn.hpp>
#include <boost/asio/steady_timer.hpp>
#include <boost/asio/use_future.hpp>
#include <boost/json/parse.hpp>
#include <boost/json/serialize.hpp>
#include <boost/json/value.hpp>
@@ -36,11 +41,16 @@
#include <chrono>
#include <ctime>
#include <latch>
#include <memory>
#include <string>
#include <utility>
#include <vector>
namespace {
constexpr auto kTOTAL_WORKERS = 2uz; // 1 reading and 1 writing worker (coroutines)
} // namespace
namespace cluster {
ClusterCommunicationService::ClusterCommunicationService(
@@ -51,6 +61,7 @@ ClusterCommunicationService::ClusterCommunicationService(
: backend_(std::move(backend))
, readInterval_(readInterval)
, writeInterval_(writeInterval)
, finishedCountdown_(kTOTAL_WORKERS)
, selfData_{ClioNode{
.uuid = std::make_shared<boost::uuids::uuid>(boost::uuids::random_generator{}()),
.updateTime = std::chrono::system_clock::time_point{}
@@ -63,22 +74,42 @@ ClusterCommunicationService::ClusterCommunicationService(
void
ClusterCommunicationService::run()
{
ASSERT(not running_ and not stopped_, "Can only be run once");
running_ = true;
util::spawn(strand_, [this](boost::asio::yield_context yield) {
boost::asio::steady_timer timer(yield.get_executor());
while (true) {
boost::system::error_code ec;
while (running_) {
timer.expires_after(readInterval_);
timer.async_wait(yield);
auto token = cancelSignal_.slot();
timer.async_wait(boost::asio::bind_cancellation_slot(token, yield[ec]));
if (ec == boost::asio::error::operation_aborted or not running_)
break;
doRead(yield);
}
finishedCountdown_.count_down(1);
});
util::spawn(strand_, [this](boost::asio::yield_context yield) {
boost::asio::steady_timer timer(yield.get_executor());
while (true) {
boost::system::error_code ec;
while (running_) {
doWrite();
timer.expires_after(writeInterval_);
timer.async_wait(yield);
auto token = cancelSignal_.slot();
timer.async_wait(boost::asio::bind_cancellation_slot(token, yield[ec]));
if (ec == boost::asio::error::operation_aborted or not running_)
break;
}
finishedCountdown_.count_down(1);
});
}
@@ -93,9 +124,19 @@ ClusterCommunicationService::stop()
if (stopped_)
return;
ctx_.stop();
ctx_.join();
stopped_ = true;
// For ASAN to see through the concurrency correctly we need to exit all coroutines before joining the ctx
running_ = false;
// cancelSignal_ is not thread-safe, so we emit the cancellation on the same strand
boost::asio::spawn(
strand_, [this](auto&&) { cancelSignal_.emit(boost::asio::cancellation_type::all); }, boost::asio::use_future
)
.wait();
finishedCountdown_.wait();
ctx_.join();
}
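The shutdown choreography above (flag off, emit the cancellation signal on the strand, wait on the latch, only then join the context) can be distilled into the following hedged sketch. It is not the clio class itself; it illustrates the pattern with plain boost::asio, a single worker coroutine, and an invented tick body.

```cpp
#include <boost/asio/bind_cancellation_slot.hpp>
#include <boost/asio/cancellation_signal.hpp>
#include <boost/asio/cancellation_type.hpp>
#include <boost/asio/detached.hpp>
#include <boost/asio/error.hpp>
#include <boost/asio/spawn.hpp>
#include <boost/asio/steady_timer.hpp>
#include <boost/asio/strand.hpp>
#include <boost/asio/thread_pool.hpp>
#include <boost/asio/use_future.hpp>
#include <boost/system/error_code.hpp>

#include <atomic>
#include <chrono>
#include <iostream>
#include <latch>
#include <thread>

class PeriodicWorker {
    boost::asio::thread_pool ctx_{1};
    boost::asio::strand<boost::asio::thread_pool::executor_type> strand_{ctx_.get_executor()};
    boost::asio::cancellation_signal cancelSignal_;
    std::latch finished_{1};  // one worker coroutine
    std::atomic_bool running_{false};

public:
    void
    run()
    {
        running_ = true;
        boost::asio::spawn(
            strand_,
            [this](boost::asio::yield_context yield) {
                boost::asio::steady_timer timer(yield.get_executor());
                boost::system::error_code ec;
                while (running_) {
                    timer.expires_after(std::chrono::milliseconds{200});
                    // Bind the wait to a fresh cancellation slot so stop() can interrupt it
                    timer.async_wait(boost::asio::bind_cancellation_slot(cancelSignal_.slot(), yield[ec]));
                    if (ec == boost::asio::error::operation_aborted or not running_)
                        break;
                    std::cout << "tick\n";  // the real service reads/writes cluster data here
                }
                finished_.count_down();  // signal that this coroutine has fully exited
            },
            boost::asio::detached
        );
    }

    void
    stop()
    {
        running_ = false;
        // The cancellation signal is not thread safe, so emit it on the same strand
        boost::asio::spawn(
            strand_,
            [this](boost::asio::yield_context) { cancelSignal_.emit(boost::asio::cancellation_type::all); },
            boost::asio::use_future
        )
            .wait();
        finished_.wait();  // all coroutines are done before joining the pool
        ctx_.join();
    }
};

int
main()
{
    PeriodicWorker worker;
    worker.run();
    std::this_thread::sleep_for(std::chrono::milliseconds{500});
    worker.stop();
}
```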
std::shared_ptr<boost::uuids::uuid>


@@ -27,12 +27,15 @@
#include "util/prometheus/Gauge.hpp"
#include "util/prometheus/Prometheus.hpp"
#include <boost/asio/cancellation_signal.hpp>
#include <boost/asio/spawn.hpp>
#include <boost/asio/strand.hpp>
#include <boost/asio/thread_pool.hpp>
#include <boost/uuid/uuid.hpp>
#include <atomic>
#include <chrono>
#include <latch>
#include <memory>
#include <string>
#include <vector>
@@ -65,11 +68,14 @@ class ClusterCommunicationService : public ClusterCommunicationServiceInterface
std::chrono::steady_clock::duration readInterval_;
std::chrono::steady_clock::duration writeInterval_;
boost::asio::cancellation_signal cancelSignal_;
std::latch finishedCountdown_;
std::atomic_bool running_ = false;
bool stopped_ = false;
ClioNode selfData_;
std::vector<ClioNode> otherNodesData_;
bool stopped_ = false;
public:
static constexpr std::chrono::milliseconds kDEFAULT_READ_INTERVAL{2100};
static constexpr std::chrono::milliseconds kDEFAULT_WRITE_INTERVAL{1200};


@@ -68,7 +68,6 @@ struct Amendments {
/** @cond */
// NOLINTBEGIN(readability-identifier-naming)
REGISTER(OwnerPaysFee);
REGISTER(Flow);
REGISTER(FlowCross);
REGISTER(fix1513);
@@ -145,6 +144,9 @@ struct Amendments {
REGISTER(TokenEscrow);
REGISTER(fixAMMv1_3);
REGISTER(fixEnforceNFTokenTrustlineV2);
REGISTER(fixAMMClawbackRounding);
REGISTER(fixMPTDeliveredAmount);
REGISTER(fixPriceOracleOrder);
// Obsolete but supported by libxrpl
REGISTER(CryptoConditionsSuite);
@@ -153,6 +155,7 @@ struct Amendments {
REGISTER(fixNFTokenNegOffer);
// Retired amendments
REGISTER(OwnerPaysFee); // Removed in xrpl 2.6.0 (https://github.com/XRPLF/rippled/pull/5435)
REGISTER(MultiSign);
REGISTER(TrustSetAuth);
REGISTER(FeeEscalation);


@@ -21,6 +21,7 @@
#include "data/BackendInterface.hpp"
#include "data/CassandraBackend.hpp"
#include "data/KeyspaceBackend.hpp"
#include "data/LedgerCacheInterface.hpp"
#include "data/cassandra/SettingsProvider.hpp"
#include "util/config/ConfigDefinition.hpp"
@@ -45,6 +46,7 @@ namespace data {
inline std::shared_ptr<BackendInterface>
makeBackend(util::config::ClioConfigDefinition const& config, data::LedgerCacheInterface& cache)
{
using namespace cassandra::impl;
static util::Logger const log{"Backend"}; // NOLINT(readability-identifier-naming)
LOG(log.info()) << "Constructing BackendInterface";
@@ -55,9 +57,15 @@ makeBackend(util::config::ClioConfigDefinition const& config, data::LedgerCacheI
if (boost::iequals(type, "cassandra")) {
auto const cfg = config.getObject("database." + type);
backend = std::make_shared<data::cassandra::CassandraBackend>(
data::cassandra::SettingsProvider{cfg}, cache, readOnly
);
if (providerFromString(cfg.getValueView("provider").asString()) == Provider::Keyspace) {
backend = std::make_shared<data::cassandra::KeyspaceBackend>(
data::cassandra::SettingsProvider{cfg}, cache, readOnly
);
} else {
backend = std::make_shared<data::cassandra::CassandraBackend>(
data::cassandra::SettingsProvider{cfg}, cache, readOnly
);
}
}
if (!backend)
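The provider switch above relies on a providerFromString helper from clio's cassandra implementation. As a rough, assumed illustration only (the accepted strings and the enum live in the real code and may differ), such a mapping could look like this:

```cpp
#include <boost/algorithm/string/predicate.hpp>

#include <stdexcept>
#include <string>

namespace sketch {

enum class Provider { Cassandra, Keyspace };

// Assumed mapping from the "provider" config value to the Provider enum;
// the string values here are guesses, not clio's documented configuration.
inline Provider
providerFromString(std::string const& value)
{
    if (boost::iequals(value, "aws_keyspaces") or boost::iequals(value, "keyspace"))
        return Provider::Keyspace;
    if (boost::iequals(value, "cassandra") or boost::iequals(value, "scylladb"))
        return Provider::Cassandra;
    throw std::runtime_error("Unknown database provider: " + value);
}

}  // namespace sketch
```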


@@ -43,11 +43,6 @@
#include <utility>
#include <vector>
// local to compilation unit loggers
namespace {
util::Logger gLog{"Backend"};
} // namespace
/**
* @brief This namespace implements the data access layer and related components.
*
@@ -58,10 +53,10 @@ namespace data {
bool
BackendInterface::finishWrites(std::uint32_t const ledgerSequence)
{
LOG(gLog.debug()) << "Want finish writes for " << ledgerSequence;
LOG(log_.debug()) << "Want finish writes for " << ledgerSequence;
auto commitRes = doFinishWrites();
if (commitRes) {
LOG(gLog.debug()) << "Successfully committed. Updating range now to " << ledgerSequence;
LOG(log_.debug()) << "Successfully committed. Updating range now to " << ledgerSequence;
updateRange(ledgerSequence);
}
return commitRes;
@@ -89,15 +84,15 @@ BackendInterface::fetchLedgerObject(
{
auto obj = cache_.get().get(key, sequence);
if (obj) {
LOG(gLog.trace()) << "Cache hit - " << ripple::strHex(key);
LOG(log_.trace()) << "Cache hit - " << ripple::strHex(key);
return obj;
}
auto dbObj = doFetchLedgerObject(key, sequence, yield);
if (!dbObj) {
LOG(gLog.trace()) << "Missed cache and missed in db";
LOG(log_.trace()) << "Missed cache and missed in db";
} else {
LOG(gLog.trace()) << "Missed cache but found in db";
LOG(log_.trace()) << "Missed cache but found in db";
}
return dbObj;
}
@@ -111,7 +106,7 @@ BackendInterface::fetchLedgerObjectSeq(
{
auto seq = doFetchLedgerObjectSeq(key, sequence, yield);
if (!seq)
LOG(gLog.trace()) << "Missed in db";
LOG(log_.trace()) << "Missed in db";
return seq;
}
@@ -133,7 +128,7 @@ BackendInterface::fetchLedgerObjects(
misses.push_back(keys[i]);
}
}
LOG(gLog.trace()) << "Cache hits = " << keys.size() - misses.size() << " - cache misses = " << misses.size();
LOG(log_.trace()) << "Cache hits = " << keys.size() - misses.size() << " - cache misses = " << misses.size();
if (!misses.empty()) {
auto objs = doFetchLedgerObjects(misses, sequence, yield);
@@ -158,9 +153,9 @@ BackendInterface::fetchSuccessorKey(
{
auto succ = cache_.get().getSuccessor(key, ledgerSequence);
if (succ) {
LOG(gLog.trace()) << "Cache hit - " << ripple::strHex(key);
LOG(log_.trace()) << "Cache hit - " << ripple::strHex(key);
} else {
LOG(gLog.trace()) << "Cache miss - " << ripple::strHex(key);
LOG(log_.trace()) << "Cache miss - " << ripple::strHex(key);
}
return succ ? succ->key : doFetchSuccessorKey(key, ledgerSequence, yield);
}
@@ -210,7 +205,7 @@ BackendInterface::fetchBookOffers(
numSucc++;
succMillis += getMillis(mid2 - mid1);
if (!offerDir || offerDir->key >= bookEnd) {
LOG(gLog.trace()) << "offerDir.has_value() " << offerDir.has_value() << " breaking";
LOG(log_.trace()) << "offerDir.has_value() " << offerDir.has_value() << " breaking";
break;
}
uTipIndex = offerDir->key;
@@ -223,7 +218,7 @@ BackendInterface::fetchBookOffers(
keys.insert(keys.end(), indexes.begin(), indexes.end());
auto next = sle.getFieldU64(ripple::sfIndexNext);
if (next == 0u) {
LOG(gLog.trace()) << "Next is empty. breaking";
LOG(log_.trace()) << "Next is empty. breaking";
break;
}
auto nextKey = ripple::keylet::page(uTipIndex, next);
@@ -238,13 +233,13 @@ BackendInterface::fetchBookOffers(
auto mid = std::chrono::system_clock::now();
auto objs = fetchLedgerObjects(keys, ledgerSequence, yield);
for (size_t i = 0; i < keys.size() && i < limit; ++i) {
LOG(gLog.trace()) << "Key = " << ripple::strHex(keys[i]) << " blob = " << ripple::strHex(objs[i])
LOG(log_.trace()) << "Key = " << ripple::strHex(keys[i]) << " blob = " << ripple::strHex(objs[i])
<< " ledgerSequence = " << ledgerSequence;
ASSERT(!objs[i].empty(), "Ledger object can't be empty");
page.offers.push_back({keys[i], objs[i]});
}
auto end = std::chrono::system_clock::now();
LOG(gLog.debug()) << "Fetching " << std::to_string(keys.size()) << " offers took "
LOG(log_.debug()) << "Fetching " << std::to_string(keys.size()) << " offers took "
<< std::to_string(getMillis(mid - begin)) << " milliseconds. Fetching next dir took "
<< std::to_string(succMillis) << " milliseconds. Fetched next dir " << std::to_string(numSucc)
<< " times"
@@ -275,14 +270,17 @@ BackendInterface::updateRange(uint32_t newMax)
{
std::scoped_lock const lck(rngMtx_);
ASSERT(
!range_ || newMax >= range_->maxSequence,
"Range shouldn't exist yet or newMax should be greater. newMax = {}, range->maxSequence = {}",
newMax,
range_->maxSequence
);
if (range_.has_value() && newMax < range_->maxSequence) {
ASSERT(
false,
"Range shouldn't exist yet or newMax should be at least range->maxSequence. newMax = {}, "
"range->maxSequence = {}",
newMax,
range_->maxSequence
);
}
if (!range_) {
if (!range_.has_value()) {
range_ = {.minSequence = newMax, .maxSequence = newMax};
} else {
range_->maxSequence = newMax;
@@ -338,13 +336,13 @@ BackendInterface::fetchLedgerPage(
if (!objects[i].empty()) {
page.objects.push_back({keys[i], std::move(objects[i])});
} else if (!outOfOrder) {
LOG(gLog.error()) << "Deleted or non-existent object in successor table. key = " << ripple::strHex(keys[i])
LOG(log_.error()) << "Deleted or non-existent object in successor table. key = " << ripple::strHex(keys[i])
<< " - seq = " << ledgerSequence;
std::stringstream msg;
for (size_t j = 0; j < objects.size(); ++j) {
msg << " - " << ripple::strHex(keys[j]);
}
LOG(gLog.error()) << msg.str();
LOG(log_.error()) << msg.str();
if (corruptionDetector_.has_value())
corruptionDetector_->onCorruptionDetected();
@@ -365,7 +363,7 @@ BackendInterface::fetchFees(std::uint32_t const seq, boost::asio::yield_context
auto bytes = fetchLedgerObject(key, seq, yield);
if (!bytes) {
LOG(gLog.error()) << "Could not find fees";
LOG(log_.error()) << "Could not find fees";
return {};
}


@@ -138,6 +138,7 @@ synchronousAndRetryOnTimeout(FnType&& func)
*/
class BackendInterface {
protected:
util::Logger log_{"Backend"};
mutable std::shared_mutex rngMtx_;
std::optional<LedgerRange> range_;
std::reference_wrapper<LedgerCacheInterface> cache_;
@@ -294,7 +295,7 @@ public:
* @param account The account to fetch transactions for
* @param limit The maximum number of transactions per result page
* @param forward Whether to fetch the page forwards or backwards from the given cursor
* @param cursor The cursor to resume fetching from
* @param txnCursor The cursor to resume fetching from
* @param yield The coroutine context
* @return Results and a cursor to resume from
*/
@@ -303,7 +304,7 @@ public:
ripple::AccountID const& account,
std::uint32_t limit,
bool forward,
std::optional<TransactionsCursor> const& cursor,
std::optional<TransactionsCursor> const& txnCursor,
boost::asio::yield_context yield
) const = 0;

File diff suppressed because it is too large.


@@ -0,0 +1,309 @@
//------------------------------------------------------------------------------
/*
This file is part of clio: https://github.com/XRPLF/clio
Copyright (c) 2025, the clio developers.
Permission to use, copy, modify, and distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
*/
//==============================================================================
#pragma once
#include "data/LedgerHeaderCache.hpp"
#include "data/Types.hpp"
#include "data/cassandra/CassandraBackendFamily.hpp"
#include "data/cassandra/Concepts.hpp"
#include "data/cassandra/KeyspaceSchema.hpp"
#include "data/cassandra/SettingsProvider.hpp"
#include "data/cassandra/Types.hpp"
#include "data/cassandra/impl/ExecutionStrategy.hpp"
#include "util/Assert.hpp"
#include "util/log/Logger.hpp"
#include <boost/asio/spawn.hpp>
#include <boost/json/object.hpp>
#include <boost/uuid/string_generator.hpp>
#include <boost/uuid/uuid.hpp>
#include <cassandra.h>
#include <fmt/format.h>
#include <xrpl/basics/Blob.h>
#include <xrpl/basics/base_uint.h>
#include <xrpl/basics/strHex.h>
#include <xrpl/protocol/AccountID.h>
#include <xrpl/protocol/Indexes.h>
#include <xrpl/protocol/LedgerHeader.h>
#include <xrpl/protocol/nft.h>
#include <cstddef>
#include <cstdint>
#include <iterator>
#include <optional>
#include <stdexcept>
#include <utility>
#include <vector>
namespace data::cassandra {
/**
* @brief Implements @ref CassandraBackendFamily for Keyspace
*
* @tparam SettingsProviderType The settings provider type
* @tparam ExecutionStrategyType The execution strategy type
* @tparam FetchLedgerCacheType The ledger header cache type
*/
template <
SomeSettingsProvider SettingsProviderType,
SomeExecutionStrategy ExecutionStrategyType,
typename FetchLedgerCacheType = FetchLedgerCache>
class BasicKeyspaceBackend : public CassandraBackendFamily<
SettingsProviderType,
ExecutionStrategyType,
KeyspaceSchema<SettingsProviderType>,
FetchLedgerCacheType> {
using DefaultCassandraFamily = CassandraBackendFamily<
SettingsProviderType,
ExecutionStrategyType,
KeyspaceSchema<SettingsProviderType>,
FetchLedgerCacheType>;
using DefaultCassandraFamily::executor_;
using DefaultCassandraFamily::ledgerSequence_;
using DefaultCassandraFamily::log_;
using DefaultCassandraFamily::range_;
using DefaultCassandraFamily::schema_;
public:
/**
* @brief Inherit the constructors of the base class.
*/
using DefaultCassandraFamily::DefaultCassandraFamily;
/**
* @brief Move constructor is deleted because handle_ is shared by reference with executor
*/
BasicKeyspaceBackend(BasicKeyspaceBackend&&) = delete;
bool
doFinishWrites() override
{
this->waitForWritesToFinish();
// !range_.has_value() means the table 'ledger_range' is not populated;
// This would be the first write to the table.
// In this case, insert both min_sequence/max_sequence range into the table.
if (not range_.has_value()) {
executor_.writeSync(schema_->insertLedgerRange, /* isLatestLedger =*/false, ledgerSequence_);
executor_.writeSync(schema_->insertLedgerRange, /* isLatestLedger =*/true, ledgerSequence_);
}
if (not this->executeSyncUpdate(schema_->updateLedgerRange.bind(ledgerSequence_, true, ledgerSequence_ - 1))) {
log_.warn() << "Update failed for ledger " << ledgerSequence_;
return false;
}
log_.info() << "Committed ledger " << ledgerSequence_;
return true;
}
NFTsAndCursor
fetchNFTsByIssuer(
ripple::AccountID const& issuer,
std::optional<std::uint32_t> const& taxon,
std::uint32_t const ledgerSequence,
std::uint32_t const limit,
std::optional<ripple::uint256> const& cursorIn,
boost::asio::yield_context yield
) const override
{
std::vector<ripple::uint256> nftIDs;
if (taxon.has_value()) {
// Keyspaces and ScyllaDB use the same logic for taxon-filtered queries
nftIDs = fetchNFTIDsByTaxon(issuer, *taxon, limit, cursorIn, yield);
} else {
// Amazon Keyspaces Workflow for non-taxon queries
auto const startTaxon = cursorIn.has_value() ? ripple::nft::toUInt32(ripple::nft::getTaxon(*cursorIn)) : 0;
auto const startTokenID = cursorIn.value_or(ripple::uint256(0));
Statement const firstQuery = schema_->selectNFTIDsByIssuerTaxon.bind(issuer);
firstQuery.bindAt(1, startTaxon);
firstQuery.bindAt(2, startTokenID);
firstQuery.bindAt(3, Limit{limit});
auto const firstRes = executor_.read(yield, firstQuery);
if (firstRes.has_value()) {
for (auto const [nftID] : extract<ripple::uint256>(*firstRes))
nftIDs.push_back(nftID);
}
if (nftIDs.size() < limit) {
auto const remainingLimit = limit - nftIDs.size();
Statement const secondQuery = schema_->selectNFTsAfterTaxonKeyspaces.bind(issuer);
secondQuery.bindAt(1, startTaxon);
secondQuery.bindAt(2, Limit{remainingLimit});
auto const secondRes = executor_.read(yield, secondQuery);
if (secondRes.has_value()) {
for (auto const [nftID] : extract<ripple::uint256>(*secondRes))
nftIDs.push_back(nftID);
}
}
}
return populateNFTsAndCreateCursor(nftIDs, ledgerSequence, limit, yield);
}
/**
* @brief (Unsupported in Keyspaces) Fetches account root object indexes by page.
* @note Loading the cache by enumerating all accounts is currently unsupported by the AWS Keyspaces backend.
* This function's logic relies on "PER PARTITION LIMIT 1", which Keyspaces does not support, and there is
* no efficient alternative. This is acceptable as the cache is primarily loaded via diffs. Calling this
* function will throw an exception.
*
* @param number The total number of accounts to fetch.
* @param pageSize The maximum number of accounts per page.
* @param seq The accounts need to exist at this ledger sequence.
* @param yield The coroutine context.
* @return A vector of ripple::uint256 representing the account root hashes.
*/
std::vector<ripple::uint256>
fetchAccountRoots(
[[maybe_unused]] std::uint32_t number,
[[maybe_unused]] std::uint32_t pageSize,
[[maybe_unused]] std::uint32_t seq,
[[maybe_unused]] boost::asio::yield_context yield
) const override
{
ASSERT(false, "Fetching account roots is not supported by the Keyspaces backend.");
std::unreachable();
}
private:
std::vector<ripple::uint256>
fetchNFTIDsByTaxon(
ripple::AccountID const& issuer,
std::uint32_t const taxon,
std::uint32_t const limit,
std::optional<ripple::uint256> const& cursorIn,
boost::asio::yield_context yield
) const
{
std::vector<ripple::uint256> nftIDs;
Statement const statement = schema_->selectNFTIDsByIssuerTaxon.bind(issuer);
statement.bindAt(1, taxon);
statement.bindAt(2, cursorIn.value_or(ripple::uint256(0)));
statement.bindAt(3, Limit{limit});
auto const res = executor_.read(yield, statement);
if (res.has_value() && res->hasRows()) {
for (auto const [nftID] : extract<ripple::uint256>(*res))
nftIDs.push_back(nftID);
}
return nftIDs;
}
std::vector<ripple::uint256>
fetchNFTIDsWithoutTaxon(
ripple::AccountID const& issuer,
std::uint32_t const limit,
std::optional<ripple::uint256> const& cursorIn,
boost::asio::yield_context yield
) const
{
std::vector<ripple::uint256> nftIDs;
auto const startTaxon = cursorIn.has_value() ? ripple::nft::toUInt32(ripple::nft::getTaxon(*cursorIn)) : 0;
auto const startTokenID = cursorIn.value_or(ripple::uint256(0));
Statement firstQuery = schema_->selectNFTIDsByIssuerTaxon.bind(issuer);
firstQuery.bindAt(1, startTaxon);
firstQuery.bindAt(2, startTokenID);
firstQuery.bindAt(3, Limit{limit});
auto const firstRes = executor_.read(yield, firstQuery);
if (firstRes.has_value()) {
for (auto const [nftID] : extract<ripple::uint256>(*firstRes))
nftIDs.push_back(nftID);
}
if (nftIDs.size() < limit) {
auto const remainingLimit = limit - nftIDs.size();
Statement secondQuery = schema_->selectNFTsAfterTaxonKeyspaces.bind(issuer);
secondQuery.bindAt(1, startTaxon);
secondQuery.bindAt(2, Limit{remainingLimit});
auto const secondRes = executor_.read(yield, secondQuery);
if (secondRes.has_value()) {
for (auto const [nftID] : extract<ripple::uint256>(*secondRes))
nftIDs.push_back(nftID);
}
}
return nftIDs;
}
/**
* @brief Takes a list of NFT IDs, fetches their full data, and assembles the final result with a cursor.
*/
NFTsAndCursor
populateNFTsAndCreateCursor(
std::vector<ripple::uint256> const& nftIDs,
std::uint32_t const ledgerSequence,
std::uint32_t const limit,
boost::asio::yield_context yield
) const
{
if (nftIDs.empty()) {
LOG(log_.debug()) << "No rows returned";
return {};
}
NFTsAndCursor ret;
if (nftIDs.size() == limit)
ret.cursor = nftIDs.back();
// Prepare and execute queries to fetch NFT info and URIs in parallel.
std::vector<Statement> selectNFTStatements;
selectNFTStatements.reserve(nftIDs.size());
std::transform(
std::cbegin(nftIDs), std::cend(nftIDs), std::back_inserter(selectNFTStatements), [&](auto const& nftID) {
return schema_->selectNFT.bind(nftID, ledgerSequence);
}
);
std::vector<Statement> selectNFTURIStatements;
selectNFTURIStatements.reserve(nftIDs.size());
std::transform(
std::cbegin(nftIDs), std::cend(nftIDs), std::back_inserter(selectNFTURIStatements), [&](auto const& nftID) {
return schema_->selectNFTURI.bind(nftID, ledgerSequence);
}
);
auto const nftInfos = executor_.readEach(yield, selectNFTStatements);
auto const nftUris = executor_.readEach(yield, selectNFTURIStatements);
// Combine the results into final NFT objects.
for (auto i = 0u; i < nftIDs.size(); ++i) {
if (auto const maybeRow = nftInfos[i].template get<uint32_t, ripple::AccountID, bool>();
maybeRow.has_value()) {
auto [seq, owner, isBurned] = *maybeRow;
NFT nft(nftIDs[i], seq, owner, isBurned);
if (auto const maybeUri = nftUris[i].template get<ripple::Blob>(); maybeUri.has_value())
nft.uri = *maybeUri;
ret.nfts.push_back(nft);
}
}
return ret;
}
};
using KeyspaceBackend = BasicKeyspaceBackend<SettingsProvider, impl::DefaultExecutionStrategy<>>;
} // namespace data::cassandra
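The cursor contract implemented by populateNFTsAndCreateCursor (the cursor is set only when a full page of `limit` IDs came back, and callers feed it back in as `cursorIn`) can be demonstrated with the following self-contained sketch. The types below are simplified stand-ins for NFTsAndCursor and ripple::uint256, purely for illustration.

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <optional>
#include <vector>

struct PageResult {
    std::vector<std::uint64_t> ids;       // stand-in for the fetched NFT IDs
    std::optional<std::uint64_t> cursor;  // engaged only when ids.size() == limit
};

// Pretend data source with 25 items, paged the same way as the backend above.
PageResult
fetchPage(std::optional<std::uint64_t> cursor, std::size_t limit)
{
    PageResult result;
    std::uint64_t const start = cursor.value_or(0);
    for (std::uint64_t id = start + 1; id <= 25 && result.ids.size() < limit; ++id)
        result.ids.push_back(id);
    if (result.ids.size() == limit)
        result.cursor = result.ids.back();  // full page: tell the caller where to resume
    return result;
}

int
main()
{
    std::optional<std::uint64_t> cursor;
    do {
        auto const page = fetchPage(cursor, /* limit = */ 10);
        std::cout << "page of " << page.ids.size() << " items\n";  // 10, 10, 5
        cursor = page.cursor;  // disengaged once the last, partial page arrives
    } while (cursor.has_value());
}
```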


@@ -1,8 +1,10 @@
# Backend
# Backend
@page "backend" Backend
The backend of Clio is responsible for handling the proper reading and writing of past ledger data from and to a given database. Currently, Cassandra and ScyllaDB are the only supported databases that are production-ready.
To support additional database types, you can create new classes that implement the virtual methods in [BackendInterface.h](https://github.com/XRPLF/clio/blob/develop/src/data/BackendInterface.hpp). Then, leveraging the Factory Object Design Pattern, modify [BackendFactory.h](https://github.com/XRPLF/clio/blob/develop/src/data/BackendFactory.hpp) with logic that returns the new database interface if the relevant `type` is provided in Clio's configuration file.
To support additional database types, you can create new classes that implement the virtual methods in [BackendInterface.hpp](https://github.com/XRPLF/clio/blob/develop/src/data/BackendInterface.hpp). Then, leveraging the Factory Object Design Pattern, modify [BackendFactory.hpp](https://github.com/XRPLF/clio/blob/develop/src/data/BackendFactory.hpp) with logic that returns the new database interface if the relevant `type` is provided in Clio's configuration file.
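As a rough sketch of that factory pattern (with invented, simplified types rather than clio's real BackendInterface and configuration objects), adding a new database type amounts to a new subclass plus one more branch in the factory function:

```cpp
#include <iostream>
#include <memory>
#include <stdexcept>
#include <string>

// Simplified stand-in for BackendInterface, just to show the shape of the pattern.
struct MiniBackendInterface {
    virtual ~MiniBackendInterface() = default;
    virtual std::string name() const = 0;
};

struct MiniCassandraBackend : MiniBackendInterface {
    std::string name() const override { return "cassandra"; }
};

// A hypothetical new database type a contributor might add.
struct MiniPostgresBackend : MiniBackendInterface {
    std::string name() const override { return "postgres"; }
};

// Factory: returns the implementation matching the "type" from the configuration file.
std::shared_ptr<MiniBackendInterface>
makeMiniBackend(std::string const& type)
{
    if (type == "cassandra")
        return std::make_shared<MiniCassandraBackend>();
    if (type == "postgres")  // the branch a contributor would add for the new type
        return std::make_shared<MiniPostgresBackend>();
    throw std::runtime_error("Unknown database type: " + type);
}

int
main()
{
    std::cout << makeMiniBackend("postgres")->name() << "\n";
}
```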
## Data Model

Some files were not shown because too many files have changed in this diff.