Compare commits


56 Commits

Author SHA1 Message Date
Peter Chen
d001e35427 fix: authorized_credential elements in array not objects bug (#1744) (#1747)
fixes: #1743
2024-11-21 12:05:15 -05:00
Peter Chen
592af70f03 fix: Credential error message (#1738)
fixes #1737
2024-11-18 16:15:11 +00:00
Alex Kremer
33cf336964 feat: Upgrade to libxrpl 2.3.0-rc2 (#1736) 2024-11-18 16:13:59 +00:00
github-actions[bot]
16e07b90db style: clang-tidy auto fixes (#1735)
Fixes #1734. Please review and commit clang-tidy fixes.

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
2024-11-18 16:13:47 +00:00
Peter Chen
39419c8b58 feat: Add Support Credentials for Clio (#1712)
Rippled PR: [here](https://github.com/XRPLF/rippled/pull/5103)
2024-11-18 16:13:09 +00:00
github-actions[bot]
e38658a0d6 style: clang-tidy auto fixes (#1730)
Fixes #1729. Please review and commit clang-tidy fixes.

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
2024-11-18 16:12:40 +00:00
Shawn Xie
fb98a6a394 feat: Implement MPT changes (#1147)
Implements https://github.com/XRPLF/XRPL-Standards/tree/master/XLS-0033d-multi-purpose-tokens
2024-11-11 17:27:04 +00:00
dependabot[bot]
b8a8248c42 ci: Bump wandalen/wretry.action from 3.7.0 to 3.7.2 (#1723)
Bumps
[wandalen/wretry.action](https://github.com/wandalen/wretry.action) from
3.7.0 to 3.7.2.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Alex Kremer <akremer@ripple.com>
2024-11-11 17:27:04 +00:00
Alex Kremer
a092b7ae08 feat: Upgrade to libxrpl 2.3.0-rc1 (#1718)
Fixes #1717
2024-11-11 17:27:04 +00:00
github-actions[bot]
07438a2e02 style: clang-tidy auto fixes (#1720)
Fixes #1719. Please review and commit clang-tidy fixes.

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
2024-11-11 17:27:04 +00:00
Alex Kremer
09aa688de4 feat: ETLng Registry (#1713)
For #1597
2024-11-11 17:27:03 +00:00
dependabot[bot]
a7bff26fd6 ci: Bump wandalen/wretry.action from 3.5.0 to 3.7.0 (#1714)
Bumps
[wandalen/wretry.action](https://github.com/wandalen/wretry.action) from
3.5.0 to 3.7.0.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Alex Kremer <akremer@ripple.com>
2024-11-11 17:27:03 +00:00
Peter Chen
081adf1cae fix: Support Delete NFT (#1695)
Fixes #1677
2024-11-11 17:27:03 +00:00
cyan317
ffc9deb0f8 fix: Add queue size limit for websocket (#1701)
For slow clients, we disconnect if the outgoing message queue grows too long (a standalone sketch of the idea follows this entry).

---------

Co-authored-by: Sergey Kuznetsov <skuznetsov@ripple.com>
2024-11-11 17:27:02 +00:00
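To illustrate the queue-size limit idea from #1701 above: the following is a standalone, hypothetical sketch, not Clio's actual implementation. The `Connection` class and the limit value are made up for illustration only.

```cpp
#include <cstddef>
#include <deque>
#include <iostream>
#include <string>

// Hypothetical illustration of a per-connection send queue with a size limit.
// If a slow client lets the queue grow past the limit, the connection is closed.
class Connection {
    std::deque<std::string> sendQueue_;
    std::size_t maxQueueSize_;
    bool open_ = true;

public:
    explicit Connection(std::size_t maxQueueSize) : maxQueueSize_{maxQueueSize} {}

    bool enqueue(std::string message)
    {
        if (!open_)
            return false;
        if (sendQueue_.size() >= maxQueueSize_) {
            close();  // too many unsent messages: disconnect the slow client
            return false;
        }
        sendQueue_.push_back(std::move(message));
        return true;
    }

    void close()
    {
        open_ = false;
        sendQueue_.clear();
    }

    bool isOpen() const
    {
        return open_;
    }
};

int main()
{
    Connection conn{3};  // allow at most 3 queued messages (illustrative value)
    for (int i = 0; i < 5; ++i)
        conn.enqueue("update " + std::to_string(i));
    std::cout << (conn.isOpen() ? "still connected" : "disconnected: queue limit exceeded") << '\n';
}
```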
github-actions[bot]
717a29ecdf style: clang-tidy auto fixes (#1711)
Fixes #1710.

---------

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
Co-authored-by: Sergey Kuznetsov <skuznetsov@ripple.com>
2024-11-11 17:27:02 +00:00
Sergey Kuznetsov
e8db74456a ci: Fix nightly build (#1709)
Fixes #1703.
2024-11-11 17:27:02 +00:00
Sergey Kuznetsov
4947a83696 fix: Fix issues clang-tidy found (#1708)
Fixes #1706.
2024-11-11 17:27:01 +00:00
github-actions[bot]
164387cab0 style: clang-tidy auto fixes (#1705)
Fixes #1704. Please review and commit clang-tidy fixes.

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
2024-11-11 17:27:01 +00:00
Sergey Kuznetsov
b8f1deb90f refactor: Coroutine based webserver (#1699)
Adds the code of the new coroutine-based web server. The new server is not yet connected to Clio and is not ready to use.
For #919.
2024-11-11 17:27:01 +00:00
Sergey Kuznetsov
5c77e59374 fix: Fix timer spurious calls (#1700)
Fixes #1634.
I also checked other timers and they don't have the issue.
2024-11-11 17:27:00 +00:00
Peter Chen
6d070132c7 fix: example config syntax (#1696) 2024-11-11 17:27:00 +00:00
cyan317
d2dda69448 fix: Remove log (#1694) 2024-11-11 17:27:00 +00:00
cyan317
e2aeaa0956 chore: Add counter for total messages waiting to be sent (#1691) 2024-11-11 17:27:00 +00:00
Sergey Kuznetsov
2951b4aaa0 style: Fix include (#1687)
Fixes #1686
2024-11-11 17:26:59 +00:00
cyan317
6c3c761dd1 chore: Remove unused static variables (#1683) 2024-11-11 17:26:59 +00:00
github-actions[bot]
527020680a style: clang-tidy auto fixes (#1685)
Fixes #1684. Please review and commit clang-tidy fixes.

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
2024-11-11 17:26:59 +00:00
Alex Kremer
401448f771 style: Update code formatting (#1682)
For #1664
2024-11-11 17:26:58 +00:00
Alex Kremer
0f12a6d7f2 chore: Upgrade to llvm 19 tooling (#1681)
For #1664
2024-11-11 17:26:58 +00:00
Peter Chen
5c8fc939f2 fix: deletion script will not OOM (#1679)
fixes #1676 and #1678
2024-11-11 17:26:58 +00:00
github-actions[bot]
b1be848098 style: clang-tidy auto fixes (#1674)
Fixes #1673. Please review and commit clang-tidy fixes.

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
2024-11-11 17:26:58 +00:00
cyan317
41aabbfcce feat: server info cache (#1671)
Fixes #1181.
2024-11-11 17:26:57 +00:00
dependabot[bot]
c00d25aa6b chore: Bump ytanikin/PRConventionalCommits from 1.2.0 to 1.3.0 (#1670)
Bumps
[ytanikin/PRConventionalCommits](https://github.com/ytanikin/prconventionalcommits)
from 1.2.0 to 1.3.0.

Release notes for 1.3.0, sourced from [ytanikin/PRConventionalCommits's releases](https://github.com/ytanikin/prconventionalcommits/releases):

What's Changed:
- fix: Set breaking changes regex by [@alexangas](https://github.com/alexangas) in [ytanikin/PRConventionalCommits#24](https://redirect.github.com/ytanikin/PRConventionalCommits/pull/24)

New Contributors:
- [@alexangas](https://github.com/alexangas) made their first contribution in [ytanikin/PRConventionalCommits#24](https://redirect.github.com/ytanikin/PRConventionalCommits/pull/24)

Full Changelog: [1.2.0...1.3.0](https://github.com/ytanikin/PRConventionalCommits/compare/1.2.0...1.3.0)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-11 17:26:57 +00:00
Alex Kremer
8d5c588e35 chore: Apply commits for 2.3.0-b4 (#1725) 2024-11-11 14:37:31 +00:00
Sergey Kuznetsov
9df3e936cc chore: Update libxrpl to 2.3.0-b4 (#1667) 2024-09-25 14:44:03 +01:00
Alex Kremer
4166c46820 fix: Workaround for gcc12 bug with defaulted destructors (#1666)
Fixes #1662
2024-09-25 14:44:03 +01:00
github-actions[bot]
f75cbd456b style: clang-tidy auto fixes (#1663)
Fixes #1662.

---------

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
Co-authored-by: Peter Chen <ychen@ripple.com>
2024-09-25 14:44:03 +01:00
Peter Chen
d189651821 fix: add no lint to ignore clang-tidy (#1660)
Fixes build for
[#1659](https://github.com/XRPLF/clio/actions/runs/10956058143/job/30421296417)
2024-09-25 14:44:02 +01:00
github-actions[bot]
3f791c1315 style: clang-tidy auto fixes (#1659)
Fixes #1658. Please review and commit clang-tidy fixes.

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
2024-09-25 14:44:02 +01:00
Peter Chen
418511332e chore: Revert Cassandra driver upgrade (#1656)
Reverts XRPLF/clio#1646
2024-09-25 14:44:02 +01:00
Peter Chen
e5a0477352 refactor: Clio Config (#1593)
Adds constraints and parses JSON into the Config. This is the second part of refactoring Clio Config; the first PR can be found [here](https://github.com/XRPLF/clio/pull/1544). A hypothetical sketch of the constraint idea follows this entry.

Steps left to implement:
- Replace all the places where config values are fetched (via config.valueOr/MaybeValue) so that they instead come from the Config Definition
- Generate a markdown file from the Clio Config Description
2024-09-25 14:44:02 +01:00
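A rough, standalone sketch of the "constraint + parse" idea from #1593 above. The names (`ConfigValue`, the config keys) are hypothetical and not the actual Clio Config API; a plain map stands in for a parsed JSON document.

```cpp
#include <cstdint>
#include <functional>
#include <iostream>
#include <map>
#include <stdexcept>
#include <string>

// Hypothetical sketch: a config entry that validates its value against a constraint.
struct ConfigValue {
    std::int64_t value;
    std::function<bool(std::int64_t)> constraint;

    void set(std::int64_t newValue)
    {
        if (constraint && !constraint(newValue))
            throw std::runtime_error("constraint violated");
        value = newValue;
    }
};

int main()
{
    // Stand-in for a parsed JSON object (keys are illustrative, not real Clio keys).
    std::map<std::string, std::int64_t> const parsedJson{{"server.port", 51233}, {"cache.num_threads", 0}};

    std::map<std::string, ConfigValue> config{
        {"server.port", {51233, [](std::int64_t v) { return v > 0 && v < 65536; }}},
        {"cache.num_threads", {2, [](std::int64_t v) { return v >= 1; }}},
    };

    for (auto const& [key, raw] : parsedJson) {
        try {
            config.at(key).set(raw);
            std::cout << key << " = " << config.at(key).value << '\n';
        } catch (std::exception const& e) {
            std::cout << key << ": rejected (" << e.what() << ")\n";
        }
    }
}
```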
cyan317
3118110eb8 feat: add 'force_forward' field to request (#1647)
Fix #1141
2024-09-25 14:44:01 +01:00
Alex Kremer
6d20f39f67 feat: Delete-before support in data removal tool (#1649)
Fixes #1650
2024-09-25 14:44:01 +01:00
Peter Chen
9cb1e06c8e fix: Upgrade Cassandra driver (#1646)
Fixes #1296
2024-09-25 14:44:01 +01:00
Peter Chen
423244eb4b fix: pre-push tag (#1614)
Fixes an issue where git was verifying the incorrect tag.
2024-09-25 14:44:01 +01:00
cyan317
7aaba1cbad fix: no restriction on type field (#1644)
'type' should not matter if 'full' or 'accounts' is false. Relaxes the restriction on 'type'.
2024-09-25 14:44:00 +01:00
cyan317
b7c50fd73d fix: Add more restrictions to admin fields (#1643) 2024-09-25 14:44:00 +01:00
dependabot[bot]
442ee874d5 ci: Bump peter-evans/create-pull-request from 6 to 7 (#1636)
Bumps
[peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request)
from 6 to 7.

Release notes for Create Pull Request v7.0.0, sourced from [peter-evans/create-pull-request's releases](https://github.com/peter-evans/create-pull-request/releases):

Now supports commit signing with bot-generated tokens! See "What's new" below.

Behaviour changes:
- Action input `git-token` has been renamed `branch-token`, to be more clear about its purpose. The `branch-token` is the token that the action will use to create and update the branch.
- The action now handles requests that have been rate-limited by GitHub. Requests hitting a primary rate limit will retry twice, for a total of three attempts. Requests hitting a secondary rate limit will not be retried.
- The `pull-request-operation` output now returns `none` when no operation was executed.
- Removed deprecated output environment variable `PULL_REQUEST_NUMBER`. Please use the `pull-request-number` action output instead.

What's new:
- The action can now sign commits as `github-actions[bot]` when using `GITHUB_TOKEN`, or your own bot when using [GitHub App tokens](https://github.com/peter-evans/create-pull-request/blob/HEAD/docs/concepts-guidelines.md#authenticating-with-github-app-generated-tokens). See [commit signing](https://github.com/peter-evans/create-pull-request/blob/HEAD/docs/concepts-guidelines.md#commit-signature-verification-for-bots) for details.
- Action input `draft` now accepts a new value `always-true`. This will set the pull request to draft status when the pull request is updated, as well as on creation.
- A new action input `maintainer-can-modify` indicates whether [maintainers can modify](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/allowing-changes-to-a-pull-request-branch-created-from-a-fork) the pull request. The default is `true`, which retains the existing behaviour of the action.
- A new output `pull-request-commits-verified` returns `true` or `false`, indicating whether GitHub considers the signature of the branch's commits to be verified.

Full Changelog: [v6.1.0...v7.0.0](https://github.com/peter-evans/create-pull-request/compare/v6.1.0...v7.0.0)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-25 14:44:00 +01:00
cyan317
0679034978 fix: Don't forward ledger API if 'full' is a string (#1640)
Fix #1635
2024-09-25 14:43:59 +01:00
github-actions[bot]
b41ea34212 style: clang-tidy auto fixes (#1639)
Fixes #1638. Please review and commit clang-tidy fixes.

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
2024-09-25 14:43:59 +01:00
Sergey Kuznetsov
4e147deafa fix: Subscription source bugs fix (#1626) (#1633)
Fixes #1620.
Cherry pick of #1626 into develop.

- Add timeouts for websocket operations on connections to rippled. Without these timeouts, if a connection hangs for some reason, Clio would not know it is hanging (a standalone sketch of the timeout idea follows this entry).
- Fix a potential data race when choosing the new subscription source that forwards messages to users.
- Optimise switching between subscription sources.
2024-09-25 14:43:59 +01:00
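A minimal standalone sketch of the timeout idea from #1633 above, using only the standard library. It is not the actual boost-based implementation in Clio; the "read" is simulated with a sleep.

```cpp
#include <chrono>
#include <future>
#include <iostream>
#include <string>
#include <thread>

using namespace std::chrono_literals;

// Simulated websocket read that takes 'workDuration' to produce a message.
std::string simulatedRead(std::chrono::milliseconds workDuration)
{
    std::this_thread::sleep_for(workDuration);
    return "ledger update";
}

int main()
{
    auto const timeout = 100ms;

    // Launch the "read" asynchronously and give it a deadline.
    auto future = std::async(std::launch::async, simulatedRead, 500ms);

    if (future.wait_for(timeout) == std::future_status::timeout) {
        // Without a timeout the caller would block forever on a hung connection;
        // with one, it can close and switch to another subscription source.
        std::cout << "read timed out: treating connection as hung\n";
    } else {
        std::cout << "received: " << future.get() << '\n';
    }
    // The future's destructor waits for the simulated read to finish (std::async
    // policy), so main exits cleanly after the background task completes.
}
```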
Sergey Kuznetsov
b08447e8e0 fix: Fix logging in SubscriptionSource (#1617) (#1632)
Fixes #1616. 
Cherry pick of #1617 into develop.
2024-09-25 14:43:59 +01:00
cyan317
9432165ace refactor: Remove SubscriptionManagerRunner (#1623) 2024-09-25 14:43:58 +01:00
github-actions[bot]
3b6a87249c style: clang-tidy auto fixes (#1631)
Fixes #1630. Please review and commit clang-tidy fixes.

Co-authored-by: kuznetsss <15742918+kuznetsss@users.noreply.github.com>
2024-09-25 14:43:58 +01:00
Sergey Kuznetsov
b7449f72b7 test: Add test for WsConnection for ping response (#1619) 2024-09-25 14:43:58 +01:00
cyan317
443c74436e fix: not forward admin API (#1628) 2024-09-25 14:43:57 +01:00
Peter Chen
7b5e02731d fix: AccountNFT with invalid marker (#1589)
Fixes [#1497](https://github.com/XRPLF/clio/issues/1497)
Mimics the behavior of the [fix on Rippled
side](https://github.com/XRPLF/rippled/pull/5045)
2024-08-27 15:38:19 -04:00
761 changed files with 17517 additions and 49815 deletions

View File

@@ -1,5 +1,5 @@
---
Language: Cpp
Language: Cpp
AccessModifierOffset: -4
AlignAfterOpenBracket: BlockIndent
AlignConsecutiveAssignments: false
@@ -22,31 +22,31 @@ BreakBeforeBinaryOperators: false
BreakBeforeBraces: WebKit
BreakBeforeTernaryOperators: true
BreakConstructorInitializersBeforeComma: true
ColumnLimit: 120
CommentPragmas: "^ IWYU pragma:"
ColumnLimit: 120
CommentPragmas: '^ IWYU pragma:'
ConstructorInitializerAllOnOneLineOrOnePerLine: true
ConstructorInitializerIndentWidth: 4
ContinuationIndentWidth: 4
Cpp11BracedListStyle: true
DerivePointerAlignment: false
DisableFormat: false
DisableFormat: false
ExperimentalAutoDetectBinPacking: false
FixNamespaceComments: true
ForEachMacros: [Q_FOREACH, BOOST_FOREACH]
ForEachMacros: [ Q_FOREACH, BOOST_FOREACH ]
IncludeBlocks: Regroup
IncludeCategories:
- Regex: '^".*"$'
Priority: 1
- Regex: '^<.*\.(h|hpp)>$'
Priority: 2
- Regex: "^<.*>$"
Priority: 3
- Regex: ".*"
Priority: 4
IncludeIsMainRegex: "$"
- Regex: '^".*"$'
Priority: 1
- Regex: '^<.*\.(h|hpp)>$'
Priority: 2
- Regex: '^<.*>$'
Priority: 3
- Regex: '.*'
Priority: 4
IncludeIsMainRegex: '$'
IndentCaseLabels: true
IndentFunctionDeclarationAfterType: false
IndentWidth: 4
IndentWidth: 4
IndentWrappedFunctionNames: false
IndentRequiresClause: true
RequiresClausePosition: OwnLine
@@ -63,18 +63,18 @@ PenaltyExcessCharacter: 1000000
PenaltyReturnTypeOnItsOwnLine: 200
PointerAlignment: Left
QualifierAlignment: Right
ReflowComments: true
SortIncludes: true
ReflowComments: true
SortIncludes: true
SpaceAfterCStyleCast: false
SpaceBeforeAssignmentOperators: true
SpaceBeforeParens: ControlStatements
SpaceInEmptyParentheses: false
SpacesBeforeTrailingComments: 2
SpacesInAngles: false
SpacesInAngles: false
SpacesInContainerLiterals: true
SpacesInCStyleCastParentheses: false
SpacesInParentheses: false
SpacesInSquareBrackets: false
Standard: Cpp11
TabWidth: 8
UseTab: Never
Standard: Cpp11
TabWidth: 8
UseTab: Never

View File

@@ -1,5 +1,5 @@
---
Checks: "-*,
Checks: '-*,
bugprone-argument-comment,
bugprone-assert-side-effect,
bugprone-bad-signal-to-kill-thread,
@@ -8,7 +8,6 @@ Checks: "-*,
bugprone-chained-comparison,
bugprone-compare-pointer-to-member-virtual-function,
bugprone-copy-constructor-init,
bugprone-crtp-constructor-accessibility,
bugprone-dangling-handle,
bugprone-dynamic-static-initializers,
bugprone-empty-catch,
@@ -34,11 +33,9 @@ Checks: "-*,
bugprone-non-zero-enum-to-bool-conversion,
bugprone-optional-value-conversion,
bugprone-parent-virtual-call,
bugprone-pointer-arithmetic-on-polymorphic-object,
bugprone-posix-return,
bugprone-redundant-branch-condition,
bugprone-reserved-identifier,
bugprone-return-const-ref-from-parameter,
bugprone-shared-ptr-array-mismatch,
bugprone-signal-handler,
bugprone-signed-char-misuse,
@@ -58,7 +55,6 @@ Checks: "-*,
bugprone-suspicious-realloc-usage,
bugprone-suspicious-semicolon,
bugprone-suspicious-string-compare,
bugprone-suspicious-stringview-data-usage,
bugprone-swapped-arguments,
bugprone-switch-missing-default-case,
bugprone-terminating-continue,
@@ -101,12 +97,10 @@ Checks: "-*,
modernize-make-unique,
modernize-pass-by-value,
modernize-type-traits,
modernize-use-designated-initializers,
modernize-use-emplace,
modernize-use-equals-default,
modernize-use-equals-delete,
modernize-use-override,
modernize-use-ranges,
modernize-use-starts-ends-with,
modernize-use-std-numbers,
modernize-use-using,
@@ -127,12 +121,9 @@ Checks: "-*,
readability-convert-member-functions-to-static,
readability-duplicate-include,
readability-else-after-return,
readability-enum-initial-value,
readability-implicit-bool-conversion,
readability-inconsistent-declaration-parameter-name,
readability-identifier-naming,
readability-make-member-function-const,
readability-math-missing-parentheses,
readability-misleading-indentation,
readability-non-const-parameter,
readability-redundant-casting,
@@ -144,48 +135,14 @@ Checks: "-*,
readability-simplify-boolean-expr,
readability-static-accessed-through-instance,
readability-static-definition-in-anonymous-namespace,
readability-suspicious-call-argument,
readability-use-std-min-max
"
readability-suspicious-call-argument
'
CheckOptions:
readability-braces-around-statements.ShortStatementLines: 2
readability-identifier-naming.MacroDefinitionCase: UPPER_CASE
readability-identifier-naming.ClassCase: CamelCase
readability-identifier-naming.StructCase: CamelCase
readability-identifier-naming.UnionCase: CamelCase
readability-identifier-naming.EnumCase: CamelCase
readability-identifier-naming.EnumConstantCase: CamelCase
readability-identifier-naming.ScopedEnumConstantCase: CamelCase
readability-identifier-naming.GlobalConstantCase: UPPER_CASE
readability-identifier-naming.GlobalConstantPrefix: "k"
readability-identifier-naming.GlobalVariableCase: CamelCase
readability-identifier-naming.GlobalVariablePrefix: "g"
readability-identifier-naming.ConstexprFunctionCase: camelBack
readability-identifier-naming.ConstexprMethodCase: camelBack
readability-identifier-naming.ClassMethodCase: camelBack
readability-identifier-naming.ClassMemberCase: camelBack
readability-identifier-naming.ClassConstantCase: UPPER_CASE
readability-identifier-naming.ClassConstantPrefix: "k"
readability-identifier-naming.StaticConstantCase: UPPER_CASE
readability-identifier-naming.StaticConstantPrefix: "k"
readability-identifier-naming.StaticVariableCase: UPPER_CASE
readability-identifier-naming.StaticVariablePrefix: "k"
readability-identifier-naming.ConstexprVariableCase: UPPER_CASE
readability-identifier-naming.ConstexprVariablePrefix: "k"
readability-identifier-naming.LocalConstantCase: camelBack
readability-identifier-naming.LocalVariableCase: camelBack
readability-identifier-naming.TemplateParameterCase: CamelCase
readability-identifier-naming.ParameterCase: camelBack
readability-identifier-naming.FunctionCase: camelBack
readability-identifier-naming.MemberCase: camelBack
readability-identifier-naming.PrivateMemberSuffix: _
readability-identifier-naming.ProtectedMemberSuffix: _
readability-identifier-naming.PublicMemberSuffix: ""
readability-identifier-naming.FunctionIgnoredRegexp: ".*tag_invoke.*"
bugprone-unsafe-functions.ReportMoreUnsafeFunctions: true
bugprone-unused-return-value.CheckedReturnTypes: ::std::error_code;::std::error_condition;::std::errc
misc-include-cleaner.IgnoreHeaders: '.*/(detail|impl)/.*;.*(expected|unexpected).*;.*ranges_lower_bound\.h;time.h;stdlib.h;__chrono/.*;fmt/chrono.h;boost/uuid/uuid_hash.hpp'
misc-include-cleaner.IgnoreHeaders: '.*/(detail|impl)/.*;.*(expected|unexpected).*;.*ranges_lower_bound\.h;time.h;stdlib.h'
HeaderFilterRegex: '^.*/(src|tests)/.*\.(h|hpp)$'
WarningsAsErrors: "*"
WarningsAsErrors: '*'

View File

@@ -1,222 +1,222 @@
_help_parse: Options affecting listfile parsing
parse:
_help_additional_commands:
- Specify structure for custom cmake functions
- Specify structure for custom cmake functions
additional_commands:
foo:
flags:
- BAR
- BAZ
- BAR
- BAZ
kwargs:
HEADERS: "*"
SOURCES: "*"
DEPENDS: "*"
HEADERS: '*'
SOURCES: '*'
DEPENDS: '*'
_help_override_spec:
- Override configurations per-command where available
- Override configurations per-command where available
override_spec: {}
_help_vartags:
- Specify variable tags.
- Specify variable tags.
vartags: []
_help_proptags:
- Specify property tags.
- Specify property tags.
proptags: []
_help_format: Options affecting formatting.
format:
_help_disable:
- Disable formatting entirely, making cmake-format a no-op
- Disable formatting entirely, making cmake-format a no-op
disable: false
_help_line_width:
- How wide to allow formatted cmake files
- How wide to allow formatted cmake files
line_width: 120
_help_tab_size:
- How many spaces to tab for indent
- How many spaces to tab for indent
tab_size: 2
_help_use_tabchars:
- If true, lines are indented using tab characters (utf-8
- 0x09) instead of <tab_size> space characters (utf-8 0x20).
- In cases where the layout would require a fractional tab
- character, the behavior of the fractional indentation is
- governed by <fractional_tab_policy>
- If true, lines are indented using tab characters (utf-8
- 0x09) instead of <tab_size> space characters (utf-8 0x20).
- In cases where the layout would require a fractional tab
- character, the behavior of the fractional indentation is
- governed by <fractional_tab_policy>
use_tabchars: false
_help_fractional_tab_policy:
- If <use_tabchars> is True, then the value of this variable
- indicates how fractional indentions are handled during
- whitespace replacement. If set to 'use-space', fractional
- indentation is left as spaces (utf-8 0x20). If set to
- "`round-up` fractional indentation is replaced with a single"
- tab character (utf-8 0x09) effectively shifting the column
- to the next tabstop
- If <use_tabchars> is True, then the value of this variable
- indicates how fractional indentions are handled during
- whitespace replacement. If set to 'use-space', fractional
- indentation is left as spaces (utf-8 0x20). If set to
- '`round-up` fractional indentation is replaced with a single'
- tab character (utf-8 0x09) effectively shifting the column
- to the next tabstop
fractional_tab_policy: use-space
_help_max_subgroups_hwrap:
- If an argument group contains more than this many sub-groups
- (parg or kwarg groups) then force it to a vertical layout.
- If an argument group contains more than this many sub-groups
- (parg or kwarg groups) then force it to a vertical layout.
max_subgroups_hwrap: 4
_help_max_pargs_hwrap:
- If a positional argument group contains more than this many
- arguments, then force it to a vertical layout.
- If a positional argument group contains more than this many
- arguments, then force it to a vertical layout.
max_pargs_hwrap: 6
_help_max_rows_cmdline:
- If a cmdline positional group consumes more than this many
- lines without nesting, then invalidate the layout (and nest)
- If a cmdline positional group consumes more than this many
- lines without nesting, then invalidate the layout (and nest)
max_rows_cmdline: 2
_help_separate_ctrl_name_with_space:
- If true, separate flow control names from their parentheses
- with a space
- If true, separate flow control names from their parentheses
- with a space
separate_ctrl_name_with_space: true
_help_separate_fn_name_with_space:
- If true, separate function names from parentheses with a
- space
- If true, separate function names from parentheses with a
- space
separate_fn_name_with_space: false
_help_dangle_parens:
- If a statement is wrapped to more than one line, than dangle
- the closing parenthesis on its own line.
- If a statement is wrapped to more than one line, than dangle
- the closing parenthesis on its own line.
dangle_parens: true
_help_dangle_align:
- If the trailing parenthesis must be 'dangled' on its on
- "line, then align it to this reference: `prefix`: the start"
- "of the statement, `prefix-indent`: the start of the"
- "statement, plus one indentation level, `child`: align to"
- the column of the arguments
- If the trailing parenthesis must be 'dangled' on its on
- 'line, then align it to this reference: `prefix`: the start'
- 'of the statement, `prefix-indent`: the start of the'
- 'statement, plus one indentation level, `child`: align to'
- the column of the arguments
dangle_align: prefix
_help_min_prefix_chars:
- If the statement spelling length (including space and
- parenthesis) is smaller than this amount, then force reject
- nested layouts.
- If the statement spelling length (including space and
- parenthesis) is smaller than this amount, then force reject
- nested layouts.
min_prefix_chars: 4
_help_max_prefix_chars:
- If the statement spelling length (including space and
- parenthesis) is larger than the tab width by more than this
- amount, then force reject un-nested layouts.
- If the statement spelling length (including space and
- parenthesis) is larger than the tab width by more than this
- amount, then force reject un-nested layouts.
max_prefix_chars: 10
_help_max_lines_hwrap:
- If a candidate layout is wrapped horizontally but it exceeds
- this many lines, then reject the layout.
- If a candidate layout is wrapped horizontally but it exceeds
- this many lines, then reject the layout.
max_lines_hwrap: 2
_help_line_ending:
- What style line endings to use in the output.
- What style line endings to use in the output.
line_ending: unix
_help_command_case:
- Format command names consistently as 'lower' or 'upper' case
- Format command names consistently as 'lower' or 'upper' case
command_case: canonical
_help_keyword_case:
- Format keywords consistently as 'lower' or 'upper' case
- Format keywords consistently as 'lower' or 'upper' case
keyword_case: unchanged
_help_always_wrap:
- A list of command names which should always be wrapped
- A list of command names which should always be wrapped
always_wrap: []
_help_enable_sort:
- If true, the argument lists which are known to be sortable
- will be sorted lexicographicall
- If true, the argument lists which are known to be sortable
- will be sorted lexicographicall
enable_sort: true
_help_autosort:
- If true, the parsers may infer whether or not an argument
- list is sortable (without annotation).
- If true, the parsers may infer whether or not an argument
- list is sortable (without annotation).
autosort: true
_help_require_valid_layout:
- By default, if cmake-format cannot successfully fit
- everything into the desired linewidth it will apply the
- last, most aggressive attempt that it made. If this flag is
- True, however, cmake-format will print error, exit with non-
- zero status code, and write-out nothing
- By default, if cmake-format cannot successfully fit
- everything into the desired linewidth it will apply the
- last, most agressive attempt that it made. If this flag is
- True, however, cmake-format will print error, exit with non-
- zero status code, and write-out nothing
require_valid_layout: false
_help_layout_passes:
- A dictionary mapping layout nodes to a list of wrap
- decisions. See the documentation for more information.
- A dictionary mapping layout nodes to a list of wrap
- decisions. See the documentation for more information.
layout_passes: {}
_help_markup: Options affecting comment reflow and formatting.
markup:
_help_bullet_char:
- What character to use for bulleted lists
bullet_char: "*"
- What character to use for bulleted lists
bullet_char: '*'
_help_enum_char:
- What character to use as punctuation after numerals in an
- enumerated list
- What character to use as punctuation after numerals in an
- enumerated list
enum_char: .
_help_first_comment_is_literal:
- If comment markup is enabled, don't reflow the first comment
- block in each listfile. Use this to preserve formatting of
- your copyright/license statements.
- If comment markup is enabled, don't reflow the first comment
- block in each listfile. Use this to preserve formatting of
- your copyright/license statements.
first_comment_is_literal: false
_help_literal_comment_pattern:
- If comment markup is enabled, don't reflow any comment block
- which matches this (regex) pattern. Default is `None`
- (disabled).
- If comment markup is enabled, don't reflow any comment block
- which matches this (regex) pattern. Default is `None`
- (disabled).
literal_comment_pattern: null
_help_fence_pattern:
- Regular expression to match preformat fences in comments
- default= ``r'^\s*([`~]{3}[`~]*)(.*)$'``
- Regular expression to match preformat fences in comments
- default= ``r'^\s*([`~]{3}[`~]*)(.*)$'``
fence_pattern: ^\s*([`~]{3}[`~]*)(.*)$
_help_ruler_pattern:
- Regular expression to match rulers in comments default=
- '``r''^\s*[^\w\s]{3}.*[^\w\s]{3}$''``'
- Regular expression to match rulers in comments default=
- '``r''^\s*[^\w\s]{3}.*[^\w\s]{3}$''``'
ruler_pattern: ^\s*[^\w\s]{3}.*[^\w\s]{3}$
_help_explicit_trailing_pattern:
- If a comment line matches starts with this pattern then it
- is explicitly a trailing comment for the preceding
- argument. Default is '#<'
explicit_trailing_pattern: "#<"
- If a comment line matches starts with this pattern then it
- is explicitly a trailing comment for the preceeding
- argument. Default is '#<'
explicit_trailing_pattern: '#<'
_help_hashruler_min_length:
- If a comment line starts with at least this many consecutive
- hash characters, then don't lstrip() them off. This allows
- for lazy hash rulers where the first hash char is not
- separated by space
- If a comment line starts with at least this many consecutive
- hash characters, then don't lstrip() them off. This allows
- for lazy hash rulers where the first hash char is not
- separated by space
hashruler_min_length: 10
_help_canonicalize_hashrulers:
- If true, then insert a space between the first hash char and
- remaining hash chars in a hash ruler, and normalize its
- length to fill the column
- If true, then insert a space between the first hash char and
- remaining hash chars in a hash ruler, and normalize its
- length to fill the column
canonicalize_hashrulers: true
_help_enable_markup:
- enable comment markup parsing and reflow
- enable comment markup parsing and reflow
enable_markup: true
_help_lint: Options affecting the linter
lint:
_help_disabled_codes:
- a list of lint codes to disable
- a list of lint codes to disable
disabled_codes: []
_help_function_pattern:
- regular expression pattern describing valid function names
function_pattern: "[0-9a-z_]+"
- regular expression pattern describing valid function names
function_pattern: '[0-9a-z_]+'
_help_macro_pattern:
- regular expression pattern describing valid macro names
macro_pattern: "[0-9A-Z_]+"
- regular expression pattern describing valid macro names
macro_pattern: '[0-9A-Z_]+'
_help_global_var_pattern:
- regular expression pattern describing valid names for
- variables with global (cache) scope
global_var_pattern: "[A-Z][0-9A-Z_]+"
- regular expression pattern describing valid names for
- variables with global (cache) scope
global_var_pattern: '[A-Z][0-9A-Z_]+'
_help_internal_var_pattern:
- regular expression pattern describing valid names for
- variables with global scope (but internal semantic)
- regular expression pattern describing valid names for
- variables with global scope (but internal semantic)
internal_var_pattern: _[A-Z][0-9A-Z_]+
_help_local_var_pattern:
- regular expression pattern describing valid names for
- variables with local scope
local_var_pattern: "[a-z][a-z0-9_]+"
- regular expression pattern describing valid names for
- variables with local scope
local_var_pattern: '[a-z][a-z0-9_]+'
_help_private_var_pattern:
- regular expression pattern describing valid names for
- privatedirectory variables
- regular expression pattern describing valid names for
- privatedirectory variables
private_var_pattern: _[0-9a-z_]+
_help_public_var_pattern:
- regular expression pattern describing valid names for public
- directory variables
public_var_pattern: "[A-Z][0-9A-Z_]+"
- regular expression pattern describing valid names for public
- directory variables
public_var_pattern: '[A-Z][0-9A-Z_]+'
_help_argument_var_pattern:
- regular expression pattern describing valid names for
- function/macro arguments and loop variables.
argument_var_pattern: "[a-z][a-z0-9_]+"
- regular expression pattern describing valid names for
- function/macro arguments and loop variables.
argument_var_pattern: '[a-z][a-z0-9_]+'
_help_keyword_pattern:
- regular expression pattern describing valid names for
- keywords used in functions or macros
keyword_pattern: "[A-Z][0-9A-Z_]+"
- regular expression pattern describing valid names for
- keywords used in functions or macros
keyword_pattern: '[A-Z][0-9A-Z_]+'
_help_max_conditionals_custom_parser:
- In the heuristic for C0201, how many conditionals to match
- within a loop in before considering the loop a parser.
- In the heuristic for C0201, how many conditionals to match
- within a loop in before considering the loop a parser.
max_conditionals_custom_parser: 2
_help_min_statement_spacing:
- Require at least this many newlines between statements
- Require at least this many newlines between statements
min_statement_spacing: 1
_help_max_statement_spacing:
- Require no more than this many newlines between statements
- Require no more than this many newlines between statements
max_statement_spacing: 2
max_returns: 6
max_branches: 12
@@ -226,20 +226,20 @@ lint:
_help_encode: Options affecting file encoding
encode:
_help_emit_byteorder_mark:
- If true, emit the unicode byte-order mark (BOM) at the start
- of the file
- If true, emit the unicode byte-order mark (BOM) at the start
- of the file
emit_byteorder_mark: false
_help_input_encoding:
- Specify the encoding of the input file. Defaults to utf-8
- Specify the encoding of the input file. Defaults to utf-8
input_encoding: utf-8
_help_output_encoding:
- Specify the encoding of the output file. Defaults to utf-8.
- Note that cmake only claims to support utf-8 so be careful
- when using anything else
- Specify the encoding of the output file. Defaults to utf-8.
- Note that cmake only claims to support utf-8 so be careful
- when using anything else
output_encoding: utf-8
_help_misc: Miscellaneous configurations options.
misc:
_help_per_command:
- A dictionary containing any per-command configuration
- overrides. Currently only `command_case` is supported.
- A dictionary containing any per-command configuration
- overrides. Currently only `command_case` is supported.
per_command: {}

View File

@@ -9,14 +9,3 @@ coverage:
default:
target: 20% # Need to bump this number https://docs.codecov.com/docs/commit-status#patch-status
threshold: 2%
# `codecov/codecov-action` reruns `gcovr` if build files present
# That's why we run it in a separate workflow
# This ignore list is not currently used
#
# More info: https://github.com/XRPLF/clio/pull/2066
ignore:
- "tests"
- "src/data/cassandra/"
- "src/data/CassandraBackend.hpp"
- "src/data/BackendFactory.*"

View File

@@ -2,7 +2,7 @@
# Note: This script is intended to be run from the root of the repository.
#
# Not really a hook but should be used to check the completeness of documentation for added code, otherwise CI will come for you.
# Not really a hook but should be used to check the completness of documentation for added code, otherwise CI will come for you.
# It's good to have /tmp as the output so that consecutive runs are fast but no clutter in the repository.
echo "+ Checking documentation..."
@@ -15,18 +15,12 @@ DOCDIR=${TMPDIR}/out
# Check doxygen is at all installed
if [ -z "$DOXYGEN" ]; then
if [[ "${CI}" == "true" ]]; then
# If we are in CI, we should fail the check
echo "doxygen not found in CI, please install it"
exit 1
fi
# No hard error if doxygen is not installed yet
cat <<EOF
WARNING
-----------------------------------------------------------------------------
'doxygen' is required to check documentation.
'doxygen' is required to check documentation.
Please install it for next time.
Your changes may fail to pass CI once pushed.

119
.githooks/check-format Executable file
View File

@@ -0,0 +1,119 @@
#!/bin/bash
# Note: This script is intended to be run from the root of the repository.
#
# This script checks the format of the code and cmake files.
# In many cases it will automatically fix the issues and abort the commit.
no_formatted_directories_staged() {
staged_directories=$(git diff-index --cached --name-only HEAD | awk -F/ '{print $1}')
for sd in $staged_directories; do
if [[ "$sd" =~ ^(benchmark|cmake|src|tests)$ ]]; then
return 1
fi
done
return 0
}
if no_formatted_directories_staged ; then
exit 0
fi
echo "+ Checking code format..."
# paths to check and re-format
sources="src tests"
formatter="clang-format -i"
version=$($formatter --version | grep -o '[0-9\.]*')
if [[ "19.0.0" > "$version" ]]; then
cat <<EOF
ERROR
-----------------------------------------------------------------------------
A minimum of version 19 of `which clang-format` is required.
Your version is $version.
Please fix paths and run again.
-----------------------------------------------------------------------------
EOF
exit 3
fi
# check there is no .h headers, only .hpp
wrong_headers=$(find $sources -name "*.h" | sed 's/^/ - /')
if [[ ! -z "$wrong_headers" ]]; then
cat <<EOF
ERROR
-----------------------------------------------------------------------------
Found .h headers in the source code. Please rename them to .hpp:
$wrong_headers
-----------------------------------------------------------------------------
EOF
exit 2
fi
if ! command -v cmake-format &> /dev/null; then
cat <<EOF
ERROR
-----------------------------------------------------------------------------
'cmake-format' is required to run this script.
Please install it and run again.
-----------------------------------------------------------------------------
EOF
exit 3
fi
function grep_code {
grep -l "${1}" ${sources} -r --include \*.hpp --include \*.cpp
}
GNU_SED=$(sed --version 2>&1 | grep -q 'GNU' && echo true || echo false)
if [[ "$GNU_SED" == "false" ]]; then # macOS sed
# make all includes to be <...> style
grep_code '#include ".*"' | xargs sed -i '' -E 's|#include "(.*)"|#include <\1>|g'
# make local includes to be "..." style
main_src_dirs=$(find ./src -maxdepth 1 -type d -exec basename {} \; | tr '\n' '|' | sed 's/|$//' | sed 's/|/\\|/g')
grep_code "#include <\($main_src_dirs\)/.*>" | xargs sed -i '' -E "s|#include <(($main_src_dirs)/.*)>|#include \"\1\"|g"
else
# make all includes to be <...> style
grep_code '#include ".*"' | xargs sed -i -E 's|#include "(.*)"|#include <\1>|g'
# make local includes to be "..." style
main_src_dirs=$(find ./src -maxdepth 1 -type d -exec basename {} \; | paste -sd '|' | sed 's/|/\\|/g')
grep_code "#include <\($main_src_dirs\)/.*>" | xargs sed -i -E "s|#include <(($main_src_dirs)/.*)>|#include \"\1\"|g"
fi
cmake_dirs=$(echo cmake $sources)
cmake_files=$(find $cmake_dirs -type f \( -name "CMakeLists.txt" -o -name "*.cmake" \))
cmake_files=$(echo $cmake_files ./CMakeLists.txt)
first=$(git diff $sources $cmake_files)
find $sources -type f \( -name '*.cpp' -o -name '*.hpp' -o -name '*.ipp' \) -print0 | xargs -0 $formatter
cmake-format -i $cmake_files
second=$(git diff $sources $cmake_files)
changes=$(diff <(echo "$first") <(echo "$second"))
changes_number=$(echo -n "$changes" | wc -l | sed -e 's/^[[:space:]]*//')
if [ "$changes_number" != "0" ]; then
cat <<\EOF
WARNING
-----------------------------------------------------------------------------
Automatically re-formatted code with 'clang-format' - commit was aborted.
Please manually add any updated files and commit again.
-----------------------------------------------------------------------------
EOF
if [[ "$1" == "--diff" ]]; then
echo "$changes"
fi
exit 1
fi

7
.githooks/pre-commit Executable file
View File

@@ -0,0 +1,7 @@
#!/bin/bash
# This script is intended to be run from the root of the repository.
source .githooks/check-format
source .githooks/check-docs

View File

@@ -52,3 +52,7 @@ while read local_ref local_oid remote_ref remote_oid; do
fi
fi
done
command -v git-lfs >/dev/null 2>&1 || { echo >&2 "\nThis repository is configured for Git LFS but 'git-lfs' was not found on your path. If you no longer wish to use Git LFS, remove this hook by deleting the 'pre-push' file in the hooks directory (set by 'core.hookspath'; usually '.git/hooks').\n"; exit 2; }
git lfs pre-push "$@"
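For local development, the hooks above can be wired up and exercised by hand; a minimal sketch, assuming the repository's .githooks directory is used as the hook path (the same core.hookspath setting the pre-push message refers to):

# one-time per clone: point git at the repository's hook directory
git config core.hooksPath .githooks
# run the format check manually; --diff prints what clang-format/cmake-format would change
./.githooks/check-format --diff
# run the documentation check manually
./.githooks/check-docs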

View File

@@ -3,34 +3,29 @@ name: Bug report
about: Create a report to help us improve
title: "[Title with short description] (Version: [Clio version])"
labels: bug
assignees: ""
assignees: ''
---
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Kindly refrain from posting any credentials or sensitive information in this issue -->
## Issue Description
<!-- Provide a summary for your issue/bug. -->
## Steps to Reproduce
<!-- List in detail the exact steps to reproduce the unexpected behavior of the software. -->
## Expected Result
<!-- Explain in detail what behavior you expected to happen. -->
## Actual Result
<!-- Explain in detail what behavior actually happened. -->
## Environment
<!-- Please describe your environment setup (such as Ubuntu 20.04.2 with Boost 1.82). -->
<!-- Please use the version returned by './clio_server --version' as the version number -->
## Supporting Files
<!-- If you have supporting files such as a log, feel free to post a link here using Github Gist. -->
<!-- Consider adding configuration files with private information removed via Github Gist. -->

View File

@@ -3,24 +3,21 @@ name: Feature request
about: Suggest an idea for this project
title: "[Title with short description] (Version: [Clio version])"
labels: enhancement
assignees: ""
assignees: ''
---
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Kindly refrain from posting any credentials or sensitive information in this issue -->
## Summary
<!-- Provide a summary to the feature request -->
## Motivation
<!-- Why do we need this feature? -->
## Solution
<!-- What is the solution? -->
## Paths Not Taken
<!-- What other alternatives have been considered? -->

View File

@@ -3,7 +3,8 @@ name: Question
about: A question in the form of an issue
title: "[Title with short description] (Version: Clio version)"
labels: question
assignees: ""
assignees: ''
---
<!-- Please search existing issues to avoid creating duplicates. -->
@@ -11,9 +12,7 @@ assignees: ""
<!-- Kindly refrain from posting any credentials or sensitive information in this issue -->
## Question
<!-- Your question -->
## Paths Not Taken
<!-- If applicable, what other alternatives have been considered? -->

View File

@@ -1,29 +1,18 @@
name: Build clio
description: Build clio in build directory
inputs:
targets:
description: Space-separated build target names
target:
description: Build target name
default: all
subtract_threads:
description: An option for the action get_number_of_threads. See get_number_of_threads
required: true
default: "0"
runs:
using: composite
steps:
- name: Get number of threads
uses: ./.github/actions/get_number_of_threads
id: number_of_threads
with:
subtract_threads: ${{ inputs.subtract_threads }}
- name: Build targets
- name: Build Clio
shell: bash
run: |
cd build
cmake \
--build . \
--parallel ${{ steps.number_of_threads.outputs.threads_number }} \
--target ${{ inputs.targets }}
cmake --build . --parallel ${{ steps.number_of_threads.outputs.threads_number }} --target ${{ inputs.target }}

View File

@@ -1,12 +1,8 @@
name: Build and push Docker image
description: Build and push Docker image to DockerHub and GitHub Container Registry
inputs:
images:
description: Name of the images to use as a base name
required: true
dockerhub_repo:
description: DockerHub repository name
image_name:
description: Name of the image to build
required: true
push_image:
description: Whether to push the image to the registry (true/false)
@@ -23,50 +19,48 @@ inputs:
description:
description: Short description of the image
required: true
runs:
using: composite
steps:
- name: Login to DockerHub
if: ${{ inputs.push_image == 'true' }}
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_PW }}
- name: Login to DockerHub
if: ${{ inputs.push_image == 'true' }}
uses: docker/login-action@v3
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_PW }}
- name: Login to GitHub Container Registry
if: ${{ inputs.push_image == 'true' }}
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ env.GITHUB_TOKEN }}
- name: Login to GitHub Container Registry
if: ${{ inputs.push_image == 'true' }}
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ env.GITHUB_TOKEN }}
- uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392 # v3.6.0
with:
cache-image: false
- uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- uses: docker/setup-qemu-action@v3
- uses: docker/setup-buildx-action@v3
- uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804 # v5.7.0
id: meta
with:
images: ${{ inputs.images }}
tags: ${{ inputs.tags }}
- uses: docker/metadata-action@v5
id: meta
with:
images: ${{ inputs.image_name }}
tags: ${{ inputs.tags }}
- name: Build and push
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ inputs.directory }}
platforms: ${{ inputs.platforms }}
push: ${{ inputs.push_image == 'true' }}
tags: ${{ steps.meta.outputs.tags }}
- name: Build and push
uses: docker/build-push-action@v5
with:
context: ${{ inputs.directory }}
platforms: ${{ inputs.platforms }}
push: ${{ inputs.push_image == 'true' }}
tags: ${{ steps.meta.outputs.tags }}
- name: Update DockerHub description
if: ${{ inputs.push_image == 'true' }}
uses: peter-evans/dockerhub-description@v4
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_PW }}
repository: ${{ inputs.image_name }}
short-description: ${{ inputs.description }}
readme-filepath: ${{ inputs.directory }}/README.md
- name: Update DockerHub description
if: ${{ inputs.push_image == 'true' }}
uses: peter-evans/dockerhub-description@432a30c9e07499fd01da9f8a49f0faf9e0ca5b77 # v4.0.2
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_PW }}
repository: ${{ inputs.dockerhub_repo }}
short-description: ${{ inputs.description }}
readme-filepath: ${{ inputs.directory }}/README.md

View File

@@ -1,26 +1,21 @@
name: Generate code coverage report
description: Run tests, generate code coverage report and upload it to codecov.io
runs:
using: composite
steps:
- name: Run tests
shell: bash
run: |
build/clio_tests
# Please keep exclude list in sync with .codecov.yml
- name: Run gcovr
shell: bash
run: |
gcovr \
-e tests \
gcovr -e tests \
-e src/data/cassandra \
-e src/data/CassandraBackend.hpp \
-e 'src/data/BackendFactory.*' \
--xml build/coverage_report.xml \
-j8 --exclude-throw-branches
--xml build/coverage_report.xml -j8 --exclude-throw-branches
- name: Archive coverage report
uses: actions/upload-artifact@v4

View File

@@ -1,6 +1,5 @@
name: Create an issue
description: Create an issue
inputs:
title:
description: Issue title
@@ -11,31 +10,26 @@ inputs:
labels:
description: Comma-separated list of labels
required: true
default: "bug"
default: 'bug'
assignees:
description: Comma-separated list of assignees
required: true
default: "godexsoft,kuznetsss,PeterChen13579"
default: 'cindyyan317,godexsoft,kuznetsss'
outputs:
created_issue_id:
description: Created issue id
value: ${{ steps.create_issue.outputs.created_issue }}
runs:
using: composite
steps:
- name: Create an issue
id: create_issue
shell: bash
run: |
echo -e '${{ inputs.body }}' > issue.md
gh issue create \
--assignee '${{ inputs.assignees }}' \
--label '${{ inputs.labels }}' \
--title '${{ inputs.title }}' \
--body-file ./issue.md \
> create_issue.log
created_issue=$(cat create_issue.log | sed 's|.*/||')
echo "created_issue=$created_issue" >> $GITHUB_OUTPUT
rm create_issue.log issue.md
- name: Create an issue
id: create_issue
shell: bash
run: |
echo -e '${{ inputs.body }}' > issue.md
gh issue create --assignee '${{ inputs.assignees }}' --label '${{ inputs.labels }}' --title '${{ inputs.title }}' --body-file ./issue.md > create_issue.log
created_issue=$(cat create_issue.log | sed 's|.*/||')
echo "created_issue=$created_issue" >> $GITHUB_OUTPUT
rm create_issue.log issue.md

View File

@@ -1,6 +1,5 @@
name: Run conan and cmake
description: Run conan and cmake
inputs:
conan_profile:
description: Conan profile name
@@ -8,37 +7,19 @@ inputs:
conan_cache_hit:
description: Whether conan cache has been downloaded
required: true
default: "false"
default: 'false'
build_type:
description: Build type for third-party libraries and clio. Could be 'Release', 'Debug'
required: true
default: "Release"
build_integration_tests:
description: Whether to build integration tests
required: true
default: "true"
default: 'Release'
code_coverage:
description: Whether conan's coverage option should be on or not
required: true
default: "false"
default: 'false'
static:
description: Whether Clio is to be statically linked
required: true
default: "false"
sanitizer:
description: Sanitizer to use
required: true
default: "false"
choices:
- "false"
- "tsan"
- "asan"
- "ubsan"
time_trace:
description: Whether to enable compiler trace reports
required: true
default: "false"
default: 'false'
runs:
using: composite
steps:
@@ -52,37 +33,14 @@ runs:
BUILD_OPTION: "${{ inputs.conan_cache_hit == 'true' && 'missing' || '' }}"
CODE_COVERAGE: "${{ inputs.code_coverage == 'true' && 'True' || 'False' }}"
STATIC_OPTION: "${{ inputs.static == 'true' && 'True' || 'False' }}"
INTEGRATION_TESTS_OPTION: "${{ inputs.build_integration_tests == 'true' && 'True' || 'False' }}"
TIME_TRACE: "${{ inputs.time_trace == 'true' && 'True' || 'False' }}"
run: |
cd build
conan \
install .. \
-of . \
-b $BUILD_OPTION \
-s build_type=${{ inputs.build_type }} \
-o clio:static="${STATIC_OPTION}" \
-o clio:tests=True \
-o clio:integration_tests="${INTEGRATION_TESTS_OPTION}" \
-o clio:lint=False \
-o clio:coverage="${CODE_COVERAGE}" \
-o clio:time_trace="${TIME_TRACE}" \
--profile ${{ inputs.conan_profile }}
conan install .. -of . -b $BUILD_OPTION -s build_type=${{ inputs.build_type }} -o clio:static="${STATIC_OPTION}" -o clio:tests=True -o clio:integration_tests=True -o clio:lint=False -o clio:coverage="${CODE_COVERAGE}" --profile ${{ inputs.conan_profile }}
- name: Run cmake
shell: bash
env:
BUILD_TYPE: "${{ inputs.build_type }}"
SANITIZER_OPTION: |
${{ inputs.sanitizer == 'tsan' && '-Dsan=thread' ||
inputs.sanitizer == 'ubsan' && '-Dsan=undefined' ||
inputs.sanitizer == 'asan' && '-Dsan=address' ||
'' }}
run: |
cd build
cmake \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE="${BUILD_TYPE}" \
${SANITIZER_OPTION} \
.. \
-G Ninja
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=${{ inputs.build_type }} ${{ inputs.extra_cmake_args }} .. -G Ninja

View File

@@ -1,16 +1,9 @@
name: Get number of threads
description: Determines number of threads to use on macOS and Linux
inputs:
subtract_threads:
description: How many threads to subtract from the calculated number
required: true
default: "0"
outputs:
threads_number:
description: Number of threads to use
value: ${{ steps.number_of_threads_export.outputs.num }}
runs:
using: composite
steps:
@@ -26,11 +19,8 @@ runs:
shell: bash
run: echo "num=$(($(nproc) - 2))" >> $GITHUB_OUTPUT
- name: Shift and export number of threads
id: number_of_threads_export
- name: Export output variable
shell: bash
id: number_of_threads_export
run: |
num_of_threads=${{ steps.mac_threads.outputs.num || steps.linux_threads.outputs.num }}
shift_by=${{ inputs.subtract_threads }}
shifted=$((num_of_threads - shift_by))
echo "num=$(( shifted > 1 ? shifted : 1 ))" >> $GITHUB_OUTPUT
echo "num=${{ steps.mac_threads.outputs.num || steps.linux_threads.outputs.num }}" >> $GITHUB_OUTPUT

View File

@@ -1,11 +1,9 @@
name: Git common ancestor
description: Find the closest common commit
outputs:
commit:
description: Hash of commit
value: ${{ steps.find_common_ancestor.outputs.commit }}
runs:
using: composite
steps:

View File

@@ -1,11 +1,9 @@
name: Prepare runner
description: Install packages, set environment variables, create directories
inputs:
disable_ccache:
description: Whether ccache should be disabled
required: true
runs:
using: composite
steps:
@@ -13,37 +11,8 @@ runs:
if: ${{ runner.os == 'macOS' }}
shell: bash
run: |
brew install \
bison \
ca-certificates \
ccache \
clang-build-analyzer \
conan@1 \
gh \
jq \
llvm@14 \
ninja \
pkg-config
echo "/opt/homebrew/opt/conan@1/bin" >> $GITHUB_PATH
- name: Install CMake 3.31.6 on mac
if: ${{ runner.os == 'macOS' }}
shell: bash
run: |
# Uninstall any existing cmake
brew uninstall cmake --ignore-dependencies || true
# Download specific cmake formula
FORMULA_URL="https://raw.githubusercontent.com/Homebrew/homebrew-core/b4e46db74e74a8c1650b38b1da222284ce1ec5ce/Formula/c/cmake.rb"
FORMULA_EXPECTED_SHA256="c7ec95d86f0657638835441871e77541165e0a2581b53b3dd657cf13ad4228d4"
mkdir -p /tmp/homebrew-formula
curl -s -L $FORMULA_URL -o /tmp/homebrew-formula/cmake.rb
echo "$FORMULA_EXPECTED_SHA256 /tmp/homebrew-formula/cmake.rb" | shasum -a 256 -c
# Install cmake from the specific formula with force flag
brew install --formula --force /tmp/homebrew-formula/cmake.rb
brew install llvm@14 pkg-config ninja bison cmake ccache jq gh conan@1
echo "/opt/homebrew/opt/conan@1/bin" >> $GITHUB_PATH
- name: Fix git permissions on Linux
if: ${{ runner.os == 'Linux' }}
@@ -75,3 +44,5 @@ runs:
run: |
mkdir -p $CCACHE_DIR
mkdir -p $CONAN_USER_HOME/.conan

View File

@@ -1,6 +1,5 @@
name: Restore cache
description: Find and restores conan and ccache cache
inputs:
conan_dir:
description: Path to .conan directory
@@ -18,7 +17,7 @@ inputs:
code_coverage:
description: Whether code coverage is on
required: true
default: "false"
default: 'false'
outputs:
conan_hash:
description: Hash to use as a part of conan cache key
@@ -29,7 +28,6 @@ outputs:
ccache_cache_hit:
description: True if ccache cache has been downloaded
value: ${{ steps.ccache_cache.outputs.cache-hit }}
runs:
using: composite
steps:

View File

@@ -1,6 +1,5 @@
name: Save cache
description: Save conan and ccache cache for develop branch
inputs:
conan_dir:
description: Path to .conan directory
@@ -29,8 +28,7 @@ inputs:
code_coverage:
description: Whether code coverage is on
required: true
default: "false"
default: 'false'
runs:
using: composite
steps:
@@ -57,3 +55,5 @@ runs:
with:
path: ${{ inputs.ccache_dir }}
key: clio-ccache-${{ runner.os }}-${{ inputs.build_type }}${{ inputs.code_coverage == 'true' && '-code_coverage' || '' }}-${{ inputs.conan_profile }}-develop-${{ steps.git_common_ancestor.outputs.commit }}

View File

@@ -1,26 +1,43 @@
name: Setup conan
description: Setup conan profile and artifactory
inputs:
conan_profile:
description: Conan profile name
required: true
outputs:
conan_profile:
description: Created conan profile name
value: ${{ steps.conan_export_output.outputs.conan_profile }}
runs:
using: composite
steps:
- name: Create conan profile on macOS
- name: On mac
if: ${{ runner.os == 'macOS' }}
shell: bash
env:
CONAN_PROFILE: ${{ inputs.conan_profile }}
CONAN_PROFILE: apple_clang_15
id: conan_setup_mac
run: |
echo "Creating $CONAN_PROFILE conan profile"
echo "Creating $CONAN_PROFILE conan profile";
conan profile new $CONAN_PROFILE --detect --force
conan profile update settings.compiler.libcxx=libc++ $CONAN_PROFILE
conan profile update settings.compiler.cppstd=20 $CONAN_PROFILE
conan profile update env.CXXFLAGS=-DBOOST_ASIO_DISABLE_CONCEPTS $CONAN_PROFILE
conan profile update "conf.tools.build:cxxflags+=[\"-DBOOST_ASIO_DISABLE_CONCEPTS\"]" $CONAN_PROFILE
echo "created_conan_profile=$CONAN_PROFILE" >> $GITHUB_OUTPUT
- name: On linux
if: ${{ runner.os == 'Linux' }}
shell: bash
id: conan_setup_linux
run: |
echo "created_conan_profile=${{ inputs.conan_profile }}" >> $GITHUB_OUTPUT
- name: Export output variable
shell: bash
id: conan_export_output
run: |
echo "conan_profile=${{ steps.conan_setup_mac.outputs.created_conan_profile || steps.conan_setup_linux.outputs.created_conan_profile }}" >> $GITHUB_OUTPUT
- name: Add conan-non-prod artifactory
shell: bash
@@ -31,3 +48,5 @@ runs:
else
echo "Conan-non-prod is available"
fi

6
.github/actions/test/Dockerfile vendored Normal file
View File

@@ -0,0 +1,6 @@
FROM cassandra:4.0.4
RUN apt-get update && apt-get install -y postgresql
COPY entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]

8
.github/actions/test/entrypoint.sh vendored Executable file
View File

@@ -0,0 +1,8 @@
#!/bin/bash
pg_ctlcluster 12 main start
su postgres -c"psql -c\"alter user postgres with password 'postgres'\""
su cassandra -c "/opt/cassandra/bin/cassandra -R"
sleep 90
chmod +x ./clio_tests
./clio_tests
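A hypothetical local use of this test image, assuming a statically linked clio_tests binary in ./build and that the entrypoint's working directory is the filesystem root where it looks for ./clio_tests:

docker build -t clio-test-env .github/actions/test
docker run --rm -w / -v "$PWD/build/clio_tests:/clio_tests" clio-test-env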

161
.github/dependabot.yml vendored
View File

@@ -1,157 +1,16 @@
version: 2
updates:
- package-ecosystem: github-actions
directory: /
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: weekly
day: monday
interval: "weekly"
day: "monday"
time: "04:00"
timezone: Etc/GMT
timezone: "Etc/GMT"
reviewers:
- XRPLF/clio-dev-team
- "cindyyan317"
- "godexsoft"
- "kuznetsss"
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/build_clio/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/build_docker_image/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/code_coverage/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/create_issue/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/generate/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/get_number_of_threads/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/git_common_ancestor/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/prepare_runner/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/restore_cache/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/save_cache/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/setup_conan/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
prefix: "[CI] "
target-branch: "develop"

View File

@@ -1,45 +0,0 @@
#!/bin/bash
set -o pipefail
# Note: This script is intended to be run from the root of the repository.
#
# This script runs each unit-test separately and generates reports from the currently active sanitizer.
# Output is saved in ./.sanitizer-report in the root of the repository
if [[ -z "$1" ]]; then
cat <<EOF
ERROR
-----------------------------------------------------------------------------
Path to clio_tests should be passed as first argument to the script.
-----------------------------------------------------------------------------
EOF
exit 1
fi
TEST_BINARY=$1
if [[ ! -f "$TEST_BINARY" ]]; then
echo "Test binary not found: $TEST_BINARY"
exit 1
fi
TESTS=$($TEST_BINARY --gtest_list_tests | awk '/^ / {print suite $1} !/^ / {suite=$1}')
OUTPUT_DIR="./.sanitizer-report"
mkdir -p "$OUTPUT_DIR"
for TEST in $TESTS; do
OUTPUT_FILE="$OUTPUT_DIR/${TEST//\//_}"
export TSAN_OPTIONS="log_path=\"$OUTPUT_FILE\" die_after_fork=0"
export ASAN_OPTIONS="log_path=\"$OUTPUT_FILE\""
export UBSAN_OPTIONS="log_path=\"$OUTPUT_FILE\""
export MallocNanoZone='0' # for MacOSX
$TEST_BINARY --gtest_filter="$TEST" > /dev/null 2>&1
if [ $? -ne 0 ]; then
echo "'$TEST' failed a sanitizer check."
fi
done
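A usage sketch, matching the invocation in the reusable test workflow further below and assuming a sanitizer-enabled build in ./build:

./.github/scripts/execute-tests-under-sanitizer ./build/clio_tests
ls .sanitizer-report/   # one log file per test that tripped the sanitizer, if any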

View File

@@ -1,5 +1,4 @@
name: Build
on:
push:
branches: [master, release/*, develop]
@@ -7,96 +6,213 @@ on:
branches: [master, release/*, develop]
workflow_dispatch:
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
build-and-test:
name: Build and Test
check_format:
name: Check format
runs-on: ubuntu-20.04
container:
image: rippleci/clio_ci:latest
steps:
- name: Fix git permissions on Linux
shell: bash
run: git config --global --add safe.directory $PWD
- uses: actions/checkout@v4
- name: Run formatters
id: run_formatters
run: |
./.githooks/check-format --diff
shell: bash
check_docs:
name: Check documentation
runs-on: ubuntu-20.04
container:
image: rippleci/clio_ci:latest
steps:
- uses: actions/checkout@v4
- name: Run linter
id: run_linter
run: |
./.githooks/check-docs
shell: bash
build:
name: Build
needs:
- check_format
- check_docs
strategy:
fail-fast: false
matrix:
os: [heavy]
conan_profile: [gcc, clang]
build_type: [Release, Debug]
container: ['{ "image": "ghcr.io/xrplf/clio-ci:latest" }']
static: [true]
include:
- os: macos15
conan_profile: default_apple_clang
- os: heavy
container:
image: rippleci/clio_ci:latest
build_type: Release
container: ""
conan_profile: gcc
code_coverage: false
static: true
- os: heavy
container:
image: rippleci/clio_ci:latest
build_type: Debug
conan_profile: gcc
code_coverage: true
static: true
- os: heavy
container:
image: rippleci/clio_ci:latest
build_type: Release
conan_profile: clang
code_coverage: false
static: true
- os: heavy
container:
image: rippleci/clio_ci:latest
build_type: Debug
conan_profile: clang
code_coverage: false
static: true
- os: macos14
build_type: Release
code_coverage: false
static: false
runs-on: [self-hosted, "${{ matrix.os }}"]
container: ${{ matrix.container }}
uses: ./.github/workflows/build_and_test.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
conan_profile: ${{ matrix.conan_profile }}
build_type: ${{ matrix.build_type }}
static: ${{ matrix.static }}
run_unit_tests: true
run_integration_tests: false
upload_clio_server: true
steps:
- name: Clean workdir
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@1.0
code_coverage:
name: Run Code Coverage
- uses: actions/checkout@v4
with:
fetch-depth: 0
uses: ./.github/workflows/build_impl.yml
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-ci:latest" }'
conan_profile: gcc
build_type: Debug
disable_cache: false
code_coverage: true
static: true
upload_clio_server: false
targets: all
sanitizer: "false"
analyze_build_time: false
- name: Prepare runner
uses: ./.github/actions/prepare_runner
with:
disable_ccache: false
- name: Setup conan
uses: ./.github/actions/setup_conan
id: conan
with:
conan_profile: ${{ matrix.conan_profile }}
- name: Restore cache
uses: ./.github/actions/restore_cache
id: restore_cache
with:
conan_dir: ${{ env.CONAN_USER_HOME }}/.conan
conan_profile: ${{ steps.conan.outputs.conan_profile }}
ccache_dir: ${{ env.CCACHE_DIR }}
build_type: ${{ matrix.build_type }}
code_coverage: ${{ matrix.code_coverage }}
- name: Run conan and cmake
uses: ./.github/actions/generate
with:
conan_profile: ${{ steps.conan.outputs.conan_profile }}
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
build_type: ${{ matrix.build_type }}
code_coverage: ${{ matrix.code_coverage }}
static: ${{ matrix.static }}
- name: Build Clio
uses: ./.github/actions/build_clio
- name: Show ccache's statistics
shell: bash
id: ccache_stats
run: |
ccache -s > /tmp/ccache.stats
miss_rate=$(cat /tmp/ccache.stats | grep 'Misses' | head -n1 | sed 's/.*(\(.*\)%).*/\1/')
echo "miss_rate=${miss_rate}" >> $GITHUB_OUTPUT
cat /tmp/ccache.stats
- name: Strip tests
if: ${{ !matrix.code_coverage }}
run: strip build/clio_tests && strip build/clio_integration_tests
- name: Upload clio_server
uses: actions/upload-artifact@v4
with:
name: clio_server_${{ runner.os }}_${{ matrix.build_type }}_${{ steps.conan.outputs.conan_profile }}
path: build/clio_server
- name: Upload clio_tests
if: ${{ !matrix.code_coverage }}
uses: actions/upload-artifact@v4
with:
name: clio_tests_${{ runner.os }}_${{ matrix.build_type }}_${{ steps.conan.outputs.conan_profile }}
path: build/clio_*tests
- name: Save cache
uses: ./.github/actions/save_cache
with:
conan_dir: ${{ env.CONAN_USER_HOME }}/.conan
conan_hash: ${{ steps.restore_cache.outputs.conan_hash }}
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
ccache_dir: ${{ env.CCACHE_DIR }}
ccache_cache_hit: ${{ steps.restore_cache.outputs.ccache_cache_hit }}
ccache_cache_miss_rate: ${{ steps.ccache_stats.outputs.miss_rate }}
build_type: ${{ matrix.build_type }}
code_coverage: ${{ matrix.code_coverage }}
conan_profile: ${{ steps.conan.outputs.conan_profile }}
# TODO: This is not a part of build process but it is the easiest way to do it here.
# It will be refactored in https://github.com/XRPLF/clio/issues/1075
- name: Run code coverage
if: ${{ matrix.code_coverage }}
uses: ./.github/actions/code_coverage
upload_coverage_report:
name: Codecov
needs: build
uses: ./.github/workflows/upload_coverage_report.yml
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
check_config:
name: Check Config Description
needs: build-and-test
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:latest
test:
name: Run Tests
needs: build
strategy:
fail-fast: false
matrix:
include:
- os: heavy
container:
image: rippleci/clio_ci:latest
conan_profile: gcc
build_type: Release
- os: heavy
container:
image: rippleci/clio_ci:latest
conan_profile: clang
build_type: Release
- os: heavy
container:
image: rippleci/clio_ci:latest
conan_profile: clang
build_type: Debug
- os: macos14
conan_profile: apple_clang_15
build_type: Release
runs-on: [self-hosted, "${{ matrix.os }}"]
container: ${{ matrix.container }}
steps:
- uses: actions/checkout@v4
- name: Clean workdir
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@1.0
- uses: actions/download-artifact@v4
with:
name: clio_server_Linux_Release_gcc
name: clio_tests_${{ runner.os }}_${{ matrix.build_type }}_${{ matrix.conan_profile }}
- name: Compare Config Description
shell: bash
- name: Run clio_tests
run: |
repoConfigFile=docs/config-description.md
if ! [ -f ${repoConfigFile} ]; then
echo "Config Description markdown file is missing in docs folder"
exit 1
fi
chmod +x ./clio_server
configDescriptionFile=config_description_new.md
./clio_server -d ${configDescriptionFile}
configDescriptionHash=$(sha256sum ${configDescriptionFile} | cut -d' ' -f1)
repoConfigHash=$(sha256sum ${repoConfigFile} | cut -d' ' -f1)
if [ ${configDescriptionHash} != ${repoConfigHash} ]; then
echo "Markdown file is not up to date"
diff -u "${repoConfigFile}" "${configDescriptionFile}"
rm -f ${configDescriptionFile}
exit 1
fi
rm -f ${configDescriptionFile}
exit 0
chmod +x ./clio_tests
./clio_tests
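If the config-description check fails, the committed markdown can be refreshed locally with the same flag the job uses; a sketch, assuming clio_server was built into ./build:

./build/clio_server -d docs/config-description.md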

View File

@@ -1,92 +0,0 @@
name: Reusable build and test
on:
workflow_call:
inputs:
runs_on:
description: Runner to run the job on
required: true
type: string
container:
description: "The container object as a JSON string (leave empty to run natively)"
required: true
type: string
conan_profile:
description: Conan profile to use
required: true
type: string
build_type:
description: Build type
required: true
type: string
disable_cache:
description: Whether ccache and conan cache should be disabled
required: false
type: boolean
default: false
static:
description: Whether to build static binaries
required: true
type: boolean
default: true
run_unit_tests:
description: Whether to run unit tests
required: true
type: boolean
run_integration_tests:
description: Whether to run integration tests
required: true
type: boolean
default: false
upload_clio_server:
description: Whether to upload clio_server
required: true
type: boolean
targets:
description: Space-separated build target names
required: false
type: string
default: all
sanitizer:
description: Sanitizer to use
required: false
type: string
default: "false"
jobs:
build:
uses: ./.github/workflows/build_impl.yml
with:
runs_on: ${{ inputs.runs_on }}
container: ${{ inputs.container }}
conan_profile: ${{ inputs.conan_profile }}
build_type: ${{ inputs.build_type }}
disable_cache: ${{ inputs.disable_cache }}
code_coverage: false
static: ${{ inputs.static }}
upload_clio_server: ${{ inputs.upload_clio_server }}
targets: ${{ inputs.targets }}
sanitizer: ${{ inputs.sanitizer }}
analyze_build_time: false
test:
needs: build
uses: ./.github/workflows/test_impl.yml
with:
runs_on: ${{ inputs.runs_on }}
container: ${{ inputs.container }}
conan_profile: ${{ inputs.conan_profile }}
build_type: ${{ inputs.build_type }}
run_unit_tests: ${{ inputs.run_unit_tests }}
run_integration_tests: ${{ inputs.run_integration_tests }}
sanitizer: ${{ inputs.sanitizer }}

View File

@@ -1,5 +1,4 @@
name: Build and publish Clio docker image
on:
workflow_call:
inputs:
@@ -41,8 +40,7 @@ on:
jobs:
build_and_publish_image:
name: Build and publish image
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v4
@@ -89,10 +87,7 @@ jobs:
DOCKERHUB_PW: ${{ secrets.DOCKERHUB_PW }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
images: |
rippleci/clio
ghcr.io/xrplf/clio
dockerhub_repo: rippleci/clio
image_name: rippleci/clio
push_image: ${{ inputs.publish_image }}
directory: docker/clio
tags: ${{ inputs.tags }}

View File

@@ -1,206 +0,0 @@
name: Reusable build
on:
workflow_call:
inputs:
runs_on:
description: Runner to run the job on
required: true
type: string
container:
description: "The container object as a JSON string (leave empty to run natively)"
required: true
type: string
conan_profile:
description: Conan profile to use
required: true
type: string
build_type:
description: Build type
required: true
type: string
disable_cache:
description: Whether ccache and conan cache should be disabled
required: false
type: boolean
code_coverage:
description: Whether to enable code coverage
required: true
type: boolean
static:
description: Whether to build static binaries
required: true
type: boolean
upload_clio_server:
description: Whether to upload clio_server
required: true
type: boolean
targets:
description: Space-separated build target names
required: true
type: string
sanitizer:
description: Sanitizer to use
required: true
type: string
analyze_build_time:
description: Whether to enable build time analysis
required: true
type: boolean
secrets:
CODECOV_TOKEN:
required: false
jobs:
build:
name: Build ${{ inputs.container != '' && 'in container' || 'natively' }}
runs-on: ${{ inputs.runs_on }}
container: ${{ inputs.container != '' && fromJson(inputs.container) || null }}
steps:
- name: Clean workdir
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@80b9863b45562c148927c3d53621ef354e5ae7ce # v1.0
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Prepare runner
uses: ./.github/actions/prepare_runner
with:
disable_ccache: ${{ inputs.disable_cache }}
- name: Setup conan
uses: ./.github/actions/setup_conan
with:
conan_profile: ${{ inputs.conan_profile }}
- name: Restore cache
if: ${{ !inputs.disable_cache }}
uses: ./.github/actions/restore_cache
id: restore_cache
with:
conan_dir: ${{ env.CONAN_USER_HOME }}/.conan
conan_profile: ${{ inputs.conan_profile }}
ccache_dir: ${{ env.CCACHE_DIR }}
build_type: ${{ inputs.build_type }}
code_coverage: ${{ inputs.code_coverage }}
- name: Run conan and cmake
uses: ./.github/actions/generate
with:
conan_profile: ${{ inputs.conan_profile }}
conan_cache_hit: ${{ !inputs.disable_cache && steps.restore_cache.outputs.conan_cache_hit }}
build_type: ${{ inputs.build_type }}
code_coverage: ${{ inputs.code_coverage }}
static: ${{ inputs.static }}
sanitizer: ${{ inputs.sanitizer }}
time_trace: ${{ inputs.analyze_build_time }}
- name: Build Clio
uses: ./.github/actions/build_clio
with:
targets: ${{ inputs.targets }}
- name: Show build time analyze report
if: ${{ inputs.analyze_build_time }}
run: |
ClangBuildAnalyzer --all build/ build_time_report.bin
ClangBuildAnalyzer --analyze build_time_report.bin > build_time_report.txt
cat build_time_report.txt
shell: bash
- name: Upload build time analyze report
if: ${{ inputs.analyze_build_time }}
uses: actions/upload-artifact@v4
with:
name: build_time_report_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build_time_report.txt
- name: Show ccache's statistics
if: ${{ !inputs.disable_cache }}
shell: bash
id: ccache_stats
run: |
ccache -s > /tmp/ccache.stats
miss_rate=$(cat /tmp/ccache.stats | grep 'Misses' | head -n1 | sed 's/.*(\(.*\)%).*/\1/')
echo "miss_rate=${miss_rate}" >> $GITHUB_OUTPUT
cat /tmp/ccache.stats
- name: Strip unit_tests
if: inputs.sanitizer == 'false' && !inputs.code_coverage && !inputs.analyze_build_time
run: strip build/clio_tests
- name: Strip integration_tests
if: inputs.sanitizer == 'false' && !inputs.code_coverage && !inputs.analyze_build_time
run: strip build/clio_integration_tests
- name: Upload clio_server
if: inputs.upload_clio_server && !inputs.code_coverage && !inputs.analyze_build_time
uses: actions/upload-artifact@v4
with:
name: clio_server_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/clio_server
- name: Upload clio_tests
if: ${{ !inputs.code_coverage && !inputs.analyze_build_time }}
uses: actions/upload-artifact@v4
with:
name: clio_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/clio_tests
- name: Upload clio_integration_tests
if: ${{ !inputs.code_coverage && !inputs.analyze_build_time }}
uses: actions/upload-artifact@v4
with:
name: clio_integration_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/clio_integration_tests
- name: Save cache
if: ${{ !inputs.disable_cache && github.ref == 'refs/heads/develop' }}
uses: ./.github/actions/save_cache
with:
conan_dir: ${{ env.CONAN_USER_HOME }}/.conan
conan_hash: ${{ steps.restore_cache.outputs.conan_hash }}
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
ccache_dir: ${{ env.CCACHE_DIR }}
ccache_cache_hit: ${{ steps.restore_cache.outputs.ccache_cache_hit }}
ccache_cache_miss_rate: ${{ steps.ccache_stats.outputs.miss_rate }}
build_type: ${{ inputs.build_type }}
code_coverage: ${{ inputs.code_coverage }}
conan_profile: ${{ inputs.conan_profile }}
# This is run as part of the build job, because it requires the following:
# - source code
# - generated source code (Build.cpp)
# - conan packages
# - .gcno files in build directory
#
# It's all available in the build job, but not in the test job
- name: Run code coverage
if: ${{ inputs.code_coverage }}
uses: ./.github/actions/code_coverage
# `codecov/codecov-action` will rerun `gcov` if it's available and build directory is present
# To prevent this from happening, we run this action in a separate workflow
#
# More info: https://github.com/XRPLF/clio/pull/2066
upload_coverage_report:
if: ${{ inputs.code_coverage }}
name: Codecov
needs: build
uses: ./.github/workflows/upload_coverage_report.yml
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

View File

@@ -1,28 +1,19 @@
name: Check new libXRPL
on:
repository_dispatch:
types: [check_libxrpl]
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
CONAN_PROFILE: gcc
jobs:
build:
name: Build Clio / `libXRPL ${{ github.event.client_payload.version }}`
runs-on: [self-hosted, heavy]
container:
image: ghcr.io/xrplf/clio-ci:latest
image: rippleci/clio_ci:latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
fetch-depth: 0
- name: Update libXRPL version requirement
shell: bash
@@ -32,17 +23,18 @@ jobs:
- name: Prepare runner
uses: ./.github/actions/prepare_runner
with:
disable_ccache: true
disable_ccache: true
- name: Setup conan
uses: ./.github/actions/setup_conan
id: conan
with:
conan_profile: ${{ env.CONAN_PROFILE }}
conan_profile: gcc
- name: Run conan and cmake
uses: ./.github/actions/generate
with:
conan_profile: ${{ env.CONAN_PROFILE }}
conan_profile: ${{ steps.conan.outputs.conan_profile }}
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
build_type: Release
@@ -63,7 +55,7 @@ jobs:
needs: build
runs-on: [self-hosted, heavy]
container:
image: ghcr.io/xrplf/clio-ci:latest
image: rippleci/clio_ci:latest
steps:
- uses: actions/download-artifact@v4
@@ -79,12 +71,10 @@ jobs:
name: Create an issue on failure
needs: [build, run_tests]
if: ${{ always() && contains(needs.*.result, 'failure') }}
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
permissions:
contents: write
issues: write
steps:
- uses: actions/checkout@v4
@@ -93,8 +83,8 @@ jobs:
env:
GH_TOKEN: ${{ github.token }}
with:
labels: "compatibility,bug"
title: "Proposed libXRPL check failed"
labels: 'compatibility,bug'
title: 'Proposed libXRPL check failed'
body: >
Clio build or tests failed against `libXRPL ${{ github.event.client_payload.version }}`.

View File

@@ -1,5 +1,4 @@
name: Check PR title
on:
pull_request:
types: [opened, edited, reopened, synchronize]
@@ -7,11 +6,13 @@ on:
jobs:
check_title:
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
# permissions:
# pull-requests: write
steps:
- uses: ytanikin/pr-conventional-commits@8267db1bacc237419f9ed0228bb9d94e94271a1d # v1.4.1
- uses: ytanikin/PRConventionalCommits@1.3.0
with:
task_types: '["build","feat","fix","docs","test","ci","style","refactor","perf","chore"]'
add_label: false
custom_labels: '{"build":"build", "feat":"enhancement", "fix":"bug", "docs":"documentation", "test":"testability", "ci":"ci", "style":"refactoring", "refactor":"refactoring", "perf":"performance", "chore":"tooling"}'
# Turned off labelling because it leads to an error, see https://github.com/ytanikin/PRConventionalCommits/issues/19
# custom_labels: '{"build":"build", "feat":"enhancement", "fix":"bug", "docs":"documentation", "test":"testability", "ci":"ci", "style":"refactoring", "refactor":"refactoring", "perf":"performance", "chore":"tooling"}'

View File

@@ -1,29 +1,20 @@
name: Clang-tidy check
on:
schedule:
- cron: "0 9 * * 1-5"
- cron: "0 6 * * 1-5"
workflow_dispatch:
pull_request:
branches: [develop]
paths:
- .clang_tidy
- .github/workflows/clang-tidy.yml
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
CONAN_PROFILE: clang
workflow_call:
jobs:
clang_tidy:
runs-on: heavy
runs-on: [self-hosted, Linux]
container:
image: ghcr.io/xrplf/clio-ci:latest
image: rippleci/clio_ci:latest
permissions:
contents: write
issues: write
@@ -41,8 +32,9 @@ jobs:
- name: Setup conan
uses: ./.github/actions/setup_conan
id: conan
with:
conan_profile: ${{ env.CONAN_PROFILE }}
conan_profile: clang
- name: Restore cache
uses: ./.github/actions/restore_cache
@@ -50,12 +42,12 @@ jobs:
with:
conan_dir: ${{ env.CONAN_USER_HOME }}/.conan
ccache_dir: ${{ env.CCACHE_DIR }}
conan_profile: ${{ env.CONAN_PROFILE }}
conan_profile: ${{ steps.conan.outputs.conan_profile }}
- name: Run conan and cmake
uses: ./.github/actions/generate
with:
conan_profile: ${{ env.CONAN_PROFILE }}
conan_profile: ${{ steps.conan.outputs.conan_profile }}
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
build_type: Release
@@ -70,11 +62,11 @@ jobs:
run: |
run-clang-tidy-19 -p build -j ${{ steps.number_of_threads.outputs.threads_number }} -fix -quiet 1>output.txt
- name: Fix local includes
- name: Check format
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
continue-on-error: true
shell: bash
run: pre-commit run --all-files fix-local-includes
run: ./.githooks/check-format
- name: Print issues found
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
@@ -91,13 +83,13 @@ jobs:
env:
GH_TOKEN: ${{ github.token }}
with:
title: "Clang-tidy found bugs in code 🐛"
title: 'Clang-tidy found bugs in code 🐛'
body: >
Clang-tidy found issues in the code:
List of the issues found: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/
- uses: crazy-max/ghaction-import-gpg@e89d40939c28e39f97cf32126055eeae86ba74ec # v6.3.0
- uses: crazy-max/ghaction-import-gpg@v6
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
with:
gpg_private_key: ${{ secrets.ACTIONS_GPG_PRIVATE_KEY }}
@@ -107,7 +99,7 @@ jobs:
- name: Create PR with fixes
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@v7
env:
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ github.token }}
@@ -119,7 +111,7 @@ jobs:
delete-branch: true
title: "style: clang-tidy auto fixes"
body: "Fixes #${{ steps.create_issue.outputs.created_issue_id }}. Please review and commit clang-tidy fixes."
reviewers: "godexsoft,kuznetsss,PeterChen13579"
reviewers: "cindyyan317,godexsoft,kuznetsss"
- name: Fail the job
if: ${{ steps.run_clang_tidy.outcome != 'success' }}

View File

@@ -1,5 +1,4 @@
name: Restart clang-tidy workflow
on:
push:
branches: [develop]
@@ -7,7 +6,7 @@ on:
jobs:
restart_clang_tidy:
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
permissions:
actions: write

View File

@@ -1,5 +1,4 @@
name: Documentation
on:
push:
branches: [develop]
@@ -11,8 +10,7 @@ permissions:
id-token: write
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
group: "pages"
cancel-in-progress: true
jobs:
@@ -20,22 +18,21 @@ jobs:
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
continue-on-error: true
container:
image: ghcr.io/xrplf/clio-ci:latest
image: rippleci/clio_ci:latest
steps:
- name: Checkout
uses: actions/checkout@v4
with:
with:
lfs: true
- name: Build docs
run: |
mkdir -p build_docs && cd build_docs
cmake ../docs && cmake --build . --target docs
- name: Setup Pages
uses: actions/configure-pages@v5
@@ -44,7 +41,7 @@ jobs:
with:
path: build_docs/html
name: docs-develop
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v4

View File

@@ -1,116 +1,205 @@
name: Nightly release
on:
schedule:
- cron: "0 8 * * 1-5"
- cron: '0 5 * * 1-5'
workflow_dispatch:
pull_request:
paths:
- ".github/workflows/nightly.yml"
- ".github/workflows/build_clio_docker_image.yml"
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
- '.github/workflows/nightly.yml'
- '.github/workflows/build_clio_docker_image.yml'
jobs:
build-and-test:
name: Build and Test
build:
name: Build clio
strategy:
fail-fast: false
matrix:
include:
- os: macos15
conan_profile: default_apple_clang
- os: macos14
build_type: Release
static: false
- os: heavy
conan_profile: gcc
build_type: Release
static: true
container: '{ "image": "ghcr.io/xrplf/clio-ci:latest" }'
container:
image: rippleci/clio_ci:latest
- os: heavy
conan_profile: gcc
build_type: Debug
static: true
container: '{ "image": "ghcr.io/xrplf/clio-ci:latest" }'
container:
image: rippleci/clio_ci:latest
runs-on: [self-hosted, "${{ matrix.os }}"]
container: ${{ matrix.container }}
uses: ./.github/workflows/build_and_test.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
conan_profile: ${{ matrix.conan_profile }}
build_type: ${{ matrix.build_type }}
static: ${{ matrix.static }}
run_unit_tests: true
run_integration_tests: ${{ matrix.os != 'macos15' }}
upload_clio_server: true
disable_cache: true
steps:
- name: Clean workdir
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@1.0
analyze_build_time:
name: Analyze Build Time
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Prepare runner
uses: ./.github/actions/prepare_runner
with:
disable_ccache: true
- name: Setup conan
uses: ./.github/actions/setup_conan
id: conan
with:
conan_profile: gcc
- name: Run conan and cmake
uses: ./.github/actions/generate
with:
conan_profile: ${{ steps.conan.outputs.conan_profile }}
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
build_type: ${{ matrix.build_type }}
static: ${{ matrix.static }}
- name: Build Clio
uses: ./.github/actions/build_clio
- name: Strip tests
run: strip build/clio_tests && strip build/clio_integration_tests
- name: Upload clio_tests
uses: actions/upload-artifact@v4
with:
name: clio_tests_${{ runner.os }}_${{ matrix.build_type }}
path: build/clio_*tests
- name: Compress clio_server
shell: bash
run: |
cd build
tar czf ./clio_server_${{ runner.os }}_${{ matrix.build_type }}.tar.gz ./clio_server
- name: Upload clio_server
uses: actions/upload-artifact@v4
with:
name: clio_server_${{ runner.os }}_${{ matrix.build_type }}
path: build/clio_server_${{ runner.os }}_${{ matrix.build_type }}.tar.gz
run_tests:
needs: build
strategy:
fail-fast: false
matrix:
include:
# TODO: Enable when we have at least ubuntu 22.04
# as ClangBuildAnalyzer requires relatively modern glibc
#
# - os: heavy
# conan_profile: clang
# container: '{ "image": "ghcr.io/xrplf/clio-ci:latest" }'
# static: true
- os: macos15
conan_profile: default_apple_clang
container: ""
static: false
uses: ./.github/workflows/build_impl.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
conan_profile: ${{ matrix.conan_profile }}
build_type: Release
disable_cache: true
code_coverage: false
static: ${{ matrix.static }}
upload_clio_server: false
targets: all
sanitizer: "false"
analyze_build_time: true
- os: macos14
build_type: Release
integration_tests: false
- os: heavy
build_type: Release
container:
image: rippleci/clio_ci:latest
integration_tests: true
- os: heavy
build_type: Debug
container:
image: rippleci/clio_ci:latest
integration_tests: true
runs-on: [self-hosted, "${{ matrix.os }}"]
container: ${{ matrix.container }}
services:
scylladb:
image: ${{ (matrix.integration_tests) && 'scylladb/scylla' || '' }}
options: >-
--health-cmd "cqlsh -e 'describe cluster'"
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- name: Clean workdir
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@1.0
- uses: actions/download-artifact@v4
with:
name: clio_tests_${{ runner.os }}_${{ matrix.build_type }}
- name: Run clio_tests
run: |
chmod +x ./clio_tests
./clio_tests
# To be enabled back once docker in mac runner arrives
# https://github.com/XRPLF/clio/issues/1400
- name: Run clio_integration_tests
if: matrix.integration_tests
run: |
chmod +x ./clio_integration_tests
./clio_integration_tests --backend_host=scylladb
nightly_release:
needs: build-and-test
uses: ./.github/workflows/release_impl.yml
with:
overwrite_release: true
title: "Clio development (nightly) build"
version: nightly
notes_header_file: nightly_notes.md
if: ${{ github.event_name != 'pull_request' }}
needs: run_tests
runs-on: ubuntu-20.04
env:
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ github.token }}
permissions:
contents: write
steps:
- uses: actions/checkout@v4
- uses: actions/download-artifact@v4
with:
path: nightly_release
pattern: clio_server_*
- name: Prepare files
shell: bash
run: |
cp ${{ github.workspace }}/.github/workflows/nightly_notes.md "${RUNNER_TEMP}/nightly_notes.md"
cd nightly_release
for d in $(ls); do
archive_name=$(ls $d)
mv ${d}/${archive_name} ./
rm -r $d
sha256sum ./$archive_name > ./${archive_name}.sha256sum
cat ./$archive_name.sha256sum >> "${RUNNER_TEMP}/nightly_notes.md"
done
echo '```' >> "${RUNNER_TEMP}/nightly_notes.md"
- name: Remove current nightly release and nightly tag
shell: bash
run: |
gh release delete nightly --yes || true
git push origin :nightly || true
- name: Publish nightly release
shell: bash
run: |
gh release create nightly --prerelease --title "Clio development (nightly) build" \
--target $GITHUB_SHA --notes-file "${RUNNER_TEMP}/nightly_notes.md" \
./nightly_release/clio_server*
build_and_publish_docker_image:
uses: ./.github/workflows/build_clio_docker_image.yml
needs: build-and-test
needs: run_tests
secrets: inherit
with:
tags: |
type=raw,value=nightly
type=raw,value=${{ github.sha }}
artifact_name: clio_server_Linux_Release_gcc
type=raw,value=nightly
type=raw,value=${{ github.sha }}
artifact_name: clio_server_Linux_Release
strip_binary: true
publish_image: ${{ github.event_name != 'pull_request' }}
create_issue_on_failure:
needs: [build-and-test, nightly_release, build_and_publish_docker_image]
needs: [build, run_tests, nightly_release, build_and_publish_docker_image]
if: ${{ always() && contains(needs.*.result, 'failure') && github.event_name != 'pull_request' }}
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
permissions:
contents: write
issues: write
steps:
- uses: actions/checkout@v4
@@ -119,7 +208,7 @@ jobs:
env:
GH_TOKEN: ${{ github.token }}
with:
title: "Nightly release failed 🌙"
title: 'Nightly release failed 🌙'
body: >
Nightly release failed:

View File

@@ -1,7 +1,6 @@
# Release notes
> **Note:** Please remember that this is a development release and it is not recommended for production use.
Changelog (including previous releases): <https://github.com/XRPLF/clio/commits/nightly>
Changelog (including previous releases): https://github.com/XRPLF/clio/commits/nightly
## SHA256 checksums
```

View File

@@ -1,39 +0,0 @@
name: Pre-commit auto-update
on:
# every first day of the month
schedule:
- cron: "0 0 1 * *"
# on demand
workflow_dispatch:
jobs:
auto-update:
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: 3.x
- run: pip install pre-commit
- run: pre-commit autoupdate
- run: pre-commit run --all-files
- uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
if: always()
env:
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ github.token }}
with:
branch: update/pre-commit-hooks
title: Update pre-commit hooks
commit-message: "style: update pre-commit hooks"
body: Update versions of pre-commit hooks to latest version.
reviewers: "godexsoft,kuznetsss,PeterChen13579,mathbunnyru"

View File

@@ -1,28 +0,0 @@
name: Run pre-commit hooks
on:
pull_request:
push:
branches:
- develop
workflow_dispatch:
jobs:
run-hooks:
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:latest
steps:
- name: Checkout Repo ⚡️
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Prepare runner
uses: ./.github/actions/prepare_runner
with:
disable_ccache: true
- name: Run pre-commit ✅
run: pre-commit run --all-files

View File

@@ -1,78 +0,0 @@
name: Make release
on:
workflow_call:
inputs:
overwrite_release:
description: "Overwrite the current release and tag"
required: true
type: boolean
title:
description: "Release title"
required: true
type: string
version:
description: "Release version"
required: true
type: string
notes_header_file:
description: "Release notes header file"
required: true
type: string
jobs:
release:
runs-on: ubuntu-latest
env:
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ github.token }}
permissions:
contents: write
steps:
- uses: actions/checkout@v4
- uses: actions/download-artifact@v4
with:
path: release_artifacts
pattern: clio_server_*
- name: Prepare files
shell: bash
working-directory: release_artifacts
run: |
cp ${{ github.workspace }}/.github/workflows/${{ inputs.notes_header_file }} "${RUNNER_TEMP}/release_notes.md"
echo '' >> "${RUNNER_TEMP}/release_notes.md"
echo '```' >> "${RUNNER_TEMP}/release_notes.md"
for d in $(ls); do
archive_name=$(ls $d)
mv ${d}/${archive_name} ./
rm -r $d
sha256sum ./$archive_name > ./${archive_name}.sha256sum
cat ./$archive_name.sha256sum >> "${RUNNER_TEMP}/release_notes.md"
done
echo '```' >> "${RUNNER_TEMP}/release_notes.md"
- name: Remove current release and tag
if: ${{ github.event_name != 'pull_request' && inputs.overwrite_release }}
shell: bash
run: |
gh release delete ${{ inputs.version }} --yes || true
git push origin :${{ inputs.version }} || true
- name: Publish release
if: ${{ github.event_name != 'pull_request' }}
shell: bash
run: |
gh release create ${{ inputs.version }} \
${{ inputs.overwrite_release && '--prerelease' || '' }} \
--title "${{ inputs.title }}" \
--target $GITHUB_SHA \
--notes-file "${RUNNER_TEMP}/release_notes.md" \
./release_artifacts/clio_server*

View File

@@ -1,43 +0,0 @@
name: Run tests with sanitizers
on:
schedule:
- cron: "0 4 * * 1-5"
workflow_dispatch:
pull_request:
paths:
- ".github/workflows/sanitizers.yml"
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
build-and-test:
name: Build and Test
strategy:
fail-fast: false
matrix:
include:
- sanitizer: tsan
compiler: gcc
- sanitizer: asan
compiler: gcc
- sanitizer: ubsan
compiler: gcc
uses: ./.github/workflows/build_and_test.yml
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-ci:latest" }'
disable_cache: true
conan_profile: ${{ matrix.compiler }}.${{ matrix.sanitizer }}
build_type: Release
static: false
run_unit_tests: true
run_integration_tests: false
upload_clio_server: false
targets: clio_tests clio_integration_tests
sanitizer: ${{ matrix.sanitizer }}

View File

@@ -1,138 +0,0 @@
name: Reusable test
on:
workflow_call:
inputs:
runs_on:
description: Runner to run the job on
required: true
type: string
container:
description: "The container object as a JSON string (leave empty to run natively)"
required: true
type: string
conan_profile:
description: Conan profile to use
required: true
type: string
build_type:
description: Build type
required: true
type: string
run_unit_tests:
description: Whether to run unit tests
required: true
type: boolean
run_integration_tests:
description: Whether to run integration tests
required: true
type: boolean
sanitizer:
description: Sanitizer to use
required: true
type: string
jobs:
unit_tests:
name: Unit testing ${{ inputs.container != '' && 'in container' || 'natively' }}
runs-on: ${{ inputs.runs_on }}
container: ${{ inputs.container != '' && fromJson(inputs.container) || null }}
if: inputs.run_unit_tests
steps:
- name: Clean workdir
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@80b9863b45562c148927c3d53621ef354e5ae7ce # v1.0
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/download-artifact@v4
with:
name: clio_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
- name: Make clio_tests executable
shell: bash
run: chmod +x ./clio_tests
- name: Run clio_tests (regular)
if: inputs.sanitizer == 'false'
run: ./clio_tests
- name: Run clio_tests (sanitizer)
if: inputs.sanitizer != 'false'
run: ./.github/scripts/execute-tests-under-sanitizer ./clio_tests
- name: Check for sanitizer report
if: inputs.sanitizer != 'false'
shell: bash
id: check_report
run: |
if ls .sanitizer-report/* 1> /dev/null 2>&1; then
echo "found_report=true" >> $GITHUB_OUTPUT
else
echo "found_report=false" >> $GITHUB_OUTPUT
fi
- name: Upload sanitizer report
if: inputs.sanitizer != 'false' && steps.check_report.outputs.found_report == 'true'
uses: actions/upload-artifact@v4
with:
name: ${{ inputs.conan_profile }}_report
path: .sanitizer-report/*
include-hidden-files: true
# TODO: enable when we have fixed all currently existing issues from sanitizers
- name: Create an issue
if: false && inputs.sanitizer != 'false' && steps.check_report.outputs.found_report == 'true'
uses: ./.github/actions/create_issue
env:
GH_TOKEN: ${{ github.token }}
with:
labels: "bug"
title: "[${{ inputs.conan_profile }}] reported issues"
body: >
Clio tests failed one or more sanitizer checks when built with `${{ inputs.conan_profile }}`.
Workflow: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/
Reports are available as artifacts.
integration_tests:
name: Integration testing ${{ inputs.container != '' && 'in container' || 'natively' }}
runs-on: ${{ inputs.runs_on }}
container: ${{ inputs.container != '' && fromJson(inputs.container) || null }}
if: inputs.run_integration_tests
services:
scylladb:
image: "scylladb/scylla"
options: >-
--health-cmd "cqlsh -e 'describe cluster'"
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- name: Clean workdir
if: ${{ runner.os == 'macOS' }}
uses: kuznetsss/workspace-cleanup@80b9863b45562c148927c3d53621ef354e5ae7ce # v1.0
- uses: actions/download-artifact@v4
with:
name: clio_integration_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
# To be enabled again once Docker on the macOS runner arrives
# https://github.com/XRPLF/clio/issues/1400
- name: Run clio_integration_tests
run: |
chmod +x ./clio_integration_tests
./clio_integration_tests --backend_host=scylladb

View File

@@ -1,33 +1,22 @@
name: Update CI docker image
on:
pull_request:
paths:
- "docker/ci/**"
- "docker/compilers/**"
- 'docker/ci/**'
- 'docker/compilers/**'
- .github/workflows/update_docker_ci.yml
- ".github/actions/build_docker_image/**"
push:
branches: [develop]
paths:
# CI image must update when either its Dockerfile changes
# or any compilers changed and were pushed by hand
- "docker/ci/**"
- "docker/compilers/**"
- 'docker/ci/**' # CI image must update when either its dockerfile changes
- 'docker/compilers/**' # or any compilers changed and were pushed by hand
- .github/workflows/update_docker_ci.yml
- ".github/actions/build_docker_image/**"
workflow_dispatch:
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
build_and_push:
name: Build and push docker image
runs-on: [self-hosted, heavy]
steps:
- uses: actions/checkout@v4
- uses: ./.github/actions/build_docker_image
@@ -36,10 +25,7 @@ jobs:
DOCKERHUB_PW: ${{ secrets.DOCKERHUB_PW }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
images: |
rippleci/clio_ci
ghcr.io/xrplf/clio-ci
dockerhub_repo: rippleci/clio_ci
image_name: rippleci/clio_ci
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/ci
tags: |

View File

@@ -1,5 +1,4 @@
name: Upload report
on:
workflow_dispatch:
workflow_call:
@@ -10,8 +9,7 @@ on:
jobs:
upload_report:
name: Upload report
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v4
with:
@@ -25,9 +23,13 @@ jobs:
- name: Upload coverage report
if: ${{ hashFiles('build/coverage_report.xml') != '' }}
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
uses: wandalen/wretry.action@v3.7.2
with:
files: build/coverage_report.xml
fail_ci_if_error: true
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}
action: codecov/codecov-action@v4
with: |
files: build/coverage_report.xml
fail_ci_if_error: false
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}
attempt_limit: 5
attempt_delay: 10000

.gitignore
View File

@@ -6,7 +6,6 @@
.vscode
.python-version
.DS_Store
.sanitizer-report
CMakeUserPresets.json
config.json
src/util/build/Build.cpp

View File

@@ -1,6 +0,0 @@
# Default state for all rules
default: true
# MD013/line-length - Line length
MD013:
line_length: 1000

View File

@@ -1,118 +0,0 @@
---
# pre-commit is a tool to perform a predefined set of tasks manually and/or
# automatically before git commits are made.
#
# Config reference: https://pre-commit.com/#pre-commit-configyaml---top-level
#
# Common tasks
#
# - Run on all files: pre-commit run --all-files
# - Register git hooks: pre-commit install --install-hooks
#
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
# `pre-commit sample-config` default hooks
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
- id: check-added-large-files
- id: check-executables-have-shebangs
- id: check-shebang-scripts-are-executable
- id: end-of-file-fixer
exclude: ^docs/doxygen-awesome-theme/
- id: trailing-whitespace
exclude: ^docs/doxygen-awesome-theme/
# Autoformat: YAML, JSON, Markdown, etc.
- repo: https://github.com/rbubley/mirrors-prettier
rev: v3.5.3
hooks:
- id: prettier
exclude: ^docs/doxygen-awesome-theme/
- repo: https://github.com/igorshubovych/markdownlint-cli
rev: v0.44.0
hooks:
- id: markdownlint-fix
exclude: LICENSE.md
- repo: https://github.com/crate-ci/typos
rev: v1.31.2
hooks:
- id: typos
- repo: https://github.com/pre-commit/mirrors-clang-format
rev: v19.1.7
hooks:
- id: clang-format
args: [--style=file]
types: [c++]
- repo: https://github.com/cheshirekow/cmake-format-precommit
rev: v0.6.13
hooks:
- id: cmake-format
additional_dependencies: [PyYAML]
- repo: local
hooks:
- id: check-no-h-files
name: No .h files
entry: There should be no .h files in this repository
language: fail
files: \.h$
- repo: local
hooks:
- id: gofmt
name: Go Format
entry: pre-commit-hooks/run-go-fmt.sh
types: [go]
language: golang
description: "Runs `gofmt`, requires golang"
- id: check-docs
name: Check Doxygen Documentation
entry: pre-commit-hooks/check-doxygen-docs.sh
types: [text]
language: system
pass_filenames: false
- id: fix-local-includes
name: Fix Local Includes
entry: pre-commit-hooks/fix-local-includes.sh
types: [c++]
language: system
pass_filenames: false
- id: verify-commits
name: Verify Commits
entry: pre-commit-hooks/verify-commits.sh
types: [text]
language: system
pass_filenames: false
- repo: local
hooks:
- id: lfs-post-checkout
name: LFS Post Checkout
entry: pre-commit-hooks/lfs/post-checkout
types: [text]
stages: [post-checkout]
language: system
- id: lfs-post-commit
name: LFS Post Commit
entry: pre-commit-hooks/lfs/post-commit
types: [text]
stages: [post-commit]
language: system
- id: lfs-post-merge
name: LFS Post Merge
entry: pre-commit-hooks/lfs/post-merge
types: [text]
stages: [post-merge]
language: system
- id: lfs-pre-push
name: LFS Pre Push
entry: pre-commit-hooks/lfs/pre-push
types: [text]
stages: [pre-push]
language: system

View File

@@ -16,9 +16,6 @@ option(coverage "Build test coverage report" FALSE)
option(packaging "Create distribution packages" FALSE)
option(lint "Run clang-tidy checks during compilation" FALSE)
option(static "Statically linked Clio" FALSE)
option(snapshot "Build snapshot tool" FALSE)
option(time_trace "Build using -ftime-trace to create compiler trace reports" FALSE)
# ========================================================================== #
set(san "" CACHE STRING "Add sanitizer instrumentation")
set(CMAKE_EXPORT_COMPILE_COMMANDS TRUE)
@@ -68,21 +65,15 @@ endif ()
# Enable selected sanitizer if enabled via `san`
if (san)
set(SUPPORTED_SANITIZERS "address" "thread" "memory" "undefined")
list(FIND SUPPORTED_SANITIZERS "${san}" INDEX)
if (INDEX EQUAL -1)
message(FATAL_ERROR "Error: Unsupported sanitizer '${san}'. Supported values are: ${SUPPORTED_SANITIZERS}.")
endif ()
target_compile_options(
clio_options INTERFACE # Sanitizers recommend minimum of -O1 for reasonable performance
$<$<CONFIG:Debug>:-O1> ${SAN_FLAG} -fno-omit-frame-pointer
clio PUBLIC # Sanitizers recommend minimum of -O1 for reasonable performance
$<$<CONFIG:Debug>:-O1> ${SAN_FLAG} -fno-omit-frame-pointer
)
target_compile_definitions(
clio_options INTERFACE $<$<STREQUAL:${san},address>:SANITIZER=ASAN> $<$<STREQUAL:${san},thread>:SANITIZER=TSAN>
$<$<STREQUAL:${san},memory>:SANITIZER=MSAN> $<$<STREQUAL:${san},undefined>:SANITIZER=UBSAN>
clio PUBLIC $<$<STREQUAL:${san},address>:SANITIZER=ASAN> $<$<STREQUAL:${san},thread>:SANITIZER=TSAN>
$<$<STREQUAL:${san},memory>:SANITIZER=MSAN> $<$<STREQUAL:${san},undefined>:SANITIZER=UBSAN>
)
target_link_libraries(clio_options INTERFACE ${SAN_FLAG} ${SAN_LIB})
target_link_libraries(clio INTERFACE ${SAN_FLAG} ${SAN_LIB})
endif ()
# Generate `docs` target for doxygen documentation if enabled Note: use `make docs` to generate the documentation
@@ -94,7 +85,3 @@ include(install/install)
if (packaging)
include(cmake/packaging.cmake) # This file exists only in build runner
endif ()
if (snapshot)
add_subdirectory(tools/snapshot)
endif ()
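For reference, a minimal sketch of how the `san` cache variable checked above could be passed when configuring a local build. The conan/cmake invocation below follows the usual Clio build steps and is only an assumption; adjust paths and options to your setup.

```bash
# Hedged sketch: enable sanitizer instrumentation via the `san` cache variable.
# Supported values per the CMake check above: address, thread, memory, undefined.
mkdir -p build && cd build
conan install .. --output-folder . --build missing --settings build_type=Release
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
      -DCMAKE_BUILD_TYPE=Release -Dsan=address ..
cmake --build . --parallel
```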

View File

@@ -1,47 +1,39 @@
# Contributing
Thank you for your interest in contributing to the `clio` project 🙏
## Workflow
To contribute, please:
1. Fork the repository under your own user.
2. Create a new branch on which to commit/push your changes.
3. Write and test your code.
4. Ensure that your code compiles with the provided build engine and update the provided build engine as part of your PR where needed and where appropriate.
5. Where applicable, write test cases for your code and include those in the relevant subfolder under `tests`.
6. Ensure your code passes [automated checks](#pre-commit-hooks)
6. Ensure your code passes automated checks (e.g. clang-format)
7. Squash your commits (i.e. rebase) into as few commits as is reasonable to describe your changes at a high level (typically a single commit for a small change). See below for more details.
8. Open a PR to the main repository onto the _develop_ branch, and follow the provided template.
> **Note:** Please read the [Style guide](#style-guide).
### `pre-commit` hooks
## Install git hooks
Please run the following command in order to use git hooks that are helpful for `clio` development.
To ensure code quality and style, we use [`pre-commit`](https://pre-commit.com/).
Run the following command to enable `pre-commit` hooks that help with Clio development:
```bash
pip3 install pre-commit
pre-commit install
``` bash
git config --local core.hooksPath .githooks
```
`pre-commit` takes care of running each tool in [`.pre-commit-config.yaml`](https://github.com/XRPLF/clio/blob/develop/.pre-commit-config.yaml) in a separate environment.
## Git hooks dependencies
The pre-commit hook requires `clang-format >= 19.0.0` and `cmake-format` to be installed on your machine.
`clang-format` can be installed using `brew` on macOS and the default package manager on Linux.
`cmake-format` can be installed using `pip`.
The hook will also attempt to automatically use `doxygen` to verify that everything public in the codebase is covered by doc comments. If `doxygen` is not installed, the hook will raise a warning suggesting to install `doxygen` for future commits.
`pre-commit` also attempts to automatically use Doxygen to verify that everything public in the codebase has doc comments.
If Doxygen is not installed, the hook issues a warning and recommends installing Doxygen for future commits.
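A short sketch of the typical `pre-commit` workflow described above; the file path passed to `--files` is only an illustration.

```bash
pip3 install pre-commit                              # install the tool
pre-commit install                                   # register the git hooks in this clone
pre-commit run --all-files                           # run every hook from .pre-commit-config.yaml once
pre-commit run clang-format --files src/main.cpp     # run a single hook on a hypothetical file
```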
### Git commands
This section offers a detailed look at the git commands you will need to use to get your PR submitted.
## Git commands
This section offers a detailed look at the git commands you will need to use to get your PR submitted.
Please note that there is more than one way to do this; these commands are provided for your convenience.
At this point it's assumed that you have already finished working on your feature/bug.
> **Important:** Before you issue any of the commands below, please hit the `Sync fork` button and make sure your fork's `develop` branch is up-to-date with the main `clio` repository.
```bash
``` bash
# Create a backup of your branch
git branch <your feature branch>_bk
@@ -51,20 +43,18 @@ git pull origin develop
git checkout <your feature branch>
git rebase -i develop
```
For each commit in the list other than the first one, enter `s` to squash.
After this is done, you will have the opportunity to write a message for the squashed commit.
> **Hint:** Please use **imperative mood** in the commit message, and capitalize the first word.
```bash
``` bash
# You should now have a single commit on top of a commit in `develop`
git log
```
> **Note:** If there are merge conflicts, please resolve them now.
```bash
``` bash
# Use the same commit message as you did above
git commit -m 'Your message'
git rebase --continue
@@ -72,30 +62,27 @@ git rebase --continue
> **Important:** If you have no GPG keys set up, please follow [this tutorial](https://docs.github.com/en/authentication/managing-commit-signature-verification/adding-a-gpg-key-to-your-github-account)
```bash
``` bash
# Sign the commit with your GPG key, and push your changes
git commit --amend -S
git push --force
```
### Use ccache (optional)
## Use ccache (optional)
Clio uses `ccache` to speed up compilation. If you want to use it, please make sure it is installed on your machine.
CMake will automatically detect it and use it if it is available.
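As a small illustration (assuming `brew` on macOS or your distro's package manager on Linux), one might install `ccache` and confirm it is being used by checking its statistics after a build:

```bash
brew install ccache        # or, e.g., apt-get install ccache on Debian/Ubuntu
ccache --show-stats        # hit/miss statistics should grow after rebuilding Clio
```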
### Opening a pull request
## Opening a pull request
When a pull request is open CI will perform checks on the new code.
Title of the pull request and squashed commit should follow [conventional commits specification](https://www.conventionalcommits.org/en/v1.0.0/).
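For illustration only, hypothetical titles that satisfy the Conventional Commits check might look like:

```bash
# Hypothetical examples: a type (feat, fix, ci, ...) followed by a short imperative summary.
git commit -m "fix: Handle empty forwarding response"
git commit -m "feat: Add snapshot export tool"
```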
### Fixing issues found during code review
## Fixing issues found during code review
While your code is in review, it's possible that some changes will be requested by reviewer(s).
This section describes the process of adding your fixes.
We assume that you already made the required changes on your feature branch.
```bash
``` bash
# Add the changed code
git add <paths to add>
@@ -107,72 +94,62 @@ git commit -S -m "[FOLD] Your commit message"
git push
```
### After code review
## After code review
When your PR is approved and ready to merge, use `Squash and merge`.
The button for that is near the bottom of the PR's page on GitHub.
> **Important:** Please leave the automatically-generated mention/link to the PR in the subject line **and** in the description field add `"Fix #ISSUE_ID"` (replacing `ISSUE_ID` with yours) if the PR fixes an issue.
> **Note:** See [issues](https://github.com/XRPLF/clio/issues) to find the `ISSUE_ID` for the feature/bug you were working on.
## Style guide
# Style guide
This is a non-exhaustive list of recommended style guidelines. These are not always strictly enforced and serve as a way to keep the codebase coherent.
### Formatting
Code must conform to `clang-format`, unless the result is unreasonably difficult to read or maintain.
In most cases the `pre-commit` hook takes care of formatting and fixes any issues automatically.
To manually format your code, run `pre-commit run clang-format --files <your changed files>` for C++ files, and `pre-commit run cmake-format --files <your changed files>` for CMake files.
### Documentation
## Formatting
Code must conform to `clang-format` version 19, unless the result would be unreasonably difficult to read or maintain.
In most cases the pre-commit hook will take care of formatting and will fix any issues automatically.
To manually format your code, use `clang-format -i <your changed files>` for C++ files and `cmake-format -i <your changed files>` for CMake files.
## Documentation
All public namespaces, classes and functions must be covered by doc (`doxygen`) comments. Everything that is not within a nested `impl` namespace is considered public.
> **Note:** Keep in mind that this is enforced by Clio's CI and your build will fail if newly added public code lacks documentation.
### Avoid
## Avoid
* Proliferation of nearly identical code.
* Proliferation of new files and classes unless it improves readability or/and compilation time.
* Unmanaged memory allocation and raw pointers.
* Macros (unless they add significant value.)
* Lambda patterns (unless these add significant value.)
* CPU or architecture-specific code unless there is a good reason to include it, and where it is used guard it with macros and provide explanatory comments.
* Importing new libraries unless there is a very good reason to do so.
- Proliferation of nearly identical code.
- Proliferation of new files and classes unless it improves readability or/and compilation time.
- Unmanaged memory allocation and raw pointers.
- Macros (unless they add significant value.)
- Lambda patterns (unless these add significant value.)
- CPU or architecture-specific code unless there is a good reason to include it, and where it is used guard it with macros and provide explanatory comments.
- Importing new libraries unless there is a very good reason to do so.
### Seek to
- Extend functionality of existing code rather than creating new code.
- Prefer readability over terseness where important logic is concerned.
- Inline functions that are not used or are not likely to be used elsewhere in the codebase.
- Use clear and self-explanatory names for functions, variables, structs and classes.
- Use TitleCase for classes, structs and filenames, camelCase for function and variable names, lower case for namespaces and folders.
- Provide as many comments as you feel that a competent programmer would need to understand what your code does.
## Maintainers
## Seek to
* Extend functionality of existing code rather than creating new code.
* Prefer readability over terseness where important logic is concerned.
* Inline functions that are not used or are not likely to be used elsewhere in the codebase.
* Use clear and self-explanatory names for functions, variables, structs and classes.
* Use TitleCase for classes, structs and filenames, camelCase for function and variable names, lower case for namespaces and folders.
* Provide as many comments as you feel that a competent programmer would need to understand what your code does.
# Maintainers
Maintainers are ecosystem participants with elevated access to the repository. They are able to push new code, make decisions on when a release should be made, etc.
### Code Review
## Code Review
A PR must be reviewed and approved by at least one of the maintainers before it can be merged.
### Adding and Removing
## Adding and Removing
New maintainers can be proposed by two existing maintainers, subject to a vote by a quorum of the existing maintainers. A minimum of 50% support and a 50% participation is required. In the event of a tie vote, the addition of the new maintainer will be rejected.
Existing maintainers can resign, or be subject to a vote for removal at the behest of two existing maintainers. A minimum of 60% agreement and 50% participation are required. The XRP Ledger Foundation will have the ability, for cause, to remove an existing maintainer without a vote.
### Existing Maintainers
## Existing Maintainers
- [godexsoft](https://github.com/godexsoft) (Ripple)
- [kuznetsss](https://github.com/kuznetsss) (Ripple)
- [legleux](https://github.com/legleux) (Ripple)
- [PeterChen13579](https://github.com/PeterChen13579) (Ripple)
* [cindyyan317](https://github.com/cindyyan317) (Ripple)
* [godexsoft](https://github.com/godexsoft) (Ripple)
* [kuznetsss](https://github.com/kuznetsss) (Ripple)
* [legleux](https://github.com/legleux) (Ripple)
### Honorable ex-Maintainers
## Honorable ex-Maintainers
- [cindyyan317](https://github.com/cindyyan317) (ex-Ripple)
- [cjcobb23](https://github.com/cjcobb23) (ex-Ripple)
- [natenichols](https://github.com/natenichols) (ex-Ripple)
* [cjcobb23](https://github.com/cjcobb23) (ex-Ripple)
* [natenichols](https://github.com/natenichols) (ex-Ripple)

View File

@@ -1,7 +1,8 @@
ISC License
Copyright (c) 2022, the clio developers
Copyright (c) 2022, the clio developers
Permission to use, copy, modify, and distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

View File

@@ -1,4 +1,4 @@
# <img src='./docs/img/xrpl-logo.svg' width='40' valign="top" /> Clio <!-- markdownlint-disable-line MD033 MD045 -->
# <img src='./docs/img/xrpl-logo.svg' width='40' valign="top" /> Clio
[![Build status](https://github.com/XRPLF/clio/actions/workflows/build.yml/badge.svg?branch=develop)](https://github.com/XRPLF/clio/actions/workflows/build.yml?query=branch%3Adevelop)
[![Nightly release status](https://github.com/XRPLF/clio/actions/workflows/nightly.yml/badge.svg?branch=develop)](https://github.com/XRPLF/clio/actions/workflows/nightly.yml?query=branch%3Adevelop)
@@ -16,9 +16,9 @@ Multiple Clio nodes can share access to the same dataset, which allows for a hig
Clio offers the full `rippled` API, with the caveat that Clio by default only returns validated data. This means that `ledger_index` defaults to `validated` instead of `current` for all requests. Other non-validated data, such as information about queued transactions, is also not returned.
Clio retrieves data from a designated group of `rippled` nodes instead of connecting to the peer-to-peer network.
For requests that require access to the peer-to-peer network, such as `fee` or `submit`, Clio automatically forwards the request to a `rippled` node and propagates the response back to the client. To access non-validated data for _any_ request, simply add `ledger_index: "current"` to the request, and Clio will forward the request to `rippled`.
For requests that require access to the peer-to-peer network, such as `fee` or `submit`, Clio automatically forwards the request to a `rippled` node and propagates the response back to the client. To access non-validated data for *any* request, simply add `ledger_index: "current"` to the request, and Clio will forward the request to `rippled`.
> [!NOTE]
> [!NOTE]
> Clio requires access to at least one `rippled` node, which can run on the same machine as Clio or separately.
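As a hedged illustration (assuming Clio listens on `localhost:51233` and accepts rippled-style JSON-RPC over HTTP), a request can be forced to the peer-to-peer network by pinning `ledger_index` to `"current"`:

```bash
# Hypothetical request; replace <your account> with a real account ID.
curl -s http://localhost:51233 \
  -d '{"method": "account_info", "params": [{"account": "<your account>", "ledger_index": "current"}]}'
```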
## 📚 Learn more about Clio
@@ -28,6 +28,7 @@ Below are some useful docs to learn more about Clio.
**For Developers**:
- [How to build Clio](./docs/build-clio.md)
- [Metrics and static analysis](./docs/metrics-and-static-analysis.md)
- [Coverage report](./docs/coverage-report.md)
**For Operators**:

View File

@@ -1,22 +0,0 @@
[default]
# This allows ignoring account ids in tests and private keys
# More info: https://github.com/crate-ci/typos/issues/415
extend-ignore-re = [
"[a-z-A-Z0-9]{33}",
"[a-z-A-Z0-9]{34}",
"[a-z-A-Z0-9]{64}",
]
[default.extend-identifiers]
# (S)tring
tring = "tring"
trings = "trings"
ASSERTs = "ASSERTs"
EXCLUDEs = "EXCLUDEs"
ser = "ser"
[default.extend-words]
strat = "strat"
datas = "datas"

View File

@@ -188,10 +188,10 @@ public:
static auto
generateData()
{
constexpr auto kTOTAL = 10'000;
constexpr auto TOTAL = 10'000;
std::vector<uint64_t> data;
data.reserve(kTOTAL);
for (auto i = 0; i < kTOTAL; ++i)
data.reserve(TOTAL);
for (auto i = 0; i < TOTAL; ++i)
data.push_back(util::Random::uniform(1, 100'000'000));
return data;
@@ -208,7 +208,7 @@ benchmarkThreads(benchmark::State& state)
}
template <typename CtxType>
static void
void
benchmarkExecutionContextBatched(benchmark::State& state)
{
auto data = generateData();
@@ -219,7 +219,7 @@ benchmarkExecutionContextBatched(benchmark::State& state)
}
template <typename CtxType>
static void
void
benchmarkAnyExecutionContextBatched(benchmark::State& state)
{
auto data = generateData();

View File

@@ -1,92 +0,0 @@
# git-cliff ~ default configuration file
# https://git-cliff.org/docs/configuration
#
# Lines starting with "#" are comments.
# Configuration options are organized into tables and keys.
# See documentation for more information on available options.
[changelog]
# template for the changelog header
header = """
# Changelog\n
All notable changes to this project will be documented in this file.\n
"""
# template for the changelog body
# https://keats.github.io/tera/docs/#introduction
body = """
{% if version %}\
## [{{ version | trim_start_matches(pat="v") }}] - {{ timestamp | date(format="%Y-%m-%d") }}
{% else %}\
## [unreleased]
{% endif %}\
{% for group, commits in commits | filter(attribute="merge_commit", value=false) | group_by(attribute="group") %}
### {{ group | striptags | trim | upper_first }}
{% for commit in commits %}
- {% if commit.scope %}*({{ commit.scope }})* {% endif %}\
{% if commit.breaking %}[**breaking**] {% endif %}\
{{ commit.message | upper_first }} {% if commit.remote.username %}by @{{ commit.remote.username }}{% endif %}\
{% endfor %}
{% endfor %}\n
"""
# template for the changelog footer
footer = """
<!-- generated by git-cliff -->
"""
# remove the leading and trailing whitespace from the templates
trim = true
# postprocessors
postprocessors = [
# { pattern = '<REPO>', replace = "https://github.com/orhun/git-cliff" }, # replace repository URL
]
# render body even when there are no releases to process
# render_always = true
# output file path
output = "CHANGELOG.md"
[git]
# parse the commits based on https://www.conventionalcommits.org
conventional_commits = true
# filter out the commits that are not conventional
filter_unconventional = true
# process each line of a commit as an individual commit
split_commits = false
# regex for preprocessing the commit messages
commit_preprocessors = [
# Replace issue numbers
#{ pattern = '\((\w+\s)?#([0-9]+)\)', replace = "([#${2}](<REPO>/issues/${2}))"},
# Check spelling of the commit with https://github.com/crate-ci/typos
# If the spelling is incorrect, it will be automatically fixed.
#{ pattern = '.*', replace_command = 'typos --write-changes -' },
]
# regex for parsing and grouping commits
commit_parsers = [
{ message = "^feat", group = "<!-- 0 -->🚀 Features" },
{ message = "^fix", group = "<!-- 1 -->🐛 Bug Fixes" },
{ message = "^doc", group = "<!-- 3 -->📚 Documentation" },
{ message = "^perf", group = "<!-- 4 -->⚡ Performance" },
{ message = "^refactor", group = "<!-- 2 -->🚜 Refactor" },
{ message = "^style.*[Cc]lang-tidy auto fixes", skip = true },
{ message = "^style", group = "<!-- 5 -->🎨 Styling" },
{ message = "^test", group = "<!-- 6 -->🧪 Testing" },
{ message = "^chore\\(release\\): prepare for", skip = true },
{ message = "^chore: Commits", skip = true },
{ message = "^chore\\(deps.*\\)", skip = true },
{ message = "^chore\\(pr\\)", skip = true },
{ message = "^chore\\(pull\\)", skip = true },
{ message = "^chore|^ci", group = "<!-- 7 -->⚙️ Miscellaneous Tasks" },
{ body = ".*security", group = "<!-- 8 -->🛡️ Security" },
{ message = "^revert", group = "<!-- 9 -->◀️ Revert" },
{ message = ".*", group = "<!-- 10 -->💼 Other" },
]
# filter out the commits that are not matched by commit parsers
filter_commits = false
# sort the tags topologically
topo_order = false
# sort the commits inside sections by oldest/newest order
sort_commits = "oldest"
ignore_tags = "^.*-[b|rc].*"
[remote.github]
owner = "XRPLF"
repo = "clio"

View File

@@ -23,19 +23,19 @@
namespace util::build {
static constexpr char versionString[] = "@CLIO_VERSION@"; // NOLINT(readability-identifier-naming)
static constexpr char versionString[] = "@CLIO_VERSION@";
std::string const&
getClioVersionString()
{
static std::string const value = versionString; // NOLINT(readability-identifier-naming)
static std::string const value = versionString;
return value;
}
std::string const&
getClioFullVersionString()
{
static std::string const value = "clio-" + getClioVersionString(); // NOLINT(readability-identifier-naming)
static std::string const value = "clio-" + getClioVersionString();
return value;
}

View File

@@ -39,43 +39,7 @@ if (is_appleclang)
list(APPEND COMPILER_FLAGS -Wreorder-init-list)
endif ()
if (san)
# When building with sanitizers some compilers will actually produce extra warnings/errors. We don't want this yet, at
# least not until we have fixed all runtime issues reported by the sanitizers. Once that is done we can start removing
# some of these and trying to fix it in our codebase. We can never remove all of below because most of them are
# reported from deep inside libraries like boost or libxrpl.
#
# TODO: Address in https://github.com/XRPLF/clio/issues/1885
list(
APPEND
COMPILER_FLAGS
-Wno-error=tsan # Disables treating TSAN warnings as errors
-Wno-tsan # Disables TSAN warnings (thread-safety analysis)
-Wno-uninitialized # Disables warnings about uninitialized variables (AddressSanitizer, UndefinedBehaviorSanitizer,
# etc.)
-Wno-stringop-overflow # Disables warnings about potential string operation overflows (AddressSanitizer)
-Wno-unsafe-buffer-usage # Disables warnings about unsafe memory operations (AddressSanitizer)
-Wno-frame-larger-than # Disables warnings about stack frame size being too large (AddressSanitizer)
-Wno-unused-function # Disables warnings about unused functions (LeakSanitizer, memory-related issues)
-Wno-unused-but-set-variable # Disables warnings about unused variables (MemorySanitizer)
-Wno-thread-safety-analysis # Disables warnings related to thread safety usage (ThreadSanitizer)
-Wno-thread-safety # Disables warnings related to thread safety usage (ThreadSanitizer)
-Wno-sign-compare # Disables warnings about signed/unsigned comparison (UndefinedBehaviorSanitizer)
-Wno-nonnull # Disables warnings related to null pointer dereferencing (UndefinedBehaviorSanitizer)
-Wno-address # Disables warnings about address-related issues (UndefinedBehaviorSanitizer)
-Wno-array-bounds # Disables array bounds checks (UndefinedBehaviorSanitizer)
)
endif ()
# See https://github.com/cpp-best-practices/cppbestpractices/blob/master/02-Use_the_Tools_Available.md#gcc--clang for
# the flags description
if (time_trace)
if (is_clang OR is_appleclang)
list(APPEND COMPILER_FLAGS -ftime-trace)
else ()
message(FATAL_ERROR "Clang or AppleClang is required to use `-ftime-trace`")
endif ()
endif ()
target_compile_options(clio_options INTERFACE ${COMPILER_FLAGS})

View File

@@ -1,11 +1,3 @@
if ("${san}" STREQUAL "")
target_compile_definitions(clio_options INTERFACE BOOST_STACKTRACE_LINK)
target_compile_definitions(clio_options INTERFACE BOOST_STACKTRACE_USE_BACKTRACE)
find_package(libbacktrace REQUIRED CONFIG)
else ()
# Some sanitizers (TSAN and ASAN for sure) can't be used with libbacktrace because they have their own backtracing
# capabilities and there are conflicts. In any case, this makes sure Clio code knows that backtrace is not available.
# See relevant conan profiles for sanitizers where we disable stacktrace in Boost explicitly.
target_compile_definitions(clio_options INTERFACE CLIO_WITHOUT_STACKTRACE)
message(STATUS "Sanitizer enabled, disabling stacktrace")
endif ()
target_compile_definitions(clio_options INTERFACE BOOST_STACKTRACE_LINK)
target_compile_definitions(clio_options INTERFACE BOOST_STACKTRACE_USE_BACKTRACE)
find_package(libbacktrace REQUIRED CONFIG)

View File

@@ -1,7 +1,6 @@
from conan import ConanFile
from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout
class Clio(ConanFile):
name = 'clio'
license = 'ISC'
@@ -20,18 +19,16 @@ class Clio(ConanFile):
'packaging': [True, False], # create distribution packages
'coverage': [True, False], # build for test coverage report; create custom target `clio_tests-ccov`
'lint': [True, False], # run clang-tidy checks during compilation
'snapshot': [True, False], # build export/import snapshot tool
'time_trace': [True, False] # build using -ftime-trace to create compiler trace reports
}
requires = [
'boost/1.83.0',
'boost/1.82.0',
'cassandra-cpp-driver/2.17.0',
'fmt/10.1.1',
'protobuf/3.21.9',
'grpc/1.50.1',
'openssl/1.1.1v',
'xrpl/2.4.0',
'openssl/1.1.1u',
'xrpl/2.3.0-rc2',
'zlib/1.3.1',
'libbacktrace/cci.20210118'
]
@@ -47,9 +44,7 @@ class Clio(ConanFile):
'coverage': False,
'lint': False,
'docs': False,
'snapshot': False,
'time_trace': False,
'xrpl/*:tests': False,
'xrpl/*:rocksdb': False,
'cassandra-cpp-driver/*:shared': False,
@@ -81,12 +76,11 @@ class Clio(ConanFile):
def layout(self):
cmake_layout(self)
# Fix this setting to follow the default introduced in Conan 1.48
# Fix this setting to follow the default introduced in Conan 1.48
# to align with our build instructions.
self.folders.generators = 'build/generators'
generators = 'CMakeDeps'
def generate(self):
tc = CMakeToolchain(self)
tc.variables['verbose'] = self.options.verbose
@@ -98,8 +92,6 @@ class Clio(ConanFile):
tc.variables['docs'] = self.options.docs
tc.variables['packaging'] = self.options.packaging
tc.variables['benchmark'] = self.options.benchmark
tc.variables['snapshot'] = self.options.snapshot
tc.variables['time_trace'] = self.options.time_trace
tc.generate()
def build(self):

View File

@@ -4,14 +4,12 @@ This image contains an environment to build [Clio](https://github.com/XRPLF/clio
It is used in [Clio Github Actions](https://github.com/XRPLF/clio/actions) but can also be used to compile Clio locally.
The image is based on Ubuntu 20.04 and contains:
- clang 16.0.6
- clang 16
- gcc 12.3
- doxygen 1.12
- doxygen 1.10
- gh 2.40
- ccache 4.10.2
- conan 1.62
- ccache 4.8.3
- conan
- and some other useful tools
Conan is set up to build Clio without any additional steps. There are two preset conan profiles: `clang` and `gcc` to use the corresponding compiler. By default conan is set up to use `gcc`.
Sanitizer builds for `ASAN`, `TSAN` and `UBSAN` are enabled via conan profiles for each of the supported compilers. These can be selected using the following pattern (all lowercase): `[compiler].[sanitizer]` (e.g. `--profile gcc.tsan`).
Conan is set up to build Clio without any additional steps. There are two preset conan profiles: `clang` and `gcc` to use the corresponding compiler.
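For example (a sketch, assuming the standard Clio conan invocation), a sanitizer profile can be selected like any other profile:

```bash
# Select the GCC + ThreadSanitizer profile shipped with the CI image (hedged example).
conan install .. --output-folder . --build missing --settings build_type=Release --profile gcc.tsan
```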

View File

@@ -1,9 +0,0 @@
include(clang)
[options]
boost:extra_b2_flags="cxxflags=\"-fsanitize=address\" linkflags=\"-fsanitize=address\""
boost:without_stacktrace=True
[env]
CFLAGS="-fsanitize=address"
CXXFLAGS="-fsanitize=address"
LDFLAGS="-fsanitize=address"

View File

@@ -1,9 +0,0 @@
include(clang)
[options]
boost:extra_b2_flags="cxxflags=\"-fsanitize=thread\" linkflags=\"-fsanitize=thread\""
boost:without_stacktrace=True
[env]
CFLAGS="-fsanitize=thread"
CXXFLAGS="-fsanitize=thread"
LDFLAGS="-fsanitize=thread"

View File

@@ -1,9 +0,0 @@
include(clang)
[options]
boost:extra_b2_flags="cxxflags=\"-fsanitize=undefined\" linkflags=\"-fsanitize=undefined\""
boost:without_stacktrace=True
[env]
CFLAGS="-fsanitize=undefined"
CXXFLAGS="-fsanitize=undefined"
LDFLAGS="-fsanitize=undefined"

View File

@@ -1,9 +0,0 @@
include(gcc)
[options]
boost:extra_b2_flags="cxxflags=\"-fsanitize=address\" linkflags=\"-fsanitize=address\""
boost:without_stacktrace=True
[env]
CFLAGS="-fsanitize=address"
CXXFLAGS="-fsanitize=address"
LDFLAGS="-fsanitize=address"

View File

@@ -1,9 +0,0 @@
include(gcc)
[options]
boost:extra_b2_flags="cxxflags=\"-fsanitize=thread\" linkflags=\"-fsanitize=thread\""
boost:without_stacktrace=True
[env]
CFLAGS="-fsanitize=thread"
CXXFLAGS="-fsanitize=thread"
LDFLAGS="-fsanitize=thread"

View File

@@ -1,9 +0,0 @@
include(gcc)
[options]
boost:extra_b2_flags="cxxflags=\"-fsanitize=undefined\" linkflags=\"-fsanitize=undefined\""
boost:without_stacktrace=True
[env]
CFLAGS="-fsanitize=undefined"
CXXFLAGS="-fsanitize=undefined"
LDFLAGS="-fsanitize=undefined"

View File

@@ -9,9 +9,8 @@ WORKDIR /root
ENV CCACHE_VERSION=4.10.2 \
LLVM_TOOLS_VERSION=19 \
GH_VERSION=2.40.0 \
DOXYGEN_VERSION=1.12.0 \
CLANG_BUILD_ANALYZER_VERSION=1.6.0
DOXYGEN_VERSION=1.12.0
# Add repositories
RUN apt-get -qq update \
&& apt-get -qq install -y --no-install-recommends --no-install-suggests gnupg wget curl software-properties-common \
@@ -21,8 +20,9 @@ RUN apt-get -qq update \
# Install packages
RUN apt update -qq \
&& apt install -y --no-install-recommends --no-install-suggests python3 python3-pip git git-lfs make ninja-build flex bison jq graphviz \
clang-tidy-${LLVM_TOOLS_VERSION} clang-tools-${LLVM_TOOLS_VERSION} \
&& pip3 install -q --upgrade --no-cache-dir pip && pip3 install -q --no-cache-dir conan==1.62 gcovr cmake==3.31.6 pre-commit \
clang-format-${LLVM_TOOLS_VERSION} clang-tidy-${LLVM_TOOLS_VERSION} clang-tools-${LLVM_TOOLS_VERSION} \
&& update-alternatives --install /usr/bin/clang-format clang-format /usr/bin/clang-format-${LLVM_TOOLS_VERSION} 100 \
&& pip3 install -q --upgrade --no-cache-dir pip && pip3 install -q --no-cache-dir conan==1.62 gcovr cmake cmake-format \
&& apt-get clean && apt remove -y software-properties-common
# Install gcc-12 and make ldconfig aware of the new libstdc++ location (for gcc)
@@ -62,14 +62,8 @@ RUN wget "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN
&& cmake --build . --target install \
&& rm -rf /tmp/* /var/tmp/*
# Install ClangBuildAnalyzer
RUN wget "https://github.com/aras-p/ClangBuildAnalyzer/releases/download/v${CLANG_BUILD_ANALYZER_VERSION}/ClangBuildAnalyzer-linux" \
&& chmod +x ClangBuildAnalyzer-linux \
&& mv ClangBuildAnalyzer-linux /usr/bin/ClangBuildAnalyzer \
&& rm -rf /tmp/* /var/tmp/*
# Install gh
RUN wget "https://github.com/cli/cli/releases/download/v${GH_VERSION}/gh_${GH_VERSION}_linux_${TARGETARCH}.tar.gz" \
RUN wget https://github.com/cli/cli/releases/download/v${GH_VERSION}/gh_${GH_VERSION}_linux_${TARGETARCH}.tar.gz \
&& tar xf gh_${GH_VERSION}_linux_${TARGETARCH}.tar.gz \
&& mv gh_${GH_VERSION}_linux_${TARGETARCH}/bin/gh /usr/bin/gh \
&& rm -rf /tmp/* /var/tmp/*
@@ -78,7 +72,7 @@ WORKDIR /root
# Using root by default is not very secure but github checkout action doesn't work with any other user
# https://github.com/actions/checkout/issues/956
# And Github Actions doc recommends using root
# https://docs.github.com/en/actions/sharing-automations/creating-actions/dockerfile-support-for-github-actions#user
# https://docs.github.com/en/actions/creating-actions/dockerfile-support-for-github-actions#user
# Setup conan
RUN conan remote add --insert 0 conan-non-prod http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
@@ -101,13 +95,6 @@ RUN conan profile new clang --detect \
&& conan profile update env.CC=/usr/bin/clang-16 clang \
&& conan profile update env.CXX=/usr/bin/clang++-16 clang \
&& conan profile update env.CXXFLAGS="-DBOOST_ASIO_DISABLE_CONCEPTS" clang \
&& conan profile update "conf.tools.build:compiler_executables={\"c\": \"/usr/bin/clang-16\", \"cpp\": \"/usr/bin/clang++-16\"}" clang
&& conan profile update "conf.tools.build:compiler_executables={\"c\": \"/usr/bin/clang-16\", \"cpp\": \"/usr/bin/clang++-16\"}" clang
RUN echo "include(gcc)" >> .conan/profiles/default
COPY conan/gcc.asan /root/.conan/profiles
COPY conan/gcc.tsan /root/.conan/profiles
COPY conan/gcc.ubsan /root/.conan/profiles
COPY conan/clang.asan /root/.conan/profiles
COPY conan/clang.tsan /root/.conan/profiles
COPY conan/clang.ubsan /root/.conan/profiles

View File

@@ -12,14 +12,12 @@ Your configuration file should be mounted under the path `/opt/clio/etc/config.j
The Clio repository provides an [example](https://github.com/XRPLF/clio/blob/develop/docs/examples/config/example-config.json) of the configuration file.
Config file recommendations:
- Set `log_to_console` to `false` if you want to avoid logs being written to `stdout`.
- Set `log_directory` to `/opt/clio/log` to store logs in a volume.
## Usage
The following command can be used to run Clio in docker (assuming server's port is `51233` in your config):
```bash
docker run -d -v <path to your config.json>:/opt/clio/etc/config.json -v <path to store logs>:/opt/clio/log -p 51233:51233 rippleci/clio
```

View File

@@ -1,4 +1,4 @@
FROM ubuntu:focal
FROM ubuntu:focal
ARG DEBIAN_FRONTEND=noninteractive
ARG TARGETARCH

View File

@@ -63,7 +63,7 @@ COPY --from=build /gcc12.deb /
# Make gcc-12 available but also leave gcc12.deb for others to copy if needed
RUN apt update && apt-get install -y binutils libc6-dev \
&& dpkg -i /gcc12.deb
&& dpkg -i /gcc12.deb
RUN update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-12 100 \
&& update-alternatives --install /usr/bin/c++ c++ /usr/bin/g++-12 100 \

View File

@@ -1,6 +1,6 @@
services:
clio_develop:
image: ghcr.io/xrplf/clio-ci:latest
image: rippleci/clio_ci:latest
volumes:
- clio_develop_conan_data:/root/.conan/data
- clio_develop_ccache:/root/.ccache

View File

@@ -59,3 +59,4 @@ case $1 in
esac
popd > /dev/null

View File

@@ -3,8 +3,6 @@ project(docs)
include(${CMAKE_CURRENT_SOURCE_DIR}/../cmake/ClioVersion.cmake)
# cmake-format: off
# Generate `docs` target for doxygen documentation
# Note: use `cmake --build . --target docs` from your `build` directory to generate the documentation
# cmake-format: on
include(${CMAKE_CURRENT_SOURCE_DIR}/../cmake/Docs.cmake)

View File

@@ -34,7 +34,7 @@ FULL_SIDEBAR = NO
HTML_HEADER = ${SOURCE}/docs/doxygen-awesome-theme/header.html
HTML_EXTRA_STYLESHEET = ${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome.css \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-sidebar-only.css \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-sidebar-only-darkmode-toggle.css
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-sidebar-only-darkmode-toggle.css
HTML_EXTRA_FILES = ${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-darkmode-toggle.js \
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-interactive-toc.js

View File

@@ -11,7 +11,7 @@ Clio is built with [CMake](https://cmake.org/) and uses [Conan](https://conan.io
- [**Optional**] [CCache](https://ccache.dev/): speeds up compilation if you are going to compile Clio often
| Compiler | Version |
| ----------- | ------- |
|-------------|---------|
| GCC | 12.3 |
| Clang | 16 |
| Apple Clang | 15 |
@@ -23,9 +23,9 @@ Clio does not require anything other than `compiler.cppstd=20` in your (`~/.cona
> [!NOTE]
> Although Clio is built using C++23, it's required to set `compiler.cppstd=20` for the time being as some of Clio's dependencies are not yet capable of building under C++23.
**Mac example**:
> Mac example:
```text
```
[settings]
os=Macos
os_build=Macos
@@ -40,9 +40,9 @@ compiler.cppstd=20
tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]
```
**Linux example**:
> Linux example:
```text
```
[settings]
os=Linux
os_build=Linux
@@ -93,8 +93,6 @@ If successful, `conan install` will find the required packages and `cmake` will
> [!TIP]
> To generate a Code Coverage report, include `-o coverage=True` in the `conan install` command above, along with `-o tests=True` to enable tests. After running the `cmake` commands, execute `make clio_tests-ccov`. The coverage report will be found at `clio_tests-llvm-cov/index.html`.
<!-- markdownlint-disable-line MD028 -->
> [!NOTE]
> If you've built Clio before and the build is now failing, it's likely due to updated dependencies. Try deleting the build folder and then rerunning the Conan and CMake commands mentioned above.
@@ -106,10 +104,10 @@ To generate the API docs:
1. First, include `-o docs=True` in the conan install command. For example:
```sh
mkdir build && cd build
conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True -o lint=False -o docs=True
```
```sh
mkdir build && cd build
conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True -o lint=False -o docs=True
```
2. Once that has completed successfully, run the `cmake` command and add the `--target docs` option:
@@ -120,16 +118,16 @@ To generate the API docs:
3. Go to `build/docs/html` to view the generated files.
Open the `index.html` file in your browser to see the documentation pages.
Open the `index.html` file in your browser to see the documentation pages.
![API index page](./img/doxygen-docs-output.png "API index page")
![API index page](./img/doxygen-docs-output.png "API index page")
## Building Clio with Docker
It is also possible to build Clio using [Docker](https://www.docker.com/) if you don't want to install all the dependencies on your machine.
```sh
docker run -it ghcr.io/xrplf/clio-ci:latest
docker run -it rippleci/clio_ci:latest
git clone https://github.com/XRPLF/clio
mkdir build && cd build
conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True -o lint=False
@@ -150,53 +148,36 @@ Sometimes, during development, you need to build against a custom version of `li
1. First, pull/clone the appropriate `rippled` fork and switch to the branch you want to build. For example, the following example uses an in-development build with [XLS-33d Multi-Purpose Tokens](https://github.com/XRPLF/XRPL-Standards/tree/master/XLS-0033d-multi-purpose-tokens):
```sh
git clone https://github.com/shawnxie999/rippled/
cd rippled
git switch mpt-1.1
```
```sh
git clone https://github.com/shawnxie999/rippled/
cd rippled
git switch mpt-1.1
```
2. Export a custom package to your local Conan store using a user/channel:
```sh
conan export . my/feature
```
```sh
conan export . my/feature
```
3. Patch your local Clio build to use the right package.
Edit `conanfile.py` (from the Clio repository root). Replace the `xrpl` requirement with the custom package version from the previous step. This must also include the current version number from your `rippled` branch. For example:
Edit `conanfile.py` (from the Clio repository root). Replace the `xrpl` requirement with the custom package version from the previous step. This must also include the current version number from your `rippled` branch. For example:
```py
# ... (excerpt from conanfile.py)
requires = [
'boost/1.82.0',
'cassandra-cpp-driver/2.17.0',
'fmt/10.1.1',
'protobuf/3.21.9',
'grpc/1.50.1',
'openssl/1.1.1u',
'xrpl/2.3.0-b1@my/feature', # Update this line
'libbacktrace/cci.20210118'
]
```
```py
# ... (excerpt from conanfile.py)
requires = [
'boost/1.82.0',
'cassandra-cpp-driver/2.17.0',
'fmt/10.1.1',
'protobuf/3.21.9',
'grpc/1.50.1',
'openssl/1.1.1u',
'xrpl/2.3.0-b1@my/feature', # Update this line
'libbacktrace/cci.20210118'
]
```
4. Build Clio as you would have before.
See [Building Clio](#building-clio) for details.
## Using `clang-tidy` for static analysis
The minimum [clang-tidy](https://clang.llvm.org/extra/clang-tidy/) version required is 19.0.
Clang-tidy can be run by Cmake when building the project. To achieve this, you just need to provide the option `-o lint=True` for the `conan install` command:
```sh
conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True -o lint=True
```
By default Cmake will try to find `clang-tidy` automatically in your system.
To force Cmake to use your desired binary, set the `CLIO_CLANG_TIDY_BIN` environment variable to the path of the `clang-tidy` binary. For example:
```sh
export CLIO_CLANG_TIDY_BIN=/opt/homebrew/opt/llvm@19/bin/clang-tidy
```
See [Building Clio](#building-clio) for details.

View File

@@ -1,592 +0,0 @@
# Clio Config Description
This document provides a list of all available Clio configuration properties in detail.
> [!NOTE]
> Dot notation in configuration key names represents nested fields. For example, **database.scylladb** refers to the _scylladb_ field inside the _database_ object. If a key name includes "[]", it indicates that the nested field is an array (e.g., etl_sources.[]).
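To make the notation concrete, here is a minimal, hypothetical `config.json` fragment; the values are placeholders, so see the individual property descriptions below for defaults and constraints.

```bash
# Hypothetical minimal fragment written to config.json purely to illustrate the key paths below.
cat > config.json <<'EOF'
{
  "database": {
    "type": "cassandra",
    "cassandra": { "contact_points": "127.0.0.1", "keyspace": "clio" }
  },
  "etl_sources": [
    { "ip": "127.0.0.1", "ws_port": "6006", "grpc_port": "50051" }
  ],
  "server": { "ip": "0.0.0.0", "port": 51233 }
}
EOF
```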
## Configuration Details
### database.type
- **Required**: True
- **Type**: string
- **Default value**: `cassandra`
- **Constraints**: The value must be one of the following: `cassandra`.
- **Description**: Specifies the type of database used for storing and retrieving data required by the Clio server. Both ScyllaDB and Cassandra can serve as backends for Clio; however, this value must be set to `cassandra`.
### database.cassandra.contact_points
- **Required**: True
- **Type**: string
- **Default value**: `localhost`
- **Constraints**: None
- **Description**: A list of IP addresses or hostnames for the initial cluster nodes (Cassandra or ScyllaDB) that the client connects to when establishing a database connection. If you're running Clio locally, set this value to `localhost` or `127.0.0.1`.
### database.cassandra.secure_connect_bundle
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: The configuration file that contains the necessary credentials and connection details for securely connecting to a Cassandra database cluster.
### database.cassandra.port
- **Required**: False
- **Type**: int
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: The port number used to connect to the Cassandra database.
### database.cassandra.keyspace
- **Required**: True
- **Type**: string
- **Default value**: `clio`
- **Constraints**: None
- **Description**: The Cassandra keyspace to use for the database. If you don't provide a value, this is set to `clio` by default.
### database.cassandra.replication_factor
- **Required**: True
- **Type**: int
- **Default value**: `3`
- **Constraints**: The minimum value is `0`. The maximum value is `65535`.
- **Description**: Represents the number of replicated nodes for ScyllaDB. For more details see [Fault Tolerance Replication Factor](https://university.scylladb.com/courses/scylla-essentials-overview/lessons/high-availability/topic/fault-tolerance-replication-factor/).
### database.cassandra.table_prefix
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: An optional field to specify a prefix for the Cassandra database table names.
### database.cassandra.max_write_requests_outstanding
- **Required**: True
- **Type**: int
- **Default value**: `10000`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: Represents the maximum number of outstanding write requests. Write requests are API calls that write to the database.
### database.cassandra.max_read_requests_outstanding
- **Required**: True
- **Type**: int
- **Default value**: `100000`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: Maximum number of outstanding read requests. Read requests are API calls that read from the database.
### database.cassandra.threads
- **Required**: True
- **Type**: int
- **Default value**: The number of available CPU cores.
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: Represents the number of threads that will be used for database operations.
### database.cassandra.core_connections_per_host
- **Required**: True
- **Type**: int
- **Default value**: `1`
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: The number of core connections per host for the Cassandra database.
### database.cassandra.queue_size_io
- **Required**: False
- **Type**: int
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: Defines the queue size of the input/output (I/O) operations in Cassandra.
### database.cassandra.write_batch_size
- **Required**: True
- **Type**: int
- **Default value**: `20`
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: Represents the batch size for write operations in Cassandra.
### database.cassandra.connect_timeout
- **Required**: False
- **Type**: int
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The maximum amount of time in seconds that the system waits for a database connection to be established.
### database.cassandra.request_timeout
- **Required**: False
- **Type**: int
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The maximum amount of time in seconds that the system waits for a request to be fetched from the database.
### database.cassandra.username
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: The username used for authenticating with the database.
### database.cassandra.password
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: The password used for authenticating with the database.
### database.cassandra.certfile
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: The path to the SSL/TLS certificate file used to establish a secure connection between the client and the Cassandra database.
### allow_no_etl
- **Required**: True
- **Type**: boolean
- **Default value**: `True`
- **Constraints**: None
- **Description**: If set to `True`, allows Clio to start without any ETL source.
### etl_sources.[].ip
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: The value must be a valid IP address.
- **Description**: The IP address of the ETL source.
### etl_sources.[].ws_port
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: The WebSocket port of the ETL source.
### etl_sources.[].grpc_port
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: The gRPC port of the ETL source.
### forwarding.cache_timeout
- **Required**: True
- **Type**: double
- **Default value**: `0`
- **Constraints**: The value must be a positive double number.
- **Description**: Specifies the timeout duration (in seconds) for the forwarding cache used in `rippled` communication. A value of `0` disables this feature.
### forwarding.request_timeout
- **Required**: True
- **Type**: double
- **Default value**: `10`
- **Constraints**: The value must be a positive double number.
- **Description**: Specifies the timeout duration (in seconds) for the forwarding request used in `rippled` communication.
### rpc.cache_timeout
- **Required**: True
- **Type**: double
- **Default value**: `0`
- **Constraints**: The value must be a positive double number.
- **Description**: Specifies the timeout duration (in seconds) after which a cached RPC response expires. A value of `0` disables this feature.
### num_markers
- **Required**: False
- **Type**: int
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `256`.
- **Description**: Specifies the number of coroutines used to download the initial ledger.
### dos_guard.whitelist.[]
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: The list of IP addresses to whitelist for DOS protection.
### dos_guard.max_fetches
- **Required**: True
- **Type**: int
- **Default value**: `1000000`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The maximum number of fetch operations allowed by DOS guard.
### dos_guard.max_connections
- **Required**: True
- **Type**: int
- **Default value**: `20`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The maximum number of concurrent connections for a specific IP address.
### dos_guard.max_requests
- **Required**: True
- **Type**: int
- **Default value**: `20`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The maximum number of requests allowed for a specific IP address.
### dos_guard.sweep_interval
- **Required**: True
- **Type**: double
- **Default value**: `1`
- **Constraints**: The value must be a positive double number.
- **Description**: The interval, in seconds, at which the DOS guard sweeps (clears) its state.
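Putting the DOS guard settings together, a configuration matching the defaults above, with one whitelisted IP added for illustration, could look like:
```json
{
  "dos_guard": {
    "whitelist": ["127.0.0.1"],
    "max_fetches": 1000000,
    "max_connections": 20,
    "max_requests": 20,
    "sweep_interval": 1
  }
}
```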
### workers
- **Required**: True
- **Type**: int
- **Default value**: The number of available CPU cores.
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The number of threads used to process RPC requests.
### server.ip
- **Required**: True
- **Type**: string
- **Default value**: None
- **Constraints**: The value must be a valid IP address.
- **Description**: The IP address of the Clio HTTP server.
### server.port
- **Required**: True
- **Type**: int
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: The port number of the Clio HTTP server.
### server.max_queue_size
- **Required**: True
- **Type**: int
- **Default value**: `1`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The maximum size of the server's request queue. A value of `0` means there is no queue size limit.
### server.local_admin
- **Required**: False
- **Type**: boolean
- **Default value**: None
- **Constraints**: None
- **Description**: Indicates if requests from `localhost` are allowed to call Clio admin-only APIs. Note that this setting cannot be enabled together with [server.admin_password](#serveradmin_password).
### server.admin_password
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: The password for Clio admin-only APIs. Note that this setting cannot be used together with [server.local_admin](#serverlocal_admin).
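Because `server.local_admin` and `server.admin_password` are mutually exclusive, only one of them should be enabled in the `server` section. A sketch using the password-based option (the password value is a placeholder):
```json
{
  "server": {
    "ip": "0.0.0.0",
    "port": 51233,
    "admin_password": "xrp",
    "local_admin": false
  }
}
```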
### server.processing_policy
- **Required**: True
- **Type**: string
- **Default value**: `parallel`
- **Constraints**: The value must be one of the following: `parallel`, `sequent`.
- **Description**: For the `sequent` policy, requests from a single client connection are processed one by one, with the next request read only after the previous one is processed. For the `parallel` policy, Clio will accept all requests and process them in parallel, sending a reply for each request as soon as it is ready.
### server.parallel_requests_limit
- **Required**: False
- **Type**: int
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: This is an optional parameter, used only if `server.processing_policy` is `parallel`. It limits the number of requests processed in parallel for a single client connection. If not specified, no limit is enforced.
### server.ws_max_sending_queue_size
- **Required**: True
- **Type**: int
- **Default value**: `1500`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: Maximum queue size for sending subscription data to clients. This queue buffers data when a client is slow to receive it, ensuring delivery once the client is ready.
### prometheus.enabled
- **Required**: True
- **Type**: boolean
- **Default value**: `False`
- **Constraints**: None
- **Description**: Enables or disables Prometheus metrics.
### prometheus.compress_reply
- **Required**: True
- **Type**: boolean
- **Default value**: `False`
- **Constraints**: None
- **Description**: Enables or disables compression of Prometheus responses.
### io_threads
- **Required**: True
- **Type**: int
- **Default value**: `2`
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: The number of input/output (I/O) threads. The value cannot be less than `1`.
### subscription_workers
- **Required**: True
- **Type**: int
- **Default value**: `1`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The number of worker threads or processes that are responsible for managing and processing subscription-based tasks from `rippled`.
### graceful_period
- **Required**: True
- **Type**: double
- **Default value**: `10`
- **Constraints**: The value must be a positive double number.
- **Description**: The number of milliseconds the server waits to shut down gracefully. If Clio has not shut down gracefully after this period, it is killed instead.
### cache.num_diffs
- **Required**: True
- **Type**: int
- **Default value**: `32`
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: The number of cursors generated equals the number of changed objects (not counting deleted ones) in the latest `cache.num_diffs` ledgers. The cursors partition the ledger so that workers can load the ledger cache concurrently from those positions. For more information, please read [README.md](../src/etl/README.md).
### cache.num_markers
- **Required**: True
- **Type**: int
- **Default value**: `48`
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: Specifies how many markers are placed randomly within the cache. These markers define the positions on the ledger that will be loaded concurrently by the workers. The higher the number, the more places within the cache we potentially cover.
### cache.num_cursors_from_diff
- **Required**: True
- **Type**: int
- **Default value**: `0`
- **Constraints**: The minimum value is `0`. The maximum value is `65535`.
- **Description**: Generates `cache.num_cursors_from_diff` cursors by looking at the objects changed in the most recent ledger. If the current ledger does not contain enough changed objects, earlier ledgers are read until `cache.num_cursors_from_diff` cursors have been collected. If set to `0`, the system defaults to generating cursors based on `cache.num_diffs`.
### cache.num_cursors_from_account
- **Required**: True
- **Type**: int
- **Default value**: `0`
- **Constraints**: The minimum value is `0`. The maximum value is `65535`.
- **Description**: Generates `cache.num_cursors_from_account` cursors by reading accounts from the `account_tx` table. If set to `0`, the system defaults to generating cursors based on `cache.num_diffs`.
### cache.page_fetch_size
- **Required**: True
- **Type**: int
- **Default value**: `512`
- **Constraints**: The minimum value is `1`. The maximum value is `65535`.
- **Description**: The number of ledger objects to fetch concurrently per marker.
### cache.load
- **Required**: True
- **Type**: string
- **Default value**: `async`
- **Constraints**: The value must be one of the following: `sync`, `async`, `none`.
- **Description**: The strategy used for cache loading.
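The cache settings above are typically configured together. The sketch below mirrors the defaults and relies on `num_diffs` for cursor generation; `num_cursors_from_diff` or `num_cursors_from_account` could be used instead, but only one strategy applies at a time:
```json
{
  "cache": {
    "num_diffs": 32,
    "num_markers": 48,
    "page_fetch_size": 512,
    "load": "async"
  }
}
```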
### log_channels.[].channel
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: The value must be one of the following: `General`, `WebServer`, `Backend`, `RPC`, `ETL`, `Subscriptions`, `Performance`, `Migration`.
- **Description**: The name of the log channel.
### log_channels.[].log_level
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: The value must be one of the following: `trace`, `debug`, `info`, `warning`, `error`, `fatal`, `count`.
- **Description**: The log level for the specific log channel.
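For example, per-channel log levels might be overridden like this (the channels and levels shown are illustrative; any channel left unspecified falls back to the global `log_level`):
```json
{
  "log_channels": [
    { "channel": "Backend", "log_level": "fatal" },
    { "channel": "RPC", "log_level": "error" }
  ]
}
```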
### log_level
- **Required**: True
- **Type**: string
- **Default value**: `info`
- **Constraints**: The value must be one of the following: `trace`, `debug`, `info`, `warning`, `error`, `fatal`, `count`.
- **Description**: The general logging level of Clio. This level is applied to all log channels that do not have an explicitly defined logging level.
### log_format
- **Required**: True
- **Type**: string
- **Default value**: `%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%`
- **Constraints**: None
- **Description**: The format string for log messages. The format is described here: <https://www.boost.org/doc/libs/1_83_0/libs/log/doc/html/log/tutorial/formatters.html>.
### log_to_console
- **Required**: True
- **Type**: boolean
- **Default value**: `True`
- **Constraints**: None
- **Description**: Enables or disables logging to the console.
### log_directory
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: The directory path for the log files.
### log_rotation_size
- **Required**: True
- **Type**: int
- **Default value**: `2048`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The log rotation size in megabytes. When the log file reaches this size, a new log file is started.
### log_directory_max_size
- **Required**: True
- **Type**: int
- **Default value**: `51200`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The maximum size of the log directory in megabytes.
### log_rotation_hour_interval
- **Required**: True
- **Type**: int
- **Default value**: `12`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: Represents the interval (in hours) for log rotation. When the current log file has been in use for this many hours, a new log file is started.
### log_tag_style
- **Required**: True
- **Type**: string
- **Default value**: `none`
- **Constraints**: The value must be one of the following: `int`, `uint`, `null`, `none`, `uuid`.
- **Description**: Log tags are unique identifiers for log messages. `uint`/`int` starts logging from 0 and increments, making it faster. In contrast, `uuid` generates a random unique identifier, which adds overhead.
### extractor_threads
- **Required**: True
- **Type**: int
- **Default value**: `1`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The number of threads used to extract data from the ETL source.
### read_only
- **Required**: True
- **Type**: boolean
- **Default value**: `True`
- **Constraints**: None
- **Description**: If set to `True`, the server runs in read-only mode and is not allowed to write data to the database.
### start_sequence
- **Required**: False
- **Type**: int
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: If specified, the ledger index from which Clio starts writing to the database.
### finish_sequence
- **Required**: False
- **Type**: int
- **Default value**: None
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: If specified, the last ledger index that Clio writes to the database.
### ssl_cert_file
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: The path to the SSL certificate file.
### ssl_key_file
- **Required**: False
- **Type**: string
- **Default value**: None
- **Constraints**: None
- **Description**: The path to the SSL key file.
### api_version.default
- **Required**: True
- **Type**: int
- **Default value**: `1`
- **Constraints**: The minimum value is `1`. The maximum value is `3`.
- **Description**: The default API version that the Clio server will run on.
### api_version.min
- **Required**: True
- **Type**: int
- **Default value**: `1`
- **Constraints**: The minimum value is `1`. The maximum value is `3`.
- **Description**: The minimum API version clients are allowed to use.
### api_version.max
- **Required**: True
- **Type**: int
- **Default value**: `3`
- **Constraints**: The minimum value is `1`. The maximum value is `3`.
- **Description**: The maximum API version clients are allowed to use.
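As a sketch, an `api_version` block consistent with the constraints above could look like this (values taken from the example config):
```json
{
  "api_version": {
    "min": 1,
    "max": 2,
    "default": 1
  }
}
```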
### migration.full_scan_threads
- **Required**: True
- **Type**: int
- **Default value**: `2`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The number of threads used to scan the table.
### migration.full_scan_jobs
- **Required**: True
- **Type**: int
- **Default value**: `4`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The number of coroutines used to scan the table.
### migration.cursors_per_job
- **Required**: True
- **Type**: int
- **Default value**: `100`
- **Constraints**: The minimum value is `1`. The maximum value is `4294967295`.
- **Description**: The number of cursors each job will scan.
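Written out with their default values, the migration settings above would look like this (a sketch for illustration only):
```json
{
  "migration": {
    "full_scan_threads": 2,
    "full_scan_jobs": 4,
    "cursors_per_job": 100
  }
}
```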

View File

@@ -18,8 +18,7 @@ Clio needs access to a `rippled` server in order to work. The following configur
- A port to handle gRPC requests, with the IP(s) of Clio specified in the `secure_gateway` entry
The example configs of [rippled](https://github.com/XRPLF/rippled/blob/develop/cfg/rippled-example.cfg) and [Clio](../docs/examples/config/example-config.json) are set up in a way that minimal changes are required. However, if you want to view all configuration keys available in Clio, see [config-description.md](./config-description.md).
The example configs of [rippled](https://github.com/XRPLF/rippled/blob/develop/cfg/rippled-example.cfg) and [Clio](../docs/examples/config/example-config.json) are set up in a way that minimal changes are required.
When running locally, the only change needed is to uncomment the `port_grpc` section of the `rippled` config.
If you're running Clio and `rippled` on separate machines, in addition to uncommenting the `port_grpc` section, a few other steps must be taken:
@@ -53,7 +52,7 @@ Here is an example snippet from the config file:
## SSL
The parameters `ssl_cert_file` and `ssl_key_file` can also be added to the top level of precedence of our Clio config. The `ssl_cert_file` field specifies the filepath for your SSL cert, while `ssl_key_file` specifies the filepath for your SSL key. It is up to you how to change ownership of these folders for your designated Clio user.
The parameters `ssl_cert_file` and `ssl_key_file` can also be added to the top level of precedence of our Clio config. The `ssl_cert_file` field specifies the filepath for your SSL cert, while `ssl_key_file` specifies the filepath for your SSL key. It is up to you how to change ownership of these folders for your designated Clio user.
Your options include:

View File

@@ -0,0 +1,43 @@
/*
* This is an example configuration file. Please do not use without modifying to suit your needs.
*/
{
"database": {
"type": "cassandra",
"cassandra": {
// This option can be used to setup a secure connect bundle connection
"secure_connect_bundle": "[path/to/zip. ignore if using contact_points]",
// The following options are used only if using contact_points
"contact_points": "[ip. ignore if using secure_connect_bundle]",
"port": "[port. ignore if using_secure_connect_bundle]",
// Authentication settings
"username": "[username, if any]",
"password": "[password, if any]",
// Other common settings
"keyspace": "clio",
"max_write_requests_outstanding": 25000,
"max_read_requests_outstanding": 30000,
"threads": 8
}
},
"etl_sources": [
{
"ip": "[rippled ip]",
"ws_port": "6006",
"grpc_port": "50051"
}
],
"dos_guard": {
"whitelist": [
"127.0.0.1"
]
},
"server": {
"ip": "0.0.0.0",
"port": 8080
},
"log_level": "debug",
"log_file": "./clio.log",
"extractor_threads": 8,
"read_only": false
}

View File

@@ -2,144 +2,145 @@
* This is an example configuration file. Please do not use without modifying to suit your needs.
*/
{
"database": {
"type": "cassandra",
"cassandra": {
"contact_points": "127.0.0.1",
"port": 9042,
"keyspace": "clio",
"replication_factor": 1,
"table_prefix": "",
"max_write_requests_outstanding": 25000,
"max_read_requests_outstanding": 30000,
"threads": 8,
//
// Advanced options. USE AT OWN RISK:
// ---
"core_connections_per_host": 1, // Defaults to 1
"write_batch_size": 20 // Defaults to 20
//
// Below options will use defaults from cassandra driver if left unspecified.
// See https://docs.datastax.com/en/developer/cpp-driver/2.17/api/struct.CassCluster/ for details.
//
// "queue_size_io": 2
//
// ---
"database": {
"type": "cassandra",
"cassandra": {
"contact_points": "127.0.0.1",
"port": 9042,
"keyspace": "clio",
"replication_factor": 1,
"table_prefix": "",
"max_write_requests_outstanding": 25000,
"max_read_requests_outstanding": 30000,
"threads": 8,
//
// Advanced options. USE AT OWN RISK:
// ---
"core_connections_per_host": 1, // Defaults to 1
"write_batch_size": 20 // Defaults to 20
//
// Below options will use defaults from cassandra driver if left unspecified.
// See https://docs.datastax.com/en/developer/cpp-driver/2.17/api/struct.CassCluster/ for details.
//
// "queue_size_io": 2
//
// ---
}
},
"allow_no_etl": false, // Allow Clio to run without valid ETL source, otherwise Clio will stop if ETL check fails
"etl_sources": [
{
"ip": "127.0.0.1",
"ws_port": "6005",
"grpc_port": "50051"
}
],
"forwarding": {
"cache_timeout": 0.250, // in seconds, could be 0, which means no cache
"request_timeout": 10.0 // time for Clio to wait for rippled to reply on a forwarded request (default is 10 seconds)
},
"rpc": {
"cache_timeout": 0.5 // in seconds, could be 0, which means no cache for rpc
},
"dos_guard": {
// Comma-separated list of IPs to exclude from rate limiting
"whitelist": [
"127.0.0.1"
],
//
// The below values are the default values and are only specified here
// for documentation purposes. The rate limiter currently limits
// connections and bandwidth per IP. The rate limiter looks at the raw
// IP of a client connection, and so requests routed through a load
// balancer will all have the same IP and be treated as a single client.
//
"max_fetches": 1000000, // Max bytes per IP per sweep interval
"max_connections": 20, // Max connections per IP
"max_requests": 20, // Max connections per IP per sweep interval
"sweep_interval": 1 // Time in seconds before resetting max_fetches and max_requests
},
"server": {
"ip": "0.0.0.0",
"port": 51233,
// Max number of requests to queue up before rejecting further requests.
// Defaults to 0, which disables the limit.
"max_queue_size": 500,
// If request contains header with authorization, Clio will check if it matches the prefix 'Password ' + this value's sha256 hash
// If matches, the request will be considered as admin request
"admin_password": "xrp",
// If local_admin is true, Clio will consider requests come from 127.0.0.1 as admin requests
// It's true by default unless admin_password is set,'local_admin' : true and 'admin_password' can not be set at the same time
"local_admin": false,
"processing_policy": "parallel", // Could be "sequent" or "parallel".
// For sequent policy request from one client connection will be processed one by one and the next one will not be read before
// the previous one is processed. For parallel policy Clio will take all requests and process them in parallel and
// send a reply for each request whenever it is ready.
"parallel_requests_limit": 10, // Optional parameter, used only if "processing_strategy" is "parallel". It limits the number of requests for one client connection processed in parallel. Infinite if not specified.
// Max number of responses to queue up before sent successfully. If a client's waiting queue is too long, the server will close the connection.
"ws_max_sending_queue_size": 1500
},
// Time in seconds for graceful shutdown. Defaults to 10 seconds. Not fully implemented yet.
"graceful_period": 10.0,
// Overrides log level on a per logging channel.
// Defaults to global "log_level" for each unspecified channel.
"log_channels": [
{
"channel": "Backend",
"log_level": "fatal"
},
{
"channel": "WebServer",
"log_level": "info"
},
{
"channel": "Subscriptions",
"log_level": "info"
},
{
"channel": "RPC",
"log_level": "error"
},
{
"channel": "ETL",
"log_level": "debug"
},
{
"channel": "Performance",
"log_level": "trace"
}
],
"cache": {
// Configure this to use either "num_diffs", "num_cursors_from_diff", or "num_cursors_from_account". By default, Clio uses "num_diffs".
"num_diffs": 32, // Generate the cursors from the latest ledger diff, then use the cursors to partition the ledger to load concurrently. The cursors number is affected by the busyness of the network.
// "num_cursors_from_diff": 3200, // Read the cursors from the diff table until we have enough cursors to partition the ledger to load concurrently.
// "num_cursors_from_account": 3200, // Read the cursors from the account table until we have enough cursors to partition the ledger to load concurrently.
"num_markers": 48, // The number of markers is the number of coroutines to load the cache concurrently.
"page_fetch_size": 512, // The number of rows to load for each page.
"load": "async" // "sync" to load cache synchronously or "async" to load cache asynchronously or "none"/"no" to turn off the cache.
},
"prometheus": {
"enabled": true,
"compress_reply": true
},
"log_level": "info",
// Log format (this is the default format)
"log_format": "%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%",
"log_to_console": true,
// Clio logs to file in the specified directory only if "log_directory" is set
// "log_directory": "./clio_log",
"log_rotation_size": 2048,
"log_directory_max_size": 51200,
"log_rotation_hour_interval": 12,
"log_tag_style": "uint",
"extractor_threads": 8,
"read_only": false,
// "start_sequence": [integer] the ledger index to start from,
// "finish_sequence": [integer] the ledger index to finish at,
// "ssl_cert_file" : "/full/path/to/cert.file",
// "ssl_key_file" : "/full/path/to/key.file"
"api_version": {
"min": 1, // Minimum API version supported (could be 1 or 2)
"max": 2, // Maximum API version supported (could be 1 or 2, but >= min)
"default": 1 // Clio behaves the same as rippled by default
}
},
"allow_no_etl": false, // Allow Clio to run without valid ETL source, otherwise Clio will stop if ETL check fails
"etl_sources": [
{
"ip": "127.0.0.1",
"ws_port": "6005",
"grpc_port": "50051"
}
],
"forwarding": {
"cache_timeout": 0.25, // in seconds, could be 0, which means no cache
"request_timeout": 10.0 // time for Clio to wait for rippled to reply on a forwarded request (default is 10 seconds)
},
"rpc": {
"cache_timeout": 0.5 // in seconds, could be 0, which means no cache for rpc
},
"dos_guard": {
// Comma-separated list of IPs to exclude from rate limiting
"whitelist": ["127.0.0.1"],
//
// The below values are the default values and are only specified here
// for documentation purposes. The rate limiter currently limits
// connections and bandwidth per IP. The rate limiter looks at the raw
// IP of a client connection, and so requests routed through a load
// balancer will all have the same IP and be treated as a single client.
//
"max_fetches": 1000000, // Max bytes per IP per sweep interval
"max_connections": 20, // Max connections per IP
"max_requests": 20, // Max connections per IP per sweep interval
"sweep_interval": 1 // Time in seconds before resetting max_fetches and max_requests
},
"server": {
"ip": "0.0.0.0",
"port": 51233,
// Max number of requests to queue up before rejecting further requests.
// Defaults to 0, which disables the limit.
"max_queue_size": 500,
// If request contains header with authorization, Clio will check if it matches the prefix 'Password ' + this value's sha256 hash
// If matches, the request will be considered as admin request
"admin_password": "xrp",
// If local_admin is true, Clio will consider requests come from 127.0.0.1 as admin requests
// It's true by default unless admin_password is set,'local_admin' : true and 'admin_password' can not be set at the same time
"local_admin": false,
"processing_policy": "parallel", // Could be "sequent" or "parallel".
// For sequent policy request from one client connection will be processed one by one and the next one will not be read before
// the previous one is processed. For parallel policy Clio will take all requests and process them in parallel and
// send a reply for each request whenever it is ready.
"parallel_requests_limit": 10, // Optional parameter, used only if "processing_strategy" is "parallel". It limits the number of requests for one client connection processed in parallel. Infinite if not specified.
// Max number of responses to queue up before sent successfully. If a client's waiting queue is too long, the server will close the connection.
"ws_max_sending_queue_size": 1500,
"__ng_web_server": false // Use ng web server. This is a temporary setting which will be deleted after switching to ng web server
},
// Time in seconds for graceful shutdown. Defaults to 10 seconds. Not fully implemented yet.
"graceful_period": 10.0,
// Overrides log level on a per logging channel.
// Defaults to global "log_level" for each unspecified channel.
"log_channels": [
{
"channel": "Backend",
"log_level": "fatal"
},
{
"channel": "WebServer",
"log_level": "info"
},
{
"channel": "Subscriptions",
"log_level": "info"
},
{
"channel": "RPC",
"log_level": "error"
},
{
"channel": "ETL",
"log_level": "debug"
},
{
"channel": "Performance",
"log_level": "trace"
}
],
"cache": {
// Configure this to use either "num_diffs", "num_cursors_from_diff", or "num_cursors_from_account". By default, Clio uses "num_diffs".
"num_diffs": 32, // Generate the cursors from the latest ledger diff, then use the cursors to partition the ledger to load concurrently. The cursors number is affected by the busyness of the network.
// "num_cursors_from_diff": 3200, // Read the cursors from the diff table until we have enough cursors to partition the ledger to load concurrently.
// "num_cursors_from_account": 3200, // Read the cursors from the account table until we have enough cursors to partition the ledger to load concurrently.
"num_markers": 48, // The number of markers is the number of coroutines to load the cache concurrently.
"page_fetch_size": 512, // The number of rows to load for each page.
"load": "async" // "sync" to load cache synchronously or "async" to load cache asynchronously or "none"/"no" to turn off the cache.
},
"prometheus": {
"enabled": true,
"compress_reply": true
},
"log_level": "info",
// Log format (this is the default format)
"log_format": "%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%",
"log_to_console": true,
// Clio logs to file in the specified directory only if "log_directory" is set
// "log_directory": "./clio_log",
"log_rotation_size": 2048,
"log_directory_max_size": 51200,
"log_rotation_hour_interval": 12,
"log_tag_style": "uint",
"extractor_threads": 8,
"read_only": false,
// "start_sequence": [integer] the ledger index to start from,
// "finish_sequence": [integer] the ledger index to finish at,
// "ssl_cert_file" : "/full/path/to/cert.file",
// "ssl_key_file" : "/full/path/to/key.file"
"api_version": {
"min": 1, // Minimum API version supported (could be 1 or 2)
"max": 2, // Maximum API version supported (could be 1 or 2, but >= min)
"default": 1 // Clio behaves the same as rippled by default
}
}

View File

@@ -1,15 +1,10 @@
# Example of clio monitoring infrastructure
> [!WARNING]
> This is only an example of Grafana dashboard for Clio. It was created for demonstration purposes only and may contain errors.
> Clio team would not recommend to relate on data from this dashboard or use it for monitoring your Clio instances.
This directory contains an example of docker based infrastructure to collect and visualise metrics from clio.
The structure of the directory:
- `compose.yaml`
Docker-compose file with Prometheus and Grafana set up.
Docker-compose file with Prometheus and Grafana set up.
- `prometheus.yaml`
Defines metrics collection from Clio and Prometheus itself.
Demonstrates how to setup Clio target and Clio's admin authorisation in Prometheus.
@@ -25,6 +20,6 @@ The structure of the directory:
1. Make sure you have `docker` and `docker-compose` installed.
2. Run `docker-compose up -d` from this directory. It will start docker containers with Prometheus and Grafana.
3. Open [http://localhost:3000/dashboards](http://localhost:3000/dashboards). Grafana login `admin`, password `grafana`.
There will be preconfigured Clio dashboard.
There will be preconfigured Clio dashboard.
If Clio is not running yet launch Clio to see metrics. Some of the metrics may appear only after requests to Clio.

View File

@@ -6,7 +6,7 @@ services:
volumes:
- ./prometheus.yaml:/etc/prometheus/prometheus.yml
command:
- "--config.file=/etc/prometheus/prometheus.yml"
- '--config.file=/etc/prometheus/prometheus.yml'
grafana:
image: grafana/grafana
ports:

View File

@@ -20,6 +20,7 @@
"graphTooltip": 0,
"id": 1,
"links": [],
"liveNow": false,
"panels": [
{
"datasource": {
@@ -78,9 +79,10 @@
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"percentChangeColorMode": "standard",
"reduceOptions": {
"calcs": ["lastNotNull"],
"calcs": [
"lastNotNull"
],
"fields": "",
"values": false
},
@@ -88,7 +90,7 @@
"textMode": "auto",
"wideLayout": true
},
"pluginVersion": "11.4.0",
"pluginVersion": "10.4.0",
"targets": [
{
"datasource": {
@@ -157,9 +159,10 @@
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"percentChangeColorMode": "standard",
"reduceOptions": {
"calcs": ["lastNotNull"],
"calcs": [
"lastNotNull"
],
"fields": "",
"values": false
},
@@ -167,7 +170,7 @@
"textMode": "auto",
"wideLayout": true
},
"pluginVersion": "11.4.0",
"pluginVersion": "10.4.0",
"targets": [
{
"datasource": {
@@ -240,9 +243,10 @@
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"percentChangeColorMode": "standard",
"reduceOptions": {
"calcs": ["lastNotNull"],
"calcs": [
"lastNotNull"
],
"fields": "",
"values": false
},
@@ -250,7 +254,7 @@
"textMode": "auto",
"wideLayout": true
},
"pluginVersion": "11.4.0",
"pluginVersion": "10.4.0",
"targets": [
{
"datasource": {
@@ -323,9 +327,10 @@
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"percentChangeColorMode": "standard",
"reduceOptions": {
"calcs": ["lastNotNull"],
"calcs": [
"lastNotNull"
],
"fields": "",
"values": false
},
@@ -333,7 +338,7 @@
"textMode": "auto",
"wideLayout": true
},
"pluginVersion": "11.4.0",
"pluginVersion": "10.4.0",
"targets": [
{
"datasource": {
@@ -368,7 +373,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -431,7 +435,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -488,7 +491,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -550,7 +552,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -585,7 +586,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -647,7 +647,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -682,7 +681,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -744,7 +742,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -779,7 +776,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -841,7 +837,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -877,7 +872,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -940,7 +934,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -948,7 +941,7 @@
"uid": "PBFA97CFB590B2093"
},
"editorMode": "code",
"expr": "sum by (method) (increase(rpc_method_duration_us[$__interval]))\n / \n sum by (method,) (increase(rpc_method_total_number{status=\"finished\"}[$__interval]))",
"expr": "rpc_method_duration_us{job=\"clio\"}",
"instant": false,
"legendFormat": "{{method}}",
"range": true,
@@ -975,7 +968,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -1037,7 +1029,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -1072,7 +1063,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -1134,7 +1124,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -1169,7 +1158,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 10,
"gradientMode": "none",
@@ -1235,7 +1223,7 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"pluginVersion": "10.2.0",
"targets": [
{
"datasource": {
@@ -1308,7 +1296,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -1370,7 +1357,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -1398,7 +1384,7 @@
"refId": "B"
}
],
"title": "DB Operations Error Rate",
"title": "DB Opperations Error Rate",
"type": "timeseries"
},
{
@@ -1418,7 +1404,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -1480,7 +1465,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -1526,7 +1510,6 @@
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"barWidthFactor": 0.6,
"drawStyle": "line",
"fillOpacity": 0,
"gradientMode": "none",
@@ -1589,7 +1572,6 @@
"sort": "none"
}
},
"pluginVersion": "11.4.0",
"targets": [
{
"datasource": {
@@ -1608,9 +1590,8 @@
"type": "timeseries"
}
],
"preload": false,
"refresh": "5s",
"schemaVersion": 40,
"schemaVersion": 39,
"tags": [],
"templating": {
"list": []

View File

@@ -1,13 +1,13 @@
apiVersion: 1
providers:
- name: "Clio dashboard"
- name: 'Clio dashboard'
# <int> Org id. Default to 1
orgId: 1
# <string> name of the dashboard folder.
folder: ""
folder: ''
# <string> folder UID. will be automatically generated if not specified
folderUid: ""
folderUid: ''
# <string> provider type. Default to 'file'
type: file
# <bool> disable dashboard deletion

View File

@@ -1,8 +1,8 @@
apiVersion: 1
datasources:
- name: Prometheus
type: prometheus
url: http://prometheus:9090
isDefault: true
access: proxy
- name: Prometheus
type: prometheus
url: http://prometheus:9090
isDefault: true
access: proxy

View File

@@ -1,19 +1,19 @@
scrape_configs:
- job_name: clio
scrape_interval: 5s
scrape_timeout: 5s
authorization:
type: Password
# sha256sum from password `xrp`
# use echo -n 'your_password' | shasum -a 256 to get hash
credentials: 0e1dcf1ff020cceabf8f4a60a32e814b5b46ee0bb8cd4af5c814e4071bd86a18
static_configs:
- targets:
- host.docker.internal:51233
- job_name: prometheus
honor_timestamps: true
scrape_interval: 15s
scrape_timeout: 10s
static_configs:
- targets:
- localhost:9090
- job_name: clio
scrape_interval: 5s
scrape_timeout: 5s
authorization:
type: Password
# sha256sum from password `xrp`
# use echo -n 'your_password' | shasum -a 256 to get hash
credentials: 0e1dcf1ff020cceabf8f4a60a32e814b5b46ee0bb8cd4af5c814e4071bd86a18
static_configs:
- targets:
- host.docker.internal:51233
- job_name: prometheus
honor_timestamps: true
scrape_interval: 15s
scrape_timeout: 10s
static_configs:
- targets:
- localhost:9090

View File

@@ -8,12 +8,12 @@ The minimum level of severity at which the log message will be outputted by defa
## `log_format`
The format of log lines produced by Clio. Defaults to `"%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%"`.
The format of log lines produced by Clio. Defaults to `"%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%"`.
Each of the variables expands like so:
- `TimeStamp`: The full date and time of the log entry
- `SourceLocation`: A partial path to the c++ file and the line number in said file (`source/file/path:linenumber`)
- `SourceLocation`: A partial path to the c++ file and the line number in said file (`source/file/path:linenumber`)
- `ThreadID`: The ID of the thread the log entry is written from
- `Channel`: The channel that this log entry was sent to
- `Severity`: The severity (aka log level) the entry was sent at
@@ -30,8 +30,8 @@ Each object is of this format:
```json
{
"channel": "Backend",
"log_level": "fatal"
"channel": "Backend",
"log_level": "fatal"
}
```

View File

@@ -0,0 +1,30 @@
# Metrics and static analysis
## Prometheus metrics collection
Clio natively supports [Prometheus](https://prometheus.io/) metrics collection. It accepts Prometheus requests on the port configured in the `server` section of the config.
Prometheus metrics are enabled by default, and replies to `/metrics` are compressed. To disable compression, and have human readable metrics, add `"prometheus": { "enabled": true, "compress_reply": false }` to Clio's config.
To completely disable Prometheus metrics add `"prometheus": { "enabled": false }` to Clio's config.
It is important to know that Clio responds to Prometheus request only if they are admin requests. If you are using the admin password feature, the same password should be provided in the Authorization header of Prometheus requests.
You can find an example docker-compose file, with Prometheus and Grafana configs, in [examples/infrastructure](../docs/examples/infrastructure/).
## Using `clang-tidy` for static analysis
The minimum [clang-tidy](https://clang.llvm.org/extra/clang-tidy/) version required is 19.0.
Clang-tidy can be run by Cmake when building the project. To achieve this, you just need to provide the option `-o lint=True` for the `conan install` command:
```sh
conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True -o lint=True
```
By default Cmake will try to find `clang-tidy` automatically in your system.
To force Cmake to use your desired binary, set the `CLIO_CLANG_TIDY_BIN` environment variable to the path of the `clang-tidy` binary. For example:
```sh
export CLIO_CLANG_TIDY_BIN=/opt/homebrew/opt/llvm@19/bin/clang-tidy
```

View File

@@ -3,11 +3,10 @@
## Prerequisites
- Access to a Cassandra cluster or ScyllaDB cluster. Can be local or remote.
> [!IMPORTANT]
> There are some key considerations when using **ScyllaDB**. By default, Scylla reserves all free RAM on a machine for itself. If you are running `rippled` or other services on the same machine, restrict its memory usage using the `--memory` argument.
>
> See [ScyllaDB in a Shared Environment](https://docs.scylladb.com/getting-started/scylla-in-a-shared-environment/) to learn more.
> [!IMPORTANT]
> There are some key considerations when using **ScyllaDB**. By default, Scylla reserves all free RAM on a machine for itself. If you are running `rippled` or other services on the same machine, restrict its memory usage using the `--memory` argument.
>
> See [ScyllaDB in a Shared Environment](https://docs.scylladb.com/getting-started/scylla-in-a-shared-environment/) to learn more.
- Access to one or more `rippled` nodes. Can be local or remote.
@@ -81,15 +80,3 @@ Clio will fallback to hardcoded defaults when these values are not specified in
> [!TIP]
> See the [example-config.json](../docs/examples/config/example-config.json) for more details.
## Prometheus metrics collection
Clio natively supports [Prometheus](https://prometheus.io/) metrics collection. It accepts Prometheus requests on the port configured in the `server` section of the config.
Prometheus metrics are enabled by default, and replies to `/metrics` are compressed. To disable compression, and have human readable metrics, add `"prometheus": { "enabled": true, "compress_reply": false }` to Clio's config.
To completely disable Prometheus metrics add `"prometheus": { "enabled": false }` to Clio's config.
It is important to know that Clio responds to Prometheus request only if they are admin requests. If you are using the admin password feature, the same password should be provided in the Authorization header of Prometheus requests.
You can find an example docker-compose file, with Prometheus and Grafana configs, in [examples/infrastructure](../docs/examples/infrastructure/).

View File

@@ -1,60 +1,47 @@
# Troubleshooting Guide
This guide will help you troubleshoot common issues of Clio.
## Can't connect to DB
If you see the error log message `Could not connect to Cassandra: No hosts available`, this means that Clio can't connect to the database. Check the following:
- Make sure the database is running at the specified address and port.
- Make sure the database is accessible from the machine where Clio is running.
You can use [cqlsh](https://pypi.org/project/cqlsh/) to check the connection to the database.
You can use [cqlsh](https://pypi.org/project/cqlsh/) to check the connection to the database.
If you would like to run a local ScyllaDB, you can call:
```sh
docker run --rm -p 9042:9042 --name clio-scylla -d scylladb/scylla
docker run --rm -p 9042:9042 --name clio-scylla -d scylladb/scylla
```
## Check the server status of Clio
To check if Clio is syncing with rippled:
```sh
curl -v -d '{"method":"server_info", "params":[{}]}' 127.0.0.1:51233|python3 -m json.tool|grep seq
```
If Clio is syncing with rippled, the `seq` value will be increasing.
## Clio fails to start
If you see the error log message `Failed to fetch ETL state from...`, this means the configured rippled node is not reachable. Check the following:
- Make sure the rippled node is running at the specified address and port.
- Make sure the rippled node is accessible from the machine where Clio is running.
If you would like to run Clio without an available rippled node, you can add below setting to Clio's configuration file:
```text
If you would like to run Clio without an avaliable rippled node, you can add below setting to Clio's configuration file:
```
"allow_no_etl": true
```
## Clio is not added to secure_gateway in rippled's config
If you see the warning message `AsyncCallData is_unlimited is false.`, this means that Clio is not added to the `secure_gateway` of `port_grpc` session in the rippled configuration file. It will slow down the sync process. Please add Clio's IP to the `secure_gateway` in the rippled configuration file for both grpc and ws port.
## Clio is slow
To speed up the response time, Clio has a cache inside. However, cache can take time to warm up. If you see slow response time, you can firstly check if cache is still loading.
You can check the cache status by calling:
```sh
curl -v -d '{"method":"server_info", "params":[{}]}' 127.0.0.1:51233|python3 -m json.tool|grep is_full
curl -v -d '{"method":"server_info", "params":[{}]}' 127.0.0.1:51233|python3 -m json.tool|grep is_enabled
```
If `is_full` is false, it means the cache is still loading. Normally, the Clio can respond quicker if cache finishes loading. If `is_enabled` is false, it means the cache is disabled in the configuration file or there is data corruption in the database.
If `is_full` is false, it means the cache is still loading. Normally, the Clio can respond quicker if cache finishs loading. If `is_enabled` is false, it means the cache is disabled in the configuration file or there is data corruption in the database.
## Receive error message `Too many requests`
If client sees the error message `Too many requests`, this means that the client is blocked by Clio's DosGuard protection. You may want to add the client's IP to the whitelist in the configuration file, Or update other your DosGuard settings.

View File

@@ -1,32 +0,0 @@
#!/bin/bash
# Note: This script is intended to be run from the root of the repository.
#
# This script checks will fix local includes in the C++ code.
# paths to fix include statements
sources="src tests"
echo "+ Fixing local includes..."
function grep_code {
grep -l "${1}" ${sources} -r --include \*.hpp --include \*.cpp
}
GNU_SED=$(sed --version 2>&1 | grep -q 'GNU' && echo true || echo false)
if [[ "$GNU_SED" == "false" ]]; then # macOS sed
# make all includes to be <...> style
grep_code '#include ".*"' | xargs sed -i '' -E 's|#include "(.*)"|#include <\1>|g'
# make local includes to be "..." style
main_src_dirs=$(find ./src -maxdepth 1 -type d -exec basename {} \; | tr '\n' '|' | sed 's/|$//' | sed 's/|/\\|/g')
grep_code "#include <\($main_src_dirs\)/.*>" | xargs sed -i '' -E "s|#include <(($main_src_dirs)/.*)>|#include \"\1\"|g"
else
# make all includes to be <...> style
grep_code '#include ".*"' | xargs sed -i -E 's|#include "(.*)"|#include <\1>|g'
# make local includes to be "..." style
main_src_dirs=$(find ./src -maxdepth 1 -type d -exec basename {} \; | paste -sd '|' | sed 's/|/\\|/g')
grep_code "#include <\($main_src_dirs\)/.*>" | xargs sed -i -E "s|#include <(($main_src_dirs)/.*)>|#include \"\1\"|g"
fi

View File

@@ -1,3 +0,0 @@
#!/bin/sh
command -v git-lfs >/dev/null 2>&1 || { echo >&2 "\nThis repository is configured for Git LFS but 'git-lfs' was not found on your path. If you no longer wish to use Git LFS, remove this hook by deleting the 'pre-push' file in the hooks directory (set by 'core.hookspath'; usually '.git/hooks').\n"; exit 2; }
git lfs pre-push "$@"

View File

@@ -1,14 +0,0 @@
#!/usr/bin/env bash
#
# Capture and print stdout, since gofmt doesn't use proper exit codes
#
set -e -o pipefail
if ! command -v gofmt &> /dev/null ; then
echo "gofmt not installed or available in the PATH" >&2
exit 1
fi
output="$(gofmt -l -w "$@")"
echo "$output"
[[ -z "$output" ]]

View File

@@ -1,11 +1,9 @@
add_subdirectory(util)
add_subdirectory(data)
add_subdirectory(cluster)
add_subdirectory(etl)
add_subdirectory(etlng)
add_subdirectory(feed)
add_subdirectory(rpc)
add_subdirectory(web)
add_subdirectory(migration)
add_subdirectory(app)
add_subdirectory(main)

View File

@@ -1,13 +1,4 @@
add_library(clio_app)
target_sources(clio_app PRIVATE CliArgs.cpp ClioApplication.cpp Stopper.cpp WebHandlers.cpp)
target_sources(clio_app PRIVATE CliArgs.cpp ClioApplication.cpp)
target_link_libraries(
clio_app
PUBLIC clio_cluster
clio_etl
clio_etlng
clio_feed
clio_web
clio_rpc
clio_migration
)
target_link_libraries(clio_app PUBLIC clio_etl clio_etlng clio_feed clio_web clio_rpc)

View File

@@ -19,9 +19,7 @@
#include "app/CliArgs.hpp"
#include "migration/MigrationApplication.hpp"
#include "util/build/Build.hpp"
#include "util/newconfig/ConfigDescription.hpp"
#include <boost/program_options/options_description.hpp>
#include <boost/program_options/parsers.hpp>
@@ -30,7 +28,6 @@
#include <boost/program_options/variables_map.hpp>
#include <cstdlib>
#include <filesystem>
#include <iostream>
#include <string>
#include <utility>
@@ -44,13 +41,9 @@ CliArgs::parse(int argc, char const* argv[])
// clang-format off
po::options_description description("Options");
description.add_options()
("help,h", "Print help message and exit")
("version,v", "Print version and exit")
("conf,c", po::value<std::string>()->default_value(kDEFAULT_CONFIG_PATH), "Configuration file")
("ng-web-server,w", "Use ng-web-server")
("migrate", po::value<std::string>(), "Start migration helper")
("verify", "Checks the validity of config values")
("config-description,d", po::value<std::string>(), "Generate config description markdown file")
("help,h", "print help message and exit")
("version,v", "print version and exit")
("conf,c", po::value<std::string>()->default_value(defaultConfigPath), "configuration file")
;
// clang-format on
po::positional_options_description positional;
@@ -70,31 +63,8 @@ CliArgs::parse(int argc, char const* argv[])
return Action{Action::Exit{EXIT_SUCCESS}};
}
if (parsed.count("config-description") != 0u) {
std::filesystem::path const filePath = parsed["config-description"].as<std::string>();
auto const res = util::config::ClioConfigDescription::generateConfigDescriptionToFile(filePath);
if (res.has_value())
return Action{Action::Exit{EXIT_SUCCESS}};
std::cerr << res.error().error << std::endl;
return Action{Action::Exit{EXIT_FAILURE}};
}
auto configPath = parsed["conf"].as<std::string>();
if (parsed.count("migrate") != 0u) {
auto const opt = parsed["migrate"].as<std::string>();
if (opt == "status")
return Action{Action::Migrate{.configPath = std::move(configPath), .subCmd = MigrateSubCmd::status()}};
return Action{Action::Migrate{.configPath = std::move(configPath), .subCmd = MigrateSubCmd::migration(opt)}};
}
if (parsed.count("verify") != 0u)
return Action{Action::VerifyConfig{.configPath = std::move(configPath)}};
return Action{Action::Run{.configPath = std::move(configPath), .useNgWebServer = parsed.count("ng-web-server") != 0}
};
return Action{Action::Run{std::move(configPath)}};
}
} // namespace app

View File

@@ -19,7 +19,6 @@
#pragma once
#include "migration/MigrationApplication.hpp"
#include "util/OverloadSet.hpp"
#include <string>
@@ -35,7 +34,7 @@ public:
/**
* @brief Default configuration path.
*/
static constexpr char kDEFAULT_CONFIG_PATH[] = "/etc/opt/clio/config.json";
static constexpr char defaultConfigPath[] = "/etc/opt/clio/config.json";
/**
* @brief An action parsed from the command line.
@@ -44,24 +43,14 @@ public:
public:
/** @brief Run action. */
struct Run {
std::string configPath; ///< Configuration file path.
bool useNgWebServer; ///< Whether to use a ng web server
/** @brief Configuration file path. */
std::string configPath;
};
/** @brief Exit action. */
struct Exit {
int exitCode; ///< Exit code.
};
/** @brief Migration action. */
struct Migrate {
std::string configPath;
MigrateSubCmd subCmd;
};
/** @brief Verify Config action. */
struct VerifyConfig {
std::string configPath;
/** @brief Exit code. */
int exitCode;
};
/**
@@ -70,8 +59,7 @@ public:
* @param action Run action.
*/
template <typename ActionType>
requires std::is_same_v<ActionType, Run> or std::is_same_v<ActionType, Exit> or
std::is_same_v<ActionType, Migrate> or std::is_same_v<ActionType, VerifyConfig>
requires std::is_same_v<ActionType, Run> or std::is_same_v<ActionType, Exit>
explicit Action(ActionType&& action) : action_(std::forward<ActionType>(action))
{
}
@@ -91,7 +79,7 @@ public:
}
private:
std::variant<Run, Exit, Migrate, VerifyConfig> action_;
std::variant<Run, Exit> action_;
};
/**

View File

@@ -19,44 +19,32 @@
#include "app/ClioApplication.hpp"
#include "app/Stopper.hpp"
#include "app/WebHandlers.hpp"
#include "cluster/ClusterCommunicationService.hpp"
#include "data/AmendmentCenter.hpp"
#include "data/BackendFactory.hpp"
#include "data/LedgerCache.hpp"
#include "etl/ETLService.hpp"
#include "etl/LoadBalancer.hpp"
#include "etl/NetworkValidatedLedgers.hpp"
#include "etlng/LoadBalancer.hpp"
#include "etlng/LoadBalancerInterface.hpp"
#include "feed/SubscriptionManager.hpp"
#include "migration/MigrationInspectorFactory.hpp"
#include "rpc/Counters.hpp"
#include "rpc/RPCEngine.hpp"
#include "rpc/WorkQueue.hpp"
#include "rpc/common/impl/HandlerProvider.hpp"
#include "util/build/Build.hpp"
#include "util/config/Config.hpp"
#include "util/log/Logger.hpp"
#include "util/newconfig/ConfigDefinition.hpp"
#include "util/prometheus/Prometheus.hpp"
#include "web/AdminVerificationStrategy.hpp"
#include "web/RPCServerHandler.hpp"
#include "web/Server.hpp"
#include "web/dosguard/DOSGuard.hpp"
#include "web/dosguard/IntervalSweepHandler.hpp"
#include "web/dosguard/WhitelistHandler.hpp"
#include "web/ng/RPCServerHandler.hpp"
#include "web/ng/Server.hpp"
#include <boost/asio/io_context.hpp>
#include <cstdint>
#include <cstdlib>
#include <memory>
#include <optional>
#include <thread>
#include <utility>
#include <vector>
namespace app {
@@ -84,18 +72,20 @@ start(boost::asio::io_context& ioc, std::uint32_t numThreads)
} // namespace
ClioApplication::ClioApplication(util::config::ClioConfigDefinition const& config)
: config_(config), signalsHandler_{config_}
ClioApplication::ClioApplication(util::Config const& config) : config_(config), signalsHandler_{config_}
{
LOG(util::LogService::info()) << "Clio version: " << util::build::getClioFullVersionString();
PrometheusService::init(config);
signalsHandler_.subscribeToStop([this]() { appStopper_.stop(); });
}
int
ClioApplication::run(bool const useNgWebServer)
ClioApplication::run()
{
auto const threads = config_.get<uint16_t>("io_threads");
auto const threads = config_.valueOr("io_threads", 2);
if (threads <= 0) {
LOG(util::LogService::fatal()) << "io_threads is less than 1";
return EXIT_FAILURE;
}
LOG(util::LogService::info()) << "Number of io threads = " << threads;
// IO context to handle all incoming requests, as well as other things.
@@ -106,102 +96,40 @@ ClioApplication::run(bool const useNgWebServer)
auto whitelistHandler = web::dosguard::WhitelistHandler{config_};
auto dosGuard = web::dosguard::DOSGuard{config_, whitelistHandler};
auto sweepHandler = web::dosguard::IntervalSweepHandler{config_, ioc, dosGuard};
auto cache = data::LedgerCache{};
// Interface to the database
auto backend = data::makeBackend(config_, cache);
cluster::ClusterCommunicationService clusterCommunicationService{backend};
clusterCommunicationService.run();
auto const amendmentCenter = std::make_shared<data::AmendmentCenter const>(backend);
{
auto const migrationInspector = migration::makeMigrationInspector(config_, backend);
// Check if any migration is blocking Clio server starting.
if (migrationInspector->isBlockingClio() and backend->hardFetchLedgerRangeNoThrow()) {
LOG(util::LogService::error())
<< "Existing Migration is blocking Clio, Please complete the database migration first.";
return EXIT_FAILURE;
}
}
auto backend = data::make_Backend(config_);
// Manages clients subscribed to streams
auto subscriptions = feed::SubscriptionManager::makeSubscriptionManager(config_, backend, amendmentCenter);
auto subscriptions = feed::SubscriptionManager::make_SubscriptionManager(config_, backend);
// Tracks which ledgers have been validated by the network
auto ledgers = etl::NetworkValidatedLedgers::makeValidatedLedgers();
auto ledgers = etl::NetworkValidatedLedgers::make_ValidatedLedgers();
// Handles the connection to one or more rippled nodes.
// ETL uses the balancer to extract data.
// The server uses the balancer to forward RPCs to a rippled node.
// The balancer itself publishes to streams (transactions_proposed and accounts_proposed)
auto balancer = [&] -> std::shared_ptr<etlng::LoadBalancerInterface> {
if (config_.get<bool>("__ng_etl"))
return etlng::LoadBalancer::makeLoadBalancer(config_, ioc, backend, subscriptions, ledgers);
return etl::LoadBalancer::makeLoadBalancer(config_, ioc, backend, subscriptions, ledgers);
}();
auto balancer = etl::LoadBalancer::make_LoadBalancer(config_, ioc, backend, subscriptions, ledgers);
// ETL is responsible for writing and publishing to streams. In read-only mode, ETL only publishes
auto etl = etl::ETLService::makeETLService(config_, ioc, backend, subscriptions, balancer, ledgers);
auto workQueue = rpc::WorkQueue::makeWorkQueue(config_);
auto counters = rpc::Counters::makeCounters(workQueue);
auto etl = etl::ETLService::make_ETLService(config_, ioc, backend, subscriptions, balancer, ledgers);
auto workQueue = rpc::WorkQueue::make_WorkQueue(config_);
auto counters = rpc::Counters::make_Counters(workQueue);
auto const amendmentCenter = std::make_shared<data::AmendmentCenter const>(backend);
auto const handlerProvider = std::make_shared<rpc::impl::ProductionHandlerProvider const>(
config_, backend, subscriptions, balancer, etl, amendmentCenter, counters
);
using RPCEngineType = rpc::RPCEngine<rpc::Counters>;
using RPCEngineType = rpc::RPCEngine<etl::LoadBalancer, rpc::Counters>;
auto const rpcEngine =
RPCEngineType::makeRPCEngine(config_, backend, balancer, dosGuard, workQueue, counters, handlerProvider);
if (useNgWebServer or config_.get<bool>("server.__ng_web_server")) {
web::ng::RPCServerHandler<RPCEngineType> handler{config_, backend, rpcEngine, etl};
auto expectedAdminVerifier = web::makeAdminVerificationStrategy(config_);
if (not expectedAdminVerifier.has_value()) {
LOG(util::LogService::error()) << "Error creating admin verifier: " << expectedAdminVerifier.error();
return EXIT_FAILURE;
}
auto const adminVerifier = std::move(expectedAdminVerifier).value();
auto httpServer = web::ng::makeServer(config_, OnConnectCheck{dosGuard}, DisconnectHook{dosGuard}, ioc);
if (not httpServer.has_value()) {
LOG(util::LogService::error()) << "Error creating web server: " << httpServer.error();
return EXIT_FAILURE;
}
httpServer->onGet("/metrics", MetricsHandler{adminVerifier});
httpServer->onGet("/health", HealthCheckHandler{});
auto requestHandler = RequestHandler{adminVerifier, handler, dosGuard};
httpServer->onPost("/", requestHandler);
httpServer->onWs(std::move(requestHandler));
auto const maybeError = httpServer->run();
if (maybeError.has_value()) {
LOG(util::LogService::error()) << "Error starting web server: " << *maybeError;
return EXIT_FAILURE;
}
appStopper_.setOnStop(
Stopper::makeOnStopCallback(httpServer.value(), *balancer, *etl, *subscriptions, *backend, ioc)
);
// Blocks until stopped.
// When stopped, shared_ptrs fall out of scope
// Calls destructors on all resources, and destructs in order
start(ioc, threads);
return EXIT_SUCCESS;
}
RPCEngineType::make_RPCEngine(config_, backend, balancer, dosGuard, workQueue, counters, handlerProvider);
// Init the web server
auto handler = std::make_shared<web::RPCServerHandler<RPCEngineType>>(config_, backend, rpcEngine, etl);
auto const httpServer = web::makeHttpServer(config_, ioc, dosGuard, handler);
auto handler =
std::make_shared<web::RPCServerHandler<RPCEngineType, etl::ETLService>>(config_, backend, rpcEngine, etl);
auto const httpServer = web::make_HttpServer(config_, ioc, dosGuard, handler);
// Blocks until stopped.
// When stopped, shared_ptrs fall out of scope

Some files were not shown because too many files have changed in this diff.