Compare commits

..

3 Commits

Author SHA1 Message Date
Michael Legleux
ec5d1eb65c Set version to 1.0.4 2023-02-02 11:59:47 -08:00
Michael Legleux
5b417bdc45 1.0.4 Release Candidate 2 (#465)
* Implement logging abstraction (#371)

Fixes #290

* Fix pre-commit to only check staged files

* Implement account ownership check and fix paging (#383)

Fixes #222

* Remove the github action package signing step

This will be done elsewhere.

* include searched_all in error response of tx (#407)

* helper function for subscribe to ensure cleanup (#402)

* Add closed to header for all paths of ledger_data (#416)

Fixes #219

* Add custom error for malformed owner and request (#417)

Fixes #274

* Use custom malformedAddress error in ledger_entry (#419)

Fixes #272

* Return lgrIdxsInvalid error for ledger_max_index less than ledger_min_index (#339)

Fixes #263

* Update headers to use #pragma once

* Add custom error for malformed request (#414)

Fixes #276

* Return srcCurMalformed on invalid taker_pays in book_offers (#413)

Fixes #267

* Fix source_location issue on MacOSX and Debug build (#431)

Fixes #428

* Implement always adding git ref to version string (#430)

Fixes #427

* add connection counting (#433)

* Fix malformed output format over ws rpc (#426)

Fixes #405

* Remove branch name from version string (#437)

Fixes a bug from #430

* Implement cli parsing using boost::po (#436)

Fixes #367

* Update documentation and config with ssl_cert_file and ssl_key_file (#443)

Fixes #424

* Fix gateway balances to match rippled output (#441)

Fixes #271

* Update README and example config to describe start_sequence (#438)

Fixes #250

* Add copyright to top of each source file (#444)

Fixes #411

* Increase file descriptor limit (#449)

* Update readme with more log configurations (#447)

Fixes #446

* Document dos_guard in example config. Log when client surpasses rate limit (#451)

* Add unit tests for DOSGuard (#453)

Fixes #452

* Build macOS and Ubuntu 22.04 (#456)

build release/x.y.z branches

* Add time measurement profiler (#458)

Rebase

* Match format to rippled error code (#461)

Fixes #263

* Change error message to match rippled (#463)

Fixes #263

* Add requests limit to DosGuard (#462)

Fixing #448

* Set version to 1.0.4-rc2

Co-authored-by: Alex Kremer <akremer@ripple.com>
Co-authored-by: CJ Cobb <46455409+cjcobb23@users.noreply.github.com>
Co-authored-by: Francis Mendoza <francissamuelmendoza7@gmail.com>
Co-authored-by: cyan317 <120398799+cindyyan317@users.noreply.github.com>
2023-01-10 16:04:07 -08:00
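
Among the changes listed in the release-candidate commit above is a switch to boost::program_options for command-line parsing (#436). The following is a minimal, hypothetical sketch of that style of parsing in C++; the option names ("help", "version", "conf") are illustrative and are not claimed to be Clio's actual flags.

// Hypothetical boost::program_options usage; option names are illustrative only.
#include <boost/program_options.hpp>
#include <iostream>
#include <string>

namespace po = boost::program_options;

int
main(int argc, char* argv[])
{
    po::options_description desc("Options");
    desc.add_options()
        ("help,h", "print this help message and exit")
        ("version,v", "print version string and exit")
        ("conf,c", po::value<std::string>()->default_value("config.json"), "path to the configuration file");

    po::variables_map vm;
    po::store(po::parse_command_line(argc, argv, desc), vm);
    po::notify(vm);

    if (vm.count("help"))
    {
        std::cout << desc << '\n';
        return 0;
    }

    std::cout << "using config: " << vm["conf"].as<std::string>() << '\n';
    return 0;
}
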
manojsdoshi
ce631a1f5a Set version to 1.0.4-rc1 2022-11-17 11:33:48 -08:00
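
Several of the commits above extend the DosGuard rate limiter (#451, #453, #462). The sketch below only illustrates the general idea of a per-client request limit; the class name, members, and threshold handling are hypothetical and not Clio's actual DosGuard implementation.

// Hypothetical per-client request counter illustrating a "requests limit";
// not Clio's actual DosGuard.
#include <cstdint>
#include <string>
#include <unordered_map>

class SimpleRequestGuard
{
public:
    explicit SimpleRequestGuard(std::uint32_t maxRequestsPerWindow)
        : maxRequestsPerWindow_(maxRequestsPerWindow)
    {
    }

    // Returns false once a client has exceeded the allowed number of requests
    // in the current window; callers would log and reject the request.
    bool
    isOk(std::string const& clientIp)
    {
        return ++requestCounts_[clientIp] <= maxRequestsPerWindow_;
    }

    // Called when the rate-limit window rolls over.
    void
    reset()
    {
        requestCounts_.clear();
    }

private:
    std::uint32_t maxRequestsPerWindow_;
    std::unordered_map<std::string, std::uint32_t> requestCounts_;
};
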
1069 changed files with 26026 additions and 172210 deletions


@@ -1,7 +1,7 @@
---
Language: Cpp
Language: Cpp
AccessModifierOffset: -4
AlignAfterOpenBracket: BlockIndent
AlignAfterOpenBracket: AlwaysBreak
AlignConsecutiveAssignments: false
AlignConsecutiveDeclarations: false
AlignEscapedNewlinesLeft: true
@@ -18,38 +18,46 @@ AlwaysBreakBeforeMultilineStrings: true
AlwaysBreakTemplateDeclarations: true
BinPackArguments: false
BinPackParameters: false
BraceWrapping:
AfterClass: true
AfterControlStatement: true
AfterEnum: false
AfterFunction: true
AfterNamespace: false
AfterObjCDeclaration: true
AfterStruct: true
AfterUnion: true
BeforeCatch: true
BeforeElse: true
IndentBraces: false
BreakBeforeBinaryOperators: false
BreakBeforeBraces: WebKit
BreakBeforeBraces: Custom
BreakBeforeTernaryOperators: true
BreakConstructorInitializersBeforeComma: true
ColumnLimit: 120
CommentPragmas: "^ IWYU pragma:"
ColumnLimit: 80
CommentPragmas: '^ IWYU pragma:'
ConstructorInitializerAllOnOneLineOrOnePerLine: true
ConstructorInitializerIndentWidth: 4
ContinuationIndentWidth: 4
Cpp11BracedListStyle: true
DerivePointerAlignment: false
DisableFormat: false
DisableFormat: false
ExperimentalAutoDetectBinPacking: false
FixNamespaceComments: true
ForEachMacros: [Q_FOREACH, BOOST_FOREACH]
IncludeBlocks: Regroup
ForEachMacros: [ Q_FOREACH, BOOST_FOREACH ]
IncludeCategories:
- Regex: '^".*"$'
Priority: 1
- Regex: '^<.*\.(h|hpp)>$'
Priority: 2
- Regex: "^<.*>$"
Priority: 3
- Regex: ".*"
Priority: 4
IncludeIsMainRegex: "$"
- Regex: '^<(BeastConfig)'
Priority: 0
- Regex: '^<(ripple)/'
Priority: 2
- Regex: '^<(boost)/'
Priority: 3
- Regex: '.*'
Priority: 4
IncludeIsMainRegex: '$'
IndentCaseLabels: true
IndentFunctionDeclarationAfterType: false
IndentWidth: 4
IndentWidth: 4
IndentWrappedFunctionNames: false
IndentRequiresClause: true
RequiresClausePosition: OwnLine
KeepEmptyLinesAtTheStartOfBlocks: false
MaxEmptyLinesToKeep: 1
NamespaceIndentation: None
@@ -62,19 +70,18 @@ PenaltyBreakString: 1000
PenaltyExcessCharacter: 1000000
PenaltyReturnTypeOnItsOwnLine: 200
PointerAlignment: Left
QualifierAlignment: Right
ReflowComments: true
SortIncludes: true
ReflowComments: true
SortIncludes: true
SpaceAfterCStyleCast: false
SpaceBeforeAssignmentOperators: true
SpaceBeforeParens: ControlStatements
SpaceInEmptyParentheses: false
SpacesBeforeTrailingComments: 2
SpacesInAngles: false
SpacesInAngles: false
SpacesInContainerLiterals: true
SpacesInCStyleCastParentheses: false
SpacesInParentheses: false
SpacesInSquareBrackets: false
Standard: Cpp11
TabWidth: 8
UseTab: Never
Standard: Cpp11
TabWidth: 8
UseTab: Never
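
For reference, the fragment below shows roughly what the brace-wrapping and indentation options above describe: braces on their own line after classes, structs, functions, and control statements, a break before else, 4-space indentation, and left-aligned pointers. It is a hypothetical illustration of those settings, not code taken from the repository.

#include <map>
#include <string>

struct Entry
{
    std::string value;
};

class EntryStore
{
public:
    // AfterFunction / AfterControlStatement: true, BeforeElse: true, IndentWidth: 4
    Entry* find(std::string const& key)
    {
        auto it = entries_.find(key);
        if (it != entries_.end())
        {
            return &it->second;
        }
        else
        {
            return nullptr;
        }
    }

private:
    std::map<std::string, Entry> entries_;
};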


@@ -1,191 +0,0 @@
---
Checks: "-*,
bugprone-argument-comment,
bugprone-assert-side-effect,
bugprone-bad-signal-to-kill-thread,
bugprone-bool-pointer-implicit-conversion,
bugprone-casting-through-void,
bugprone-chained-comparison,
bugprone-compare-pointer-to-member-virtual-function,
bugprone-copy-constructor-init,
bugprone-crtp-constructor-accessibility,
bugprone-dangling-handle,
bugprone-dynamic-static-initializers,
bugprone-empty-catch,
bugprone-fold-init-type,
bugprone-forward-declaration-namespace,
bugprone-inaccurate-erase,
bugprone-inc-dec-in-conditions,
bugprone-incorrect-enable-if,
bugprone-incorrect-roundings,
bugprone-infinite-loop,
bugprone-integer-division,
bugprone-lambda-function-name,
bugprone-macro-parentheses,
bugprone-macro-repeated-side-effects,
bugprone-misplaced-operator-in-strlen-in-alloc,
bugprone-misplaced-pointer-arithmetic-in-alloc,
bugprone-misplaced-widening-cast,
bugprone-move-forwarding-reference,
bugprone-multi-level-implicit-pointer-conversion,
bugprone-multiple-new-in-one-expression,
bugprone-multiple-statement-macro,
bugprone-no-escape,
bugprone-non-zero-enum-to-bool-conversion,
bugprone-optional-value-conversion,
bugprone-parent-virtual-call,
bugprone-pointer-arithmetic-on-polymorphic-object,
bugprone-posix-return,
bugprone-redundant-branch-condition,
bugprone-reserved-identifier,
bugprone-return-const-ref-from-parameter,
bugprone-shared-ptr-array-mismatch,
bugprone-signal-handler,
bugprone-signed-char-misuse,
bugprone-sizeof-container,
bugprone-sizeof-expression,
bugprone-spuriously-wake-up-functions,
bugprone-standalone-empty,
bugprone-string-constructor,
bugprone-string-integer-assignment,
bugprone-string-literal-with-embedded-nul,
bugprone-stringview-nullptr,
bugprone-suspicious-enum-usage,
bugprone-suspicious-include,
bugprone-suspicious-memory-comparison,
bugprone-suspicious-memset-usage,
bugprone-suspicious-missing-comma,
bugprone-suspicious-realloc-usage,
bugprone-suspicious-semicolon,
bugprone-suspicious-string-compare,
bugprone-suspicious-stringview-data-usage,
bugprone-swapped-arguments,
bugprone-switch-missing-default-case,
bugprone-terminating-continue,
bugprone-throw-keyword-missing,
bugprone-too-small-loop-variable,
bugprone-undefined-memory-manipulation,
bugprone-undelegated-constructor,
bugprone-unhandled-exception-at-new,
bugprone-unhandled-self-assignment,
bugprone-unique-ptr-array-mismatch,
bugprone-unsafe-functions,
bugprone-unused-local-non-trivial-variable,
bugprone-unused-raii,
bugprone-unused-return-value,
bugprone-use-after-move,
bugprone-virtual-near-miss,
cppcoreguidelines-init-variables,
cppcoreguidelines-misleading-capture-default-by-value,
cppcoreguidelines-no-suspend-with-lock,
cppcoreguidelines-pro-type-member-init,
cppcoreguidelines-pro-type-static-cast-downcast,
cppcoreguidelines-rvalue-reference-param-not-moved,
cppcoreguidelines-use-default-member-init,
cppcoreguidelines-virtual-class-destructor,
hicpp-ignored-remove-result,
llvm-namespace-comment,
misc-const-correctness,
misc-definitions-in-headers,
misc-header-include-cycle,
misc-include-cleaner,
misc-misplaced-const,
misc-redundant-expression,
misc-static-assert,
misc-throw-by-value-catch-by-reference,
misc-unused-alias-decls,
misc-unused-using-decls,
modernize-concat-nested-namespaces,
modernize-deprecated-headers,
modernize-make-shared,
modernize-make-unique,
modernize-pass-by-value,
modernize-type-traits,
modernize-use-designated-initializers,
modernize-use-emplace,
modernize-use-equals-default,
modernize-use-equals-delete,
modernize-use-override,
modernize-use-ranges,
modernize-use-starts-ends-with,
modernize-use-std-numbers,
modernize-use-using,
performance-faster-string-find,
performance-for-range-copy,
performance-implicit-conversion-in-loop,
performance-inefficient-vector-operation,
performance-move-const-arg,
performance-move-constructor-init,
performance-no-automatic-move,
performance-trivially-destructible,
readability-avoid-nested-conditional-operator,
readability-avoid-return-with-void-value,
readability-braces-around-statements,
readability-const-return-type,
readability-container-contains,
readability-container-size-empty,
readability-convert-member-functions-to-static,
readability-duplicate-include,
readability-else-after-return,
readability-enum-initial-value,
readability-implicit-bool-conversion,
readability-inconsistent-declaration-parameter-name,
readability-identifier-naming,
readability-make-member-function-const,
readability-math-missing-parentheses,
readability-misleading-indentation,
readability-non-const-parameter,
readability-redundant-casting,
readability-redundant-declaration,
readability-redundant-inline-specifier,
readability-redundant-member-init,
readability-redundant-string-init,
readability-reference-to-constructed-temporary,
readability-simplify-boolean-expr,
readability-static-accessed-through-instance,
readability-static-definition-in-anonymous-namespace,
readability-suspicious-call-argument,
readability-use-std-min-max
"
CheckOptions:
readability-braces-around-statements.ShortStatementLines: 2
readability-identifier-naming.MacroDefinitionCase: UPPER_CASE
readability-identifier-naming.ClassCase: CamelCase
readability-identifier-naming.StructCase: CamelCase
readability-identifier-naming.UnionCase: CamelCase
readability-identifier-naming.EnumCase: CamelCase
readability-identifier-naming.EnumConstantCase: CamelCase
readability-identifier-naming.ScopedEnumConstantCase: CamelCase
readability-identifier-naming.GlobalConstantCase: UPPER_CASE
readability-identifier-naming.GlobalConstantPrefix: "k"
readability-identifier-naming.GlobalVariableCase: CamelCase
readability-identifier-naming.GlobalVariablePrefix: "g"
readability-identifier-naming.ConstexprFunctionCase: camelBack
readability-identifier-naming.ConstexprMethodCase: camelBack
readability-identifier-naming.ClassMethodCase: camelBack
readability-identifier-naming.ClassMemberCase: camelBack
readability-identifier-naming.ClassConstantCase: UPPER_CASE
readability-identifier-naming.ClassConstantPrefix: "k"
readability-identifier-naming.StaticConstantCase: UPPER_CASE
readability-identifier-naming.StaticConstantPrefix: "k"
readability-identifier-naming.StaticVariableCase: UPPER_CASE
readability-identifier-naming.StaticVariablePrefix: "k"
readability-identifier-naming.ConstexprVariableCase: UPPER_CASE
readability-identifier-naming.ConstexprVariablePrefix: "k"
readability-identifier-naming.LocalConstantCase: camelBack
readability-identifier-naming.LocalVariableCase: camelBack
readability-identifier-naming.TemplateParameterCase: CamelCase
readability-identifier-naming.ParameterCase: camelBack
readability-identifier-naming.FunctionCase: camelBack
readability-identifier-naming.MemberCase: camelBack
readability-identifier-naming.PrivateMemberSuffix: _
readability-identifier-naming.ProtectedMemberSuffix: _
readability-identifier-naming.PublicMemberSuffix: ""
readability-identifier-naming.FunctionIgnoredRegexp: ".*tag_invoke.*"
bugprone-unsafe-functions.ReportMoreUnsafeFunctions: true
bugprone-unused-return-value.CheckedReturnTypes: ::std::error_code;::std::error_condition;::std::errc
misc-include-cleaner.IgnoreHeaders: '.*/(detail|impl)/.*;.*(expected|unexpected).*;.*ranges_lower_bound\.h;time.h;stdlib.h;__chrono/.*;fmt/chrono.h;boost/uuid/uuid_hash.hpp'
HeaderFilterRegex: '^.*/(src|tests)/.*\.(h|hpp)$'
WarningsAsErrors: "*"
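
The readability-identifier-naming options above encode a naming scheme: CamelCase types and enum constants, camelBack functions, parameters, and members, a trailing underscore on private and protected members, and UPPER_CASE constants carrying a "k" prefix. The snippet below is a hypothetical example written to that scheme, not code from the repository.

#include <cstddef>
#include <string>

constexpr std::size_t kMAX_BATCH_SIZE = 256;  // ConstexprVariableCase: UPPER_CASE with "k" prefix

enum class LoadResult { Success, NotFound };  // EnumCase / ScopedEnumConstantCase: CamelCase

class LedgerFetcher  // ClassCase: CamelCase
{
public:
    LoadResult fetchByHash(std::string const& ledgerHash);  // camelBack method and parameter

private:
    std::size_t fetchedCount_ = 0;  // PrivateMemberSuffix: "_"
};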

.clangd (13 changed lines)

@@ -1,13 +0,0 @@
CompileFlags:
Add: [-D__cpp_concepts=202002]
Diagnostics:
UnusedIncludes: Strict
MissingIncludes: Strict
Includes:
IgnoreHeader:
- ".*/(detail|impl)/.*"
- ".*expected.*"
- ".*ranges_lower_bound.h"
- "time.h"
- "stdlib.h"


@@ -1,245 +0,0 @@
_help_parse: Options affecting listfile parsing
parse:
_help_additional_commands:
- Specify structure for custom cmake functions
additional_commands:
foo:
flags:
- BAR
- BAZ
kwargs:
HEADERS: "*"
SOURCES: "*"
DEPENDS: "*"
_help_override_spec:
- Override configurations per-command where available
override_spec: {}
_help_vartags:
- Specify variable tags.
vartags: []
_help_proptags:
- Specify property tags.
proptags: []
_help_format: Options affecting formatting.
format:
_help_disable:
- Disable formatting entirely, making cmake-format a no-op
disable: false
_help_line_width:
- How wide to allow formatted cmake files
line_width: 120
_help_tab_size:
- How many spaces to tab for indent
tab_size: 2
_help_use_tabchars:
- If true, lines are indented using tab characters (utf-8
- 0x09) instead of <tab_size> space characters (utf-8 0x20).
- In cases where the layout would require a fractional tab
- character, the behavior of the fractional indentation is
- governed by <fractional_tab_policy>
use_tabchars: false
_help_fractional_tab_policy:
- If <use_tabchars> is True, then the value of this variable
- indicates how fractional indentions are handled during
- whitespace replacement. If set to 'use-space', fractional
- indentation is left as spaces (utf-8 0x20). If set to
- "`round-up` fractional indentation is replaced with a single"
- tab character (utf-8 0x09) effectively shifting the column
- to the next tabstop
fractional_tab_policy: use-space
_help_max_subgroups_hwrap:
- If an argument group contains more than this many sub-groups
- (parg or kwarg groups) then force it to a vertical layout.
max_subgroups_hwrap: 4
_help_max_pargs_hwrap:
- If a positional argument group contains more than this many
- arguments, then force it to a vertical layout.
max_pargs_hwrap: 6
_help_max_rows_cmdline:
- If a cmdline positional group consumes more than this many
- lines without nesting, then invalidate the layout (and nest)
max_rows_cmdline: 2
_help_separate_ctrl_name_with_space:
- If true, separate flow control names from their parentheses
- with a space
separate_ctrl_name_with_space: true
_help_separate_fn_name_with_space:
- If true, separate function names from parentheses with a
- space
separate_fn_name_with_space: false
_help_dangle_parens:
- If a statement is wrapped to more than one line, than dangle
- the closing parenthesis on its own line.
dangle_parens: true
_help_dangle_align:
- If the trailing parenthesis must be 'dangled' on its own
- "line, then align it to this reference: `prefix`: the start"
- "of the statement, `prefix-indent`: the start of the"
- "statement, plus one indentation level, `child`: align to"
- the column of the arguments
dangle_align: prefix
_help_min_prefix_chars:
- If the statement spelling length (including space and
- parenthesis) is smaller than this amount, then force reject
- nested layouts.
min_prefix_chars: 4
_help_max_prefix_chars:
- If the statement spelling length (including space and
- parenthesis) is larger than the tab width by more than this
- amount, then force reject un-nested layouts.
max_prefix_chars: 10
_help_max_lines_hwrap:
- If a candidate layout is wrapped horizontally but it exceeds
- this many lines, then reject the layout.
max_lines_hwrap: 2
_help_line_ending:
- What style line endings to use in the output.
line_ending: unix
_help_command_case:
- Format command names consistently as 'lower' or 'upper' case
command_case: canonical
_help_keyword_case:
- Format keywords consistently as 'lower' or 'upper' case
keyword_case: unchanged
_help_always_wrap:
- A list of command names which should always be wrapped
always_wrap: []
_help_enable_sort:
- If true, the argument lists which are known to be sortable
- will be sorted lexicographically
enable_sort: true
_help_autosort:
- If true, the parsers may infer whether or not an argument
- list is sortable (without annotation).
autosort: true
_help_require_valid_layout:
- By default, if cmake-format cannot successfully fit
- everything into the desired linewidth it will apply the
- last, most aggressive attempt that it made. If this flag is
- True, however, cmake-format will print error, exit with non-
- zero status code, and write-out nothing
require_valid_layout: false
_help_layout_passes:
- A dictionary mapping layout nodes to a list of wrap
- decisions. See the documentation for more information.
layout_passes: {}
_help_markup: Options affecting comment reflow and formatting.
markup:
_help_bullet_char:
- What character to use for bulleted lists
bullet_char: "*"
_help_enum_char:
- What character to use as punctuation after numerals in an
- enumerated list
enum_char: .
_help_first_comment_is_literal:
- If comment markup is enabled, don't reflow the first comment
- block in each listfile. Use this to preserve formatting of
- your copyright/license statements.
first_comment_is_literal: false
_help_literal_comment_pattern:
- If comment markup is enabled, don't reflow any comment block
- which matches this (regex) pattern. Default is `None`
- (disabled).
literal_comment_pattern: null
_help_fence_pattern:
- Regular expression to match preformat fences in comments
- default= ``r'^\s*([`~]{3}[`~]*)(.*)$'``
fence_pattern: ^\s*([`~]{3}[`~]*)(.*)$
_help_ruler_pattern:
- Regular expression to match rulers in comments default=
- '``r''^\s*[^\w\s]{3}.*[^\w\s]{3}$''``'
ruler_pattern: ^\s*[^\w\s]{3}.*[^\w\s]{3}$
_help_explicit_trailing_pattern:
- If a comment line starts with this pattern then it
- is explicitly a trailing comment for the preceding
- argument. Default is '#<'
explicit_trailing_pattern: "#<"
_help_hashruler_min_length:
- If a comment line starts with at least this many consecutive
- hash characters, then don't lstrip() them off. This allows
- for lazy hash rulers where the first hash char is not
- separated by space
hashruler_min_length: 10
_help_canonicalize_hashrulers:
- If true, then insert a space between the first hash char and
- remaining hash chars in a hash ruler, and normalize its
- length to fill the column
canonicalize_hashrulers: true
_help_enable_markup:
- enable comment markup parsing and reflow
enable_markup: true
_help_lint: Options affecting the linter
lint:
_help_disabled_codes:
- a list of lint codes to disable
disabled_codes: []
_help_function_pattern:
- regular expression pattern describing valid function names
function_pattern: "[0-9a-z_]+"
_help_macro_pattern:
- regular expression pattern describing valid macro names
macro_pattern: "[0-9A-Z_]+"
_help_global_var_pattern:
- regular expression pattern describing valid names for
- variables with global (cache) scope
global_var_pattern: "[A-Z][0-9A-Z_]+"
_help_internal_var_pattern:
- regular expression pattern describing valid names for
- variables with global scope (but internal semantic)
internal_var_pattern: _[A-Z][0-9A-Z_]+
_help_local_var_pattern:
- regular expression pattern describing valid names for
- variables with local scope
local_var_pattern: "[a-z][a-z0-9_]+"
_help_private_var_pattern:
- regular expression pattern describing valid names for
- private directory variables
private_var_pattern: _[0-9a-z_]+
_help_public_var_pattern:
- regular expression pattern describing valid names for public
- directory variables
public_var_pattern: "[A-Z][0-9A-Z_]+"
_help_argument_var_pattern:
- regular expression pattern describing valid names for
- function/macro arguments and loop variables.
argument_var_pattern: "[a-z][a-z0-9_]+"
_help_keyword_pattern:
- regular expression pattern describing valid names for
- keywords used in functions or macros
keyword_pattern: "[A-Z][0-9A-Z_]+"
_help_max_conditionals_custom_parser:
- In the heuristic for C0201, how many conditionals to match
- within a loop in before considering the loop a parser.
max_conditionals_custom_parser: 2
_help_min_statement_spacing:
- Require at least this many newlines between statements
min_statement_spacing: 1
_help_max_statement_spacing:
- Require no more than this many newlines between statements
max_statement_spacing: 2
max_returns: 6
max_branches: 12
max_arguments: 5
max_localvars: 15
max_statements: 50
_help_encode: Options affecting file encoding
encode:
_help_emit_byteorder_mark:
- If true, emit the unicode byte-order mark (BOM) at the start
- of the file
emit_byteorder_mark: false
_help_input_encoding:
- Specify the encoding of the input file. Defaults to utf-8
input_encoding: utf-8
_help_output_encoding:
- Specify the encoding of the output file. Defaults to utf-8.
- Note that cmake only claims to support utf-8 so be careful
- when using anything else
output_encoding: utf-8
_help_misc: Miscellaneous configurations options.
misc:
_help_per_command:
- A dictionary containing any per-command configuration
- overrides. Currently only `command_case` is supported.
per_command: {}


@@ -1,23 +0,0 @@
coverage:
status:
project:
default:
target: 50%
threshold: 2%
patch:
default:
target: 20% # Need to bump this number https://docs.codecov.com/docs/commit-status#patch-status
threshold: 2%
# `codecov/codecov-action` reruns `gcovr` if build files present
# That's why we run it in a separate workflow
# This ignore list is not currently used
#
# More info: https://github.com/XRPLF/clio/pull/2066
ignore:
- "benchmarks"
- "tests"
- "src/data/cassandra/"
- "src/data/CassandraBackend.hpp"
- "src/data/BackendFactory.*"


@@ -7,4 +7,3 @@
# clang-format
e41150248a97e4bdc1cf21b54650c4bb7c63928e
2e542e7b0d94451a933c88778461cc8d3d7e6417
d816ef54abd8e8e979b9c795bdb657a8d18f5e95

.gitattributes (2 changed lines)

@@ -1,2 +0,0 @@
*.png filter=lfs diff=lfs merge=lfs -text
*.svg filter=lfs diff=lfs merge=lfs -text

.githooks/pre-commit (executable, 25 changed lines)

@@ -0,0 +1,25 @@
#!/bin/bash
exec 1>&2
# paths to check and re-format
sources="src unittests"
formatter="clang-format -i"
first=$(git diff $sources)
find $sources -type f \( -name '*.cpp' -o -name '*.h' -o -name '*.ipp' \) -print0 | xargs -0 $formatter
second=$(git diff $sources)
changes=$(diff <(echo "$first") <(echo "$second") | wc -l | sed -e 's/^[[:space:]]*//')
if [ "$changes" != "0" ]; then
cat <<\EOF
WARNING
-----------------------------------------------------------------------------
Automatically re-formatted code with `clang-format` - commit was aborted.
Please manually add any updated files and commit again.
-----------------------------------------------------------------------------
EOF
exit 1
fi


@@ -1,36 +0,0 @@
---
name: Bug report
about: Create a report to help us improve
title: "[Title with short description] (Version: [Clio version])"
labels: bug
assignees: ""
---
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Kindly refrain from posting any credentials or sensitive information in this issue -->
## Issue Description
<!-- Provide a summary for your issue/bug. -->
## Steps to Reproduce
<!-- List in detail the exact steps to reproduce the unexpected behavior of the software. -->
## Expected Result
<!-- Explain in detail what behavior you expected to happen. -->
## Actual Result
<!-- Explain in detail what behavior actually happened. -->
## Environment
<!-- Please describe your environment setup (such as Ubuntu 20.04.2 with Boost 1.82). -->
<!-- Please use the version returned by './clio_server --version' as the version number -->
## Supporting Files
<!-- If you have supporting files such as a log, feel free to post a link here using Github Gist. -->
<!-- Consider adding configuration files with private information removed via Github Gist. -->


@@ -1,26 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
title: "[Title with short description] (Version: [Clio version])"
labels: enhancement
assignees: ""
---
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Kindly refrain from posting any credentials or sensitive information in this issue -->
## Summary
<!-- Provide a summary to the feature request -->
## Motivation
<!-- Why do we need this feature? -->
## Solution
<!-- What is the solution? -->
## Paths Not Taken
<!-- What other alternatives have been considered? -->


@@ -1,19 +0,0 @@
---
name: Question
about: A question in form of an issue
title: "[Title with short description] (Version: Clio version)"
labels: question
assignees: ""
---
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Consider starting a [discussion](https://github.com/XRPLF/clio/discussions) instead. -->
<!-- Kindly refrain from posting any credentials or sensitive information in this issue -->
## Question
<!-- Your question -->
## Paths Not Taken
<!-- If applicable, what other alternatives have been considered? -->


@@ -1,31 +0,0 @@
name: Build clio
description: Build clio in build directory
inputs:
targets:
description: Space-separated build target names
default: all
subtract_threads:
description: An option for the action get-threads-number.
required: true
default: "0"
runs:
using: composite
steps:
- name: Get number of threads
uses: ./.github/actions/get-threads-number
id: number_of_threads
with:
subtract_threads: ${{ inputs.subtract_threads }}
- name: Build targets
shell: bash
env:
CMAKE_TARGETS: ${{ inputs.targets }}
run: |
cd build
cmake \
--build . \
--parallel "${{ steps.number_of_threads.outputs.threads_number }}" \
--target ${CMAKE_TARGETS}


@@ -1,68 +0,0 @@
name: Build and push Docker image
description: Build and push Docker image to DockerHub and GitHub Container Registry
inputs:
images:
description: Name of the images to use as a base name
required: true
push_image:
description: Whether to push the image to the registry (true/false)
required: true
directory:
description: The directory containing the Dockerfile
required: true
tags:
description: Comma separated tags to apply to the image
required: true
platforms:
description: Platforms to build the image for (e.g. linux/amd64,linux/arm64)
required: true
build_args:
description: List of build-time variables
required: false
dockerhub_repo:
description: DockerHub repository name
required: false
default: ""
dockerhub_description:
description: Short description of the image
required: false
runs:
using: composite
steps:
- name: Login to DockerHub
if: ${{ inputs.push_image == 'true' && inputs.dockerhub_repo != '' }}
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_PW }}
- name: Login to GitHub Container Registry
if: ${{ inputs.push_image == 'true' }}
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ env.GITHUB_TOKEN }}
- uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392 # v3.6.0
with:
cache-image: false
- uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- uses: docker/metadata-action@c1e51972afc2121e065aed6d45c65596fe445f3f # v5.8.0
id: meta
with:
images: ${{ inputs.images }}
tags: ${{ inputs.tags }}
- name: Build and push
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: ${{ inputs.directory }}
platforms: ${{ inputs.platforms }}
push: ${{ inputs.push_image == 'true' }}
tags: ${{ steps.meta.outputs.tags }}
build-args: ${{ inputs.build_args }}


@@ -1,73 +0,0 @@
name: Run CMake
description: Run CMake to generate build files
inputs:
build_dir:
description: Build directory
required: false
default: "build"
conan_profile:
description: Conan profile name
required: true
build_type:
description: Build type for third-party libraries and clio. Could be 'Release', 'Debug'
required: true
default: "Release"
integration_tests:
description: Whether to generate target integration tests
required: true
default: "true"
benchmark:
description: Whether to generate targets for benchmarks
required: true
default: "true"
code_coverage:
description: Whether to enable code coverage
required: true
default: "false"
static:
description: Whether Clio is to be statically linked
required: true
default: "false"
time_trace:
description: Whether to enable compiler trace reports
required: true
default: "false"
package:
description: Whether to generate Debian package
required: true
default: "false"
runs:
using: composite
steps:
- name: Run cmake
shell: bash
env:
BUILD_TYPE: "${{ inputs.build_type }}"
SANITIZER_OPTION: |-
${{ endsWith(inputs.conan_profile, '.asan') && '-Dsan=address' ||
endsWith(inputs.conan_profile, '.tsan') && '-Dsan=thread' ||
endsWith(inputs.conan_profile, '.ubsan') && '-Dsan=undefined' ||
'' }}
INTEGRATION_TESTS: "${{ inputs.integration_tests == 'true' && 'ON' || 'OFF' }}"
BENCHMARK: "${{ inputs.benchmark == 'true' && 'ON' || 'OFF' }}"
COVERAGE: "${{ inputs.code_coverage == 'true' && 'ON' || 'OFF' }}"
STATIC: "${{ inputs.static == 'true' && 'ON' || 'OFF' }}"
TIME_TRACE: "${{ inputs.time_trace == 'true' && 'ON' || 'OFF' }}"
PACKAGE: "${{ inputs.package == 'true' && 'ON' || 'OFF' }}"
run: |
cmake \
-B ${{inputs.build_dir}} \
-S . \
-G Ninja \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE="${BUILD_TYPE}" \
"${SANITIZER_OPTION}" \
-Dtests=ON \
-Dintegration_tests="${INTEGRATION_TESTS}" \
-Dbenchmark="${BENCHMARK}" \
-Dcoverage="${COVERAGE}" \
-Dstatic="${STATIC}" \
-Dtime_trace="${TIME_TRACE}" \
-Dpackage="${PACKAGE}"


@@ -1,31 +0,0 @@
name: Generate code coverage report
description: Run tests, generate code coverage report and upload it to codecov.io
runs:
using: composite
steps:
- name: Run tests
shell: bash
run: |
build/clio_tests
# Please keep exclude list in sync with .codecov.yml
- name: Run gcovr
shell: bash
run: |
gcovr \
-e benchmarks \
-e tests \
-e src/data/cassandra \
-e src/data/CassandraBackend.hpp \
-e 'src/data/BackendFactory.*' \
--xml build/coverage_report.xml \
-j8 --exclude-throw-branches
- name: Archive coverage report
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: coverage-report.xml
path: build/coverage_report.xml
retention-days: 30


@@ -1,38 +0,0 @@
name: Run Conan
description: Run conan to install dependencies
inputs:
build_dir:
description: Build directory
required: false
default: "build"
conan_profile:
description: Conan profile name
required: true
force_conan_source_build:
description: Whether conan should build all dependencies from source
required: true
default: "false"
build_type:
description: Build type for third-party libraries and clio. Could be 'Release', 'Debug'
required: true
default: "Release"
runs:
using: composite
steps:
- name: Create build directory
shell: bash
run: mkdir -p "${{ inputs.build_dir }}"
- name: Run conan
shell: bash
env:
CONAN_BUILD_OPTION: "${{ inputs.force_conan_source_build == 'true' && '*' || 'missing' }}"
run: |
conan \
install . \
-of build \
-b "$CONAN_BUILD_OPTION" \
-s "build_type=${{ inputs.build_type }}" \
--profile:all "${{ inputs.conan_profile }}"


@@ -1,46 +0,0 @@
name: Create an issue
description: Create an issue
inputs:
title:
description: Issue title
required: true
body:
description: Issue body
required: true
labels:
description: Comma-separated list of labels
required: true
default: "bug"
assignees:
description: Comma-separated list of assignees
required: true
default: "godexsoft,kuznetsss,PeterChen13579,mathbunnyru"
outputs:
created_issue_id:
description: Created issue id
value: ${{ steps.create_issue.outputs.created_issue }}
runs:
using: composite
steps:
- name: Create an issue
id: create_issue
shell: bash
env:
ISSUE_BODY: ${{ inputs.body }}
ISSUE_ASSIGNEES: ${{ inputs.assignees }}
ISSUE_LABELS: ${{ inputs.labels }}
ISSUE_TITLE: ${{ inputs.title }}
run: |
echo -e "${ISSUE_BODY}" > issue.md
gh issue create \
--assignee "${ISSUE_ASSIGNEES}" \
--label "${ISSUE_LABELS}" \
--title "${ISSUE_TITLE}" \
--body-file ./issue.md \
> create_issue.log
created_issue="$(sed 's|.*/||' create_issue.log)"
echo "created_issue=$created_issue" >> $GITHUB_OUTPUT
rm create_issue.log issue.md


@@ -1,38 +0,0 @@
name: Get number of threads
description: Determines number of threads to use on macOS and Linux
inputs:
subtract_threads:
description: How many threads to subtract from the calculated number
required: true
default: "0"
outputs:
threads_number:
description: Number of threads to use
value: ${{ steps.number_of_threads_export.outputs.num }}
runs:
using: composite
steps:
- name: Get number of threads on mac
id: mac_threads
if: ${{ runner.os == 'macOS' }}
shell: bash
run: echo "num=$(($(sysctl -n hw.logicalcpu) - 2))" >> $GITHUB_OUTPUT
- name: Get number of threads on Linux
id: linux_threads
if: ${{ runner.os == 'Linux' }}
shell: bash
run: echo "num=$(($(nproc) - 2))" >> $GITHUB_OUTPUT
- name: Shift and export number of threads
id: number_of_threads_export
shell: bash
env:
SUBTRACT_THREADS: ${{ inputs.subtract_threads }}
run: |
num_of_threads="${{ steps.mac_threads.outputs.num || steps.linux_threads.outputs.num }}"
shift_by="${SUBTRACT_THREADS}"
shifted="$((num_of_threads - shift_by))"
echo "num=$(( shifted > 1 ? shifted : 1 ))" >> $GITHUB_OUTPUT
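
The action above reserves two cores, optionally subtracts a few more, and never reports fewer than one thread. Purely for illustration, the same clamping arithmetic sketched in C++ (not part of the repository):

#include <algorithm>
#include <cstdio>
#include <thread>

// Mirrors the shell arithmetic above: (logical CPUs - 2 - subtract), but at least 1.
unsigned
buildThreads(unsigned subtractThreads)
{
    unsigned const logicalCpus = std::max(1u, std::thread::hardware_concurrency());
    unsigned const reserved = 2 + subtractThreads;
    return logicalCpus > reserved ? logicalCpus - reserved : 1u;
}

int
main()
{
    std::printf("threads: %u\n", buildThreads(0));
}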


@@ -1,16 +0,0 @@
name: Git common ancestor
description: Find the closest common commit
outputs:
commit:
description: Hash of commit
value: ${{ steps.find_common_ancestor.outputs.commit }}
runs:
using: composite
steps:
- name: Find common git ancestor
id: find_common_ancestor
shell: bash
run: |
echo "commit=\"$(git merge-base --fork-point origin/develop)\"" >> $GITHUB_OUTPUT

.github/actions/lint/action.yml (13 changed lines)

@@ -0,0 +1,13 @@
runs:
using: composite
steps:
# Github's ubuntu-20.04 image already has clang-format-11 installed
- run: |
find src unittests -type f \( -name '*.cpp' -o -name '*.h' -o -name '*.ipp' \) -print0 | xargs -0 clang-format-11 -i
shell: bash
- name: Check for differences
id: assert
shell: bash
run: |
git diff --color --exit-code | tee "clang-format.patch"


@@ -1,38 +0,0 @@
name: Restore cache
description: Find and restores ccache cache
inputs:
conan_profile:
description: Conan profile name
required: true
ccache_dir:
description: Path to .ccache directory
required: true
build_type:
description: Current build type (e.g. Release, Debug)
required: true
default: Release
code_coverage:
description: Whether code coverage is on
required: true
default: "false"
outputs:
ccache_cache_hit:
description: True if ccache cache has been downloaded
value: ${{ steps.ccache_cache.outputs.cache-hit }}
runs:
using: composite
steps:
- name: Find common commit
id: git_common_ancestor
uses: ./.github/actions/git-common-ancestor
- name: Restore ccache cache
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
id: ccache_cache
if: ${{ env.CCACHE_DISABLE != '1' }}
with:
path: ${{ inputs.ccache_dir }}
key: clio-ccache-${{ runner.os }}-${{ inputs.build_type }}${{ inputs.code_coverage == 'true' && '-code_coverage' || '' }}-${{ inputs.conan_profile }}-develop-${{ steps.git_common_ancestor.outputs.commit }}


@@ -1,38 +0,0 @@
name: Save cache
description: Save ccache cache for develop branch
inputs:
conan_profile:
description: Conan profile name
required: true
ccache_dir:
description: Path to .ccache directory
required: true
build_type:
description: Current build type (e.g. Release, Debug)
required: true
default: Release
code_coverage:
description: Whether code coverage is on
required: true
default: "false"
ccache_cache_hit:
description: Whether ccache cache has been downloaded
required: true
ccache_cache_miss_rate:
description: How many ccache cache misses happened
runs:
using: composite
steps:
- name: Find common commit
id: git_common_ancestor
uses: ./.github/actions/git-common-ancestor
- name: Save ccache cache
if: ${{ inputs.ccache_cache_hit != 'true' || inputs.ccache_cache_miss_rate == '100.0' }}
uses: actions/cache/save@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
with:
path: ${{ inputs.ccache_dir }}
key: clio-ccache-${{ runner.os }}-${{ inputs.build_type }}${{ inputs.code_coverage == 'true' && '-code_coverage' || '' }}-${{ inputs.conan_profile }}-develop-${{ steps.git_common_ancestor.outputs.commit }}

.github/actions/test/Dockerfile (6 changed lines)

@@ -0,0 +1,6 @@
FROM cassandra:4.0.4
RUN apt-get update && apt-get install -y postgresql
COPY entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]

.github/actions/test/entrypoint.sh (executable, 8 changed lines)

@@ -0,0 +1,8 @@
#!/bin/bash
pg_ctlcluster 12 main start
su postgres -c"psql -c\"alter user postgres with password 'postgres'\""
su cassandra -c "/opt/cassandra/bin/cassandra -R"
sleep 90
chmod +x ./clio_tests
./clio_tests

.github/dependabot.yml (144 changed lines)

@@ -1,144 +0,0 @@
version: 2
updates:
- package-ecosystem: github-actions
directory: /
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/build-clio/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/build-docker-image/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/cmake/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/code-coverage/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/conan/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/create-issue/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/get-threads-number/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/git-common-ancestor/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/restore-cache/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop
- package-ecosystem: github-actions
directory: .github/actions/save-cache/
schedule:
interval: weekly
day: monday
time: "04:00"
timezone: Etc/GMT
reviewers:
- XRPLF/clio-dev-team
commit-message:
prefix: "ci: [DEPENDABOT] "
target-branch: develop


@@ -1,11 +0,0 @@
[settings]
arch={{detect_api.detect_arch()}}
build_type=Release
compiler=apple-clang
compiler.cppstd=20
compiler.libcxx=libc++
compiler.version=17
os=Macos
[conf]
grpc/1.50.1:tools.build:cxxflags+=["-Wno-missing-template-arg-list-after-template-kw"]


@@ -1,41 +0,0 @@
#!/usr/bin/env python3
import itertools
import json
LINUX_OS = ["heavy", "heavy-arm64"]
LINUX_CONTAINERS = [
'{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
]
LINUX_COMPILERS = ["gcc", "clang"]
MACOS_OS = ["macos15"]
MACOS_CONTAINERS = [""]
MACOS_COMPILERS = ["apple-clang"]
BUILD_TYPES = ["Release", "Debug"]
SANITIZER_EXT = [".asan", ".tsan", ".ubsan", ""]
def generate_matrix():
configurations = []
for os, container, compiler in itertools.chain(
itertools.product(LINUX_OS, LINUX_CONTAINERS, LINUX_COMPILERS),
itertools.product(MACOS_OS, MACOS_CONTAINERS, MACOS_COMPILERS),
):
for sanitizer_ext, build_type in itertools.product(SANITIZER_EXT, BUILD_TYPES):
configurations.append(
{
"os": os,
"container": container,
"compiler": compiler,
"sanitizer_ext": sanitizer_ext,
"build_type": build_type,
}
)
return {"include": configurations}
if __name__ == "__main__":
print(f"matrix={json.dumps(generate_matrix())}")


@@ -1,48 +0,0 @@
#!/bin/bash
set -ex
CURRENT_DIR="$(cd "$(dirname "$0")" && pwd)"
REPO_DIR="$(cd "$CURRENT_DIR/../../../" && pwd)"
CONAN_DIR="${CONAN_HOME:-$HOME/.conan2}"
PROFILES_DIR="$CONAN_DIR/profiles"
# When developers' compilers are updated, these profiles might be different
if [[ -z "$CI" ]]; then
APPLE_CLANG_PROFILE="$CURRENT_DIR/apple-clang-17.profile"
else
APPLE_CLANG_PROFILE="$CURRENT_DIR/apple-clang-17.profile"
fi
GCC_PROFILE="$REPO_DIR/docker/ci/conan/gcc.profile"
CLANG_PROFILE="$REPO_DIR/docker/ci/conan/clang.profile"
SANITIZER_TEMPLATE_FILE="$REPO_DIR/docker/ci/conan/sanitizer_template.profile"
rm -rf "$CONAN_DIR"
conan remote add --index 0 xrplf https://conan.ripplex.io
cp "$REPO_DIR/docker/ci/conan/global.conf" "$CONAN_DIR/global.conf"
create_profile_with_sanitizers() {
profile_name="$1"
profile_source="$2"
cp "$profile_source" "$PROFILES_DIR/$profile_name"
cp "$SANITIZER_TEMPLATE_FILE" "$PROFILES_DIR/$profile_name.asan"
cp "$SANITIZER_TEMPLATE_FILE" "$PROFILES_DIR/$profile_name.tsan"
cp "$SANITIZER_TEMPLATE_FILE" "$PROFILES_DIR/$profile_name.ubsan"
}
mkdir -p "$PROFILES_DIR"
if [[ "$(uname)" == "Darwin" ]]; then
create_profile_with_sanitizers "apple-clang" "$APPLE_CLANG_PROFILE"
echo "include(apple-clang)" > "$PROFILES_DIR/default"
else
create_profile_with_sanitizers "clang" "$CLANG_PROFILE"
create_profile_with_sanitizers "gcc" "$GCC_PROFILE"
echo "include(gcc)" > "$PROFILES_DIR/default"
fi


@@ -1,46 +0,0 @@
#!/bin/bash
set -o pipefail
# Note: This script is intended to be run from the root of the repository.
#
# This script runs each unit-test separately and generates reports from the currently active sanitizer.
# Output is saved in ./.sanitizer-report in the root of the repository
if [[ -z "$1" ]]; then
cat <<EOF
ERROR
-----------------------------------------------------------------------------
Path to clio_tests should be passed as first argument to the script.
-----------------------------------------------------------------------------
EOF
exit 1
fi
TEST_BINARY=$1
if [[ ! -f "$TEST_BINARY" ]]; then
echo "Test binary not found: $TEST_BINARY"
exit 1
fi
TESTS=$($TEST_BINARY --gtest_list_tests | awk '/^ / {print suite $1} !/^ / {suite=$1}')
OUTPUT_DIR="./.sanitizer-report"
mkdir -p "$OUTPUT_DIR"
export TSAN_OPTIONS="die_after_fork=0"
export MallocNanoZone='0' # for MacOSX
for TEST in $TESTS; do
OUTPUT_FILE="$OUTPUT_DIR/${TEST//\//_}.log"
$TEST_BINARY --gtest_filter="$TEST" > "$OUTPUT_FILE" 2>&1
if [ $? -ne 0 ]; then
echo "'$TEST' failed a sanitizer check."
else
rm "$OUTPUT_FILE"
fi
done


@@ -1,24 +0,0 @@
#!/bin/bash
set -ex -o pipefail
BINARY_NAME="clio_server"
ARTIFACTS_DIR="$1"
if [ -z "${ARTIFACTS_DIR}" ]; then
echo "Usage: $0 <artifacts_directory>"
exit 1
fi
cd "${ARTIFACTS_DIR}" || exit 1
for artifact_name in $(ls); do
pushd "${artifact_name}" || exit 1
zip -r "../${artifact_name}.zip" ./${BINARY_NAME}
popd || exit 1
rm "${artifact_name}/${BINARY_NAME}"
rm -r "${artifact_name}"
sha256sum "./${artifact_name}.zip" > "./${artifact_name}.zip.sha256sum"
done


@@ -1,109 +0,0 @@
name: Build and publish Clio docker image
on:
workflow_call:
inputs:
tags:
required: true
type: string
description: Comma separated tags for docker image
artifact_name:
type: string
description: Name of Github artifact to put into docker image
strip_binary:
type: boolean
description: Whether to strip clio binary
default: true
publish_image:
type: boolean
description: Whether to publish docker image
required: true
workflow_dispatch:
inputs:
tags:
required: true
type: string
description: Comma separated tags for docker image
clio_server_binary_url:
required: true
type: string
description: Url to download clio_server binary from
binary_sha256:
required: true
type: string
description: sha256 hash of the binary
strip_binary:
type: boolean
description: Whether to strip clio binary
default: true
jobs:
build_and_publish_image:
name: Build and publish image
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Download Clio binary from artifact
if: ${{ inputs.artifact_name != null }}
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: ${{ inputs.artifact_name }}
path: ./docker/clio/artifact/
- name: Download Clio binary from url
if: ${{ inputs.clio_server_binary_url != null }}
shell: bash
env:
BINARY_URL: ${{ inputs.clio_server_binary_url }}
BINARY_SHA256: ${{ inputs.binary_sha256 }}
run: |
wget "${BINARY_URL}" -P ./docker/clio/artifact/
if [ "$(sha256sum ./docker/clio/clio_server | awk '{print $1}')" != "${BINARY_SHA256}" ]; then
echo "Binary sha256 sum doesn't match"
exit 1
fi
- name: Unpack binary
shell: bash
run: |
sudo apt update && sudo apt install -y tar unzip
cd docker/clio/artifact
artifact=$(find . -type f)
if [[ $artifact == *.zip ]]; then
unzip $artifact
elif [[ $artifact == *.tar.gz ]]; then
tar -xvf $artifact
fi
chmod +x ./clio_server
mv ./clio_server ../
cd ../
rm -rf ./artifact
- name: Strip binary
if: ${{ inputs.strip_binary }}
shell: bash
run: strip ./docker/clio/clio_server
- name: Set GHCR_REPO
id: set-ghcr-repo
run: |
echo "GHCR_REPO=$(echo ghcr.io/${{ github.repository_owner }} | tr '[:upper:]' '[:lower:]')" >> ${GITHUB_OUTPUT}
- name: Build Docker image
uses: ./.github/actions/build-docker-image
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_PW: ${{ secrets.DOCKERHUB_PW }}
with:
images: |
ghcr.io/${{ steps.set-ghcr-repo.outputs.GHCR_REPO }}/clio
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio' || '' }}
push_image: ${{ inputs.publish_image }}
directory: docker/clio
tags: ${{ inputs.tags }}
platforms: linux/amd64
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio' || '' }}
dockerhub_description: Clio is an XRP Ledger API server.


@@ -1,132 +1,147 @@
name: Build
name: Build Clio
on:
push:
branches: [release/*, develop]
branches: [master, release/*, develop, develop-next]
pull_request:
branches: [release/*, develop]
paths:
- .github/workflows/build.yml
- .github/workflows/reusable-build-test.yml
- .github/workflows/reusable-build.yml
- .github/workflows/reusable-test.yml
- .github/workflows/reusable-upload-coverage-report.yml
- ".github/actions/**"
- "!.github/actions/build-docker-image/**"
- "!.github/actions/create-issue/**"
- CMakeLists.txt
- conanfile.py
- conan.lock
- "cmake/**"
- "src/**"
- "tests/**"
- docs/config-description.md
branches: [master, release/*, develop, develop-next]
workflow_dispatch:
concurrency:
# Develop branch: Each run gets unique group (using run_number) for parallel execution
# Other branches: Shared group with cancel-in-progress to stop old runs when new commits are pushed
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.ref == 'refs/heads/develop' && github.run_number || 'branch' }}
cancel-in-progress: true
jobs:
build-and-test:
name: Build and Test
lint:
name: Lint
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v3
- name: Run clang-format
uses: ./.github/actions/lint
build_clio:
name: Build Clio
runs-on: [self-hosted, Linux]
needs: lint
strategy:
fail-fast: false
matrix:
os: [heavy]
conan_profile: [gcc, clang]
build_type: [Release, Debug]
container:
[
'{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }',
]
static: [true]
type:
- suffix: deb
image: rippleci/clio-dpkg-builder:2022-09-17
script: dpkg
- suffix: rpm
image: rippleci/clio-rpm-builder:2022-09-17
script: rpm
include:
- os: macos15
conan_profile: apple-clang
build_type: Release
container: ""
static: false
uses: ./.github/workflows/reusable-build-test.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
conan_profile: ${{ matrix.conan_profile }}
build_type: ${{ matrix.build_type }}
download_ccache: true
upload_ccache: true
static: ${{ matrix.static }}
run_unit_tests: true
run_integration_tests: false
upload_clio_server: true
code_coverage:
name: Run Code Coverage
uses: ./.github/workflows/reusable-build.yml
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
conan_profile: gcc
build_type: Debug
download_ccache: true
upload_ccache: false
code_coverage: true
static: true
upload_clio_server: false
targets: all
analyze_build_time: false
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
package:
name: Build packages
uses: ./.github/workflows/reusable-build.yml
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
conan_profile: gcc
build_type: Release
download_ccache: true
upload_ccache: false
code_coverage: false
static: true
upload_clio_server: false
package: true
targets: package
analyze_build_time: false
check_config:
name: Check Config Description
needs: build-and-test
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
image: ${{ matrix.type.image }}
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
- uses: actions/checkout@v3
with:
name: clio_server_Linux_Release_gcc
path: clio
- name: Compare Config Description
- name: Clone Clio packaging repo
uses: actions/checkout@v3
with:
path: clio-packages
repository: XRPLF/clio-packages
- name: Build
shell: bash
run: |
repoConfigFile=docs/config-description.md
configDescriptionFile=config_description_new.md
export CLIO_ROOT=$(realpath clio)
if [ ${{ matrix.type.suffix }} == "rpm" ]; then
source /opt/rh/devtoolset-11/enable
fi
cmake -S clio-packages -B clio-packages/build -DCLIO_ROOT=$CLIO_ROOT
cmake --build clio-packages/build --parallel $(nproc)
cp ./clio-packages/build/clio-prefix/src/clio-build/clio_tests .
mv ./clio-packages/build/*.${{ matrix.type.suffix }} .
chmod +x ./clio_server
./clio_server -d "${configDescriptionFile}"
- name: Artifact packages
uses: actions/upload-artifact@v3
with:
name: clio_${{ matrix.type.suffix }}_packages
path: ${{ github.workspace }}/*.${{ matrix.type.suffix }}
diff -u "${repoConfigFile}" "${configDescriptionFile}"
- name: Artifact clio_tests
uses: actions/upload-artifact@v3
with:
name: clio_tests-${{ matrix.type.suffix }}
path: ${{ github.workspace }}/clio_tests
build_dev:
name: ${{ matrix.os.name }} test
needs: lint
continue-on-error: ${{ matrix.os.experimental }}
strategy:
fail-fast: false
matrix:
os:
- name: ubuntu-22.04
experimental: true
- name: macos-11
experimental: true
- name: macos-12
experimental: false
runs-on: ${{ matrix.os.name }}
steps:
- uses: actions/checkout@v3
with:
path: clio
- name: Check Boost cache
id: boost
uses: actions/cache@v3
with:
path: boost
key: ${{ runner.os }}-boost
- name: Build boost
if: steps.boost.outputs.cache-hit != 'true'
run: |
curl -s -OJL "https://boostorg.jfrog.io/artifactory/main/release/1.77.0/source/boost_1_77_0.tar.gz"
tar zxf boost_1_77_0.tar.gz
mv boost_1_77_0 boost
cd boost
./bootstrap.sh
if [[ ${{ matrix.os.name }} =~ mac ]];then
mac_flags='cxxflags="-std=c++14"'
fi
./b2 ${mac_flags}
- name: install deps
run: |
if [[ ${{ matrix.os.name }} =~ mac ]];then
brew install pkg-config protobuf openssl ninja cassandra-cpp-driver bison
elif [[ ${{matrix.os.name }} =~ ubuntu ]];then
sudo apt-get -y install git pkg-config protobuf-compiler libprotobuf-dev libssl-dev wget build-essential doxygen bison flex autoconf clang-format
fi
- name: Build clio
run: |
export BOOST_ROOT=$(pwd)/boost
cd clio
cmake -B build
if ! cmake --build build -j$(nproc); then
echo '# 🔥${{ matrix.os.name }}🔥 failed!💥' >> $GITHUB_STEP_SUMMARY
fi
test_clio:
name: Test Clio
runs-on: [self-hosted, Linux]
needs: build_clio
strategy:
fail-fast: false
matrix:
suffix: [rpm, deb]
steps:
- uses: actions/checkout@v3
- name: Get clio_tests artifact
uses: actions/download-artifact@v3
with:
name: clio_tests-${{ matrix.suffix }}
- name: Run tests
timeout-minutes: 10
uses: ./.github/actions/test


@@ -1,106 +0,0 @@
name: Check new libXRPL
on:
repository_dispatch:
types: [check_libxrpl]
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
CONAN_PROFILE: gcc
jobs:
build:
name: Build Clio / `libXRPL ${{ github.event.client_payload.version }}`
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
- name: Update libXRPL version requirement
shell: bash
run: |
sed -i.bak -E "s|'xrpl/[a-zA-Z0-9\\.\\-]+'|'xrpl/${{ github.event.client_payload.conan_ref }}'|g" conanfile.py
rm -f conanfile.py.bak
- name: Update conan lockfile
shell: bash
run: |
conan lock create . --profile:all ${{ env.CONAN_PROFILE }}
- name: Run conan
uses: ./.github/actions/conan
with:
conan_profile: ${{ env.CONAN_PROFILE }}
- name: Run CMake
uses: ./.github/actions/cmake
with:
conan_profile: ${{ env.CONAN_PROFILE }}
- name: Build Clio
uses: ./.github/actions/build-clio
- name: Strip tests
run: strip build/clio_tests
- name: Upload clio_tests
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: clio_tests_check_libxrpl
path: build/clio_tests
run_tests:
name: Run tests
needs: build
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
steps:
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: clio_tests_check_libxrpl
- name: Run clio_tests
run: |
chmod +x ./clio_tests
./clio_tests
create_issue_on_failure:
name: Create an issue on failure
needs: [build, run_tests]
if: ${{ always() && contains(needs.*.result, 'failure') }}
runs-on: ubuntu-latest
permissions:
contents: write
issues: write
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Create an issue
uses: ./.github/actions/create-issue
env:
GH_TOKEN: ${{ github.token }}
with:
labels: "compatibility,bug"
title: "Proposed libXRPL check failed"
body: >
Clio build or tests failed against `libXRPL ${{ github.event.client_payload.conan_ref }}`.
PR: ${{ github.event.client_payload.pr_url }}
Workflow run: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/


@@ -1,26 +0,0 @@
name: Check PR title
on:
pull_request:
types: [opened, edited, reopened, synchronize]
branches: [develop]
jobs:
check_title:
runs-on: ubuntu-latest
steps:
- uses: ytanikin/pr-conventional-commits@b72758283dcbee706975950e96bc4bf323a8d8c0 # 1.4.2
with:
task_types: '["build","feat","fix","docs","test","ci","style","refactor","perf","chore"]'
add_label: false
custom_labels: '{"build":"build", "feat":"enhancement", "fix":"bug", "docs":"documentation", "test":"testability", "ci":"ci", "style":"refactoring", "refactor":"refactoring", "perf":"performance", "chore":"tooling"}'
- name: Check if message starts with upper-case letter
env:
PR_TITLE: ${{ github.event.pull_request.title }}
run: |
if [[ ! "${PR_TITLE}" =~ ^[a-z]+:\ [\[A-Z] ]]; then
echo "Error: PR title must start with an upper-case letter."
exit 1
fi
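
As a rough illustration of what the bash check above accepts: the regex requires a lower-case conventional-commit type, a colon and a space, then an upper-case letter or an opening bracket. The same test run against a few hypothetical titles:
#!/usr/bin/env bash
# Hypothetical PR titles, used only for this example
for PR_TITLE in "feat: Add new handler" "fix: [RPC] Correct paging" "feat: add new handler" "Add new handler"; do
  if [[ "${PR_TITLE}" =~ ^[a-z]+:\ [\[A-Z] ]]; then
    echo "PASS: ${PR_TITLE}"
  else
    echo "FAIL: ${PR_TITLE}"
  fi
done
# PASS: feat: Add new handler
# PASS: fix: [RPC] Correct paging
# FAIL: feat: add new handler
# FAIL: Add new handler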

View File

@@ -1,130 +0,0 @@
name: Clang-tidy check
on:
push:
branches: [develop]
schedule:
- cron: "0 9 * * 1-5"
workflow_dispatch:
pull_request:
branches: [develop]
paths:
- .github/workflows/clang-tidy.yml
- .clang_tidy
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
CONAN_PROFILE: clang
LLVM_TOOLS_VERSION: 20
jobs:
clang_tidy:
if: github.event_name != 'push' || contains(github.event.head_commit.message, 'clang-tidy auto fixes')
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
permissions:
contents: write
issues: write
pull-requests: write
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
- name: Restore cache
uses: ./.github/actions/restore-cache
id: restore_cache
with:
conan_profile: ${{ env.CONAN_PROFILE }}
ccache_dir: ${{ env.CCACHE_DIR }}
- name: Run conan
uses: ./.github/actions/conan
with:
conan_profile: ${{ env.CONAN_PROFILE }}
- name: Run CMake
uses: ./.github/actions/cmake
with:
conan_profile: ${{ env.CONAN_PROFILE }}
- name: Get number of threads
uses: ./.github/actions/get-threads-number
id: number_of_threads
- name: Run clang-tidy
continue-on-error: true
shell: bash
id: run_clang_tidy
run: |
run-clang-tidy-${{ env.LLVM_TOOLS_VERSION }} -p build -j "${{ steps.number_of_threads.outputs.threads_number }}" -fix -quiet 1>output.txt
- name: Fix local includes and clang-format style
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
shell: bash
run: |
pre-commit run --all-files fix-local-includes || true
pre-commit run --all-files clang-format || true
- name: Print issues found
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
shell: bash
run: |
sed -i '/error\||/!d' ./output.txt
cat output.txt
rm output.txt
- name: Create an issue
if: ${{ steps.run_clang_tidy.outcome != 'success' && github.event_name != 'pull_request' }}
id: create_issue
uses: ./.github/actions/create-issue
env:
GH_TOKEN: ${{ github.token }}
with:
title: "Clang-tidy found bugs in code 🐛"
body: >
Clang-tidy found issues in the code.
List of the issues found: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/
- uses: crazy-max/ghaction-import-gpg@e89d40939c28e39f97cf32126055eeae86ba74ec # v6.3.0
if: ${{ steps.run_clang_tidy.outcome != 'success' && github.event_name != 'pull_request' }}
with:
gpg_private_key: ${{ secrets.ACTIONS_GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.ACTIONS_GPG_PASSPHRASE }}
git_user_signingkey: true
git_commit_gpgsign: true
- name: Create PR with fixes
if: ${{ steps.run_clang_tidy.outcome != 'success' && github.event_name != 'pull_request' }}
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
env:
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ github.token }}
with:
commit-message: "[CI] clang-tidy auto fixes"
committer: Clio CI <skuznetsov@ripple.com>
branch: "clang_tidy/autofix"
branch-suffix: timestamp
delete-branch: true
title: "style: clang-tidy auto fixes"
body: "Fixes #${{ steps.create_issue.outputs.created_issue_id }}. Please review and commit clang-tidy fixes."
reviewers: "godexsoft,kuznetsss,PeterChen13579,mathbunnyru"
- name: Fail the job
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
shell: bash
run: exit 1
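
The "Print issues found" step relies on GNU sed alternation: the address /error\||/ matches lines containing either "error" or "|", and the !d deletes everything else. A small self-contained sketch with made-up run-clang-tidy output (the real output format may differ):
#!/usr/bin/env bash
# Hypothetical excerpt of run-clang-tidy output, used only for this example
cat > /tmp/output.txt <<'EOF'
Applying fixes ...
src/foo.cpp:42:7: error: use of undeclared identifier 'bar' [clang-diagnostic-error]
   42 |     bar();
clang-tidy finished
EOF
# Same filter as the workflow step (GNU sed, as in the CI container):
# keep only lines containing "error" or "|"
sed -i '/error\||/!d' /tmp/output.txt
cat /tmp/output.txt
# src/foo.cpp:42:7: error: use of undeclared identifier 'bar' [clang-diagnostic-error]
#    42 |     bar();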

View File

@@ -1,67 +0,0 @@
name: Documentation
on:
push:
branches: [develop]
workflow_dispatch:
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
build:
runs-on: ubuntu-latest
container:
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
lfs: true
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
- name: Create build directory
run: mkdir build_docs
- name: Configure CMake
working-directory: build_docs
run: cmake ../docs
- name: Build
working-directory: build_docs
run: cmake --build . --target docs
- name: Setup Pages
uses: actions/configure-pages@983d7736d9b0ae728b81ab479565c72886d7745b # v5.0.0
- name: Upload artifact
uses: actions/upload-pages-artifact@7b1f4a764d45c48632c6b24a0339c27f5614fb0b # v4.0.0
with:
path: build_docs/html
name: docs-develop
deploy:
needs: build
permissions:
pages: write
id-token: write
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
runs-on: ubuntu-latest
steps:
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@d6db90164ac5ed86f2b6aed7e0febac5b3c0c03e # v4.0.5
with:
artifact_name: docs-develop

View File

@@ -1,144 +0,0 @@
name: Nightly release
on:
schedule:
- cron: "0 8 * * 1-5"
workflow_dispatch:
pull_request:
paths:
- .github/workflows/nightly.yml
- .github/workflows/reusable-release.yml
- .github/workflows/reusable-build-test.yml
- .github/workflows/reusable-build.yml
- .github/workflows/reusable-test.yml
- .github/workflows/build-clio-docker-image.yml
- ".github/actions/**"
- "!.github/actions/code-coverage/**"
- .github/scripts/prepare-release-artifacts.sh
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
build-and-test:
name: Build and Test
strategy:
fail-fast: false
matrix:
include:
- os: macos15
conan_profile: apple-clang
build_type: Release
static: false
- os: heavy
conan_profile: gcc
build_type: Release
static: true
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
- os: heavy
conan_profile: gcc
build_type: Debug
static: true
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
- os: heavy
conan_profile: gcc.ubsan
build_type: Release
static: false
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
uses: ./.github/workflows/reusable-build-test.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
conan_profile: ${{ matrix.conan_profile }}
build_type: ${{ matrix.build_type }}
static: ${{ matrix.static }}
run_unit_tests: true
run_integration_tests: true
upload_clio_server: true
download_ccache: false
upload_ccache: false
analyze_build_time:
name: Analyze Build Time
strategy:
fail-fast: false
matrix:
include:
- os: heavy
conan_profile: clang
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
static: true
- os: macos15
conan_profile: apple-clang
container: ""
static: false
uses: ./.github/workflows/reusable-build.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
conan_profile: ${{ matrix.conan_profile }}
build_type: Release
download_ccache: false
upload_ccache: false
code_coverage: false
static: ${{ matrix.static }}
upload_clio_server: false
targets: all
analyze_build_time: true
nightly_release:
needs: build-and-test
uses: ./.github/workflows/reusable-release.yml
with:
overwrite_release: true
prerelease: true
title: "Clio development (nightly) build"
version: nightly
header: >
> **Note:** Please remember that this is a development release and it is not recommended for production use.
Changelog (including previous releases): <https://github.com/XRPLF/clio/commits/nightly>
generate_changelog: false
draft: false
build_and_publish_docker_image:
uses: ./.github/workflows/build-clio-docker-image.yml
needs: build-and-test
secrets: inherit
with:
tags: |
type=raw,value=nightly
type=raw,value=${{ github.sha }}
artifact_name: clio_server_Linux_Release_gcc
strip_binary: true
publish_image: ${{ github.event_name != 'pull_request' }}
create_issue_on_failure:
needs: [build-and-test, nightly_release, build_and_publish_docker_image]
if: ${{ always() && contains(needs.*.result, 'failure') && github.event_name != 'pull_request' }}
runs-on: ubuntu-latest
permissions:
contents: write
issues: write
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Create an issue
uses: ./.github/actions/create-issue
env:
GH_TOKEN: ${{ github.token }}
with:
title: "Nightly release failed 🌙"
body: >
Nightly release failed:
Workflow: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/

View File

@@ -1,22 +0,0 @@
name: Pre-commit auto-update
on:
# every first day of the month
schedule:
- cron: "0 0 1 * *"
pull_request:
branches: [release/*, develop]
paths:
- ".pre-commit-config.yaml"
workflow_dispatch:
jobs:
auto-update:
uses: XRPLF/actions/.github/workflows/pre-commit-autoupdate.yml@afbcbdafbe0ce5439492fb87eda6441371086386
with:
sign_commit: true
committer: "Clio CI <skuznetsov@ripple.com>"
reviewers: "godexsoft,kuznetsss,PeterChen13579,mathbunnyru"
secrets:
GPG_PRIVATE_KEY: ${{ secrets.ACTIONS_GPG_PRIVATE_KEY }}
GPG_PASSPHRASE: ${{ secrets.ACTIONS_GPG_PASSPHRASE }}

View File

@@ -1,14 +0,0 @@
name: Run pre-commit hooks
on:
pull_request:
push:
branches: [develop]
workflow_dispatch:
jobs:
run-hooks:
uses: XRPLF/actions/.github/workflows/pre-commit.yml@a8d7472b450eb53a1e5228f64552e5974457a21a
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-pre-commit:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'

View File

@@ -1,59 +0,0 @@
name: Create release
on:
push:
tags:
- "*.*.*"
pull_request:
paths:
- .github/workflows/release.yml
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
build-and-test:
name: Build and Test
strategy:
fail-fast: false
matrix:
include:
- os: macos15
conan_profile: apple-clang
build_type: Release
static: false
- os: heavy
conan_profile: gcc
build_type: Release
static: true
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
uses: ./.github/workflows/reusable-build-test.yml
with:
runs_on: ${{ matrix.os }}
container: ${{ matrix.container }}
conan_profile: ${{ matrix.conan_profile }}
build_type: ${{ matrix.build_type }}
static: ${{ matrix.static }}
run_unit_tests: true
run_integration_tests: true
upload_clio_server: true
download_ccache: false
upload_ccache: false
expected_version: ${{ github.event_name == 'push' && github.ref_name || '' }}
release:
needs: build-and-test
uses: ./.github/workflows/reusable-release.yml
with:
overwrite_release: false
prerelease: ${{ contains(github.ref_name, '-') }}
title: "${{ github.ref_name }}"
version: "${{ github.ref_name }}"
header: >
${{ contains(github.ref_name, '-') && '> **Note:** Please remember that this is a release candidate and it is not recommended for production use.' || '' }}
generate_changelog: ${{ !contains(github.ref_name, '-') }}
draft: ${{ !contains(github.ref_name, '-') }}
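
The prerelease, changelog, and draft switches above all key off whether the pushed tag name contains a hyphen. A quick sketch of how a few hypothetical tag names would be classified:
#!/usr/bin/env bash
# Hypothetical tag names; a '-' marks a release candidate or beta
for ref_name in "2.1.0" "2.1.0-rc1" "2.1.0-b2"; do
  if [[ "${ref_name}" == *-* ]]; then
    echo "${ref_name}: prerelease (no changelog generated, not a draft)"
  else
    echo "${ref_name}: full release (changelog generated, published as a draft)"
  fi
done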

View File

@@ -1,105 +0,0 @@
name: Reusable build and test
on:
workflow_call:
inputs:
runs_on:
description: Runner to run the job on
required: true
type: string
container:
description: "The container object as a JSON string (leave empty to run natively)"
required: true
type: string
conan_profile:
description: Conan profile to use
required: true
type: string
build_type:
description: Build type
required: true
type: string
download_ccache:
description: Whether to download ccache from the cache
required: false
type: boolean
default: true
upload_ccache:
description: Whether to upload ccache to the cache
required: false
type: boolean
default: false
static:
description: Whether to build static binaries
required: true
type: boolean
default: true
run_unit_tests:
description: Whether to run unit tests
required: true
type: boolean
run_integration_tests:
description: Whether to run integration tests
required: true
type: boolean
default: false
upload_clio_server:
description: Whether to upload clio_server
required: true
type: boolean
targets:
description: Space-separated build target names
required: false
type: string
default: all
expected_version:
description: Expected version of the clio_server binary
required: false
type: string
default: ""
package:
description: Whether to generate Debian package
required: false
type: boolean
default: false
jobs:
build:
uses: ./.github/workflows/reusable-build.yml
with:
runs_on: ${{ inputs.runs_on }}
container: ${{ inputs.container }}
conan_profile: ${{ inputs.conan_profile }}
build_type: ${{ inputs.build_type }}
download_ccache: ${{ inputs.download_ccache }}
upload_ccache: ${{ inputs.upload_ccache }}
code_coverage: false
static: ${{ inputs.static }}
upload_clio_server: ${{ inputs.upload_clio_server }}
targets: ${{ inputs.targets }}
analyze_build_time: false
expected_version: ${{ inputs.expected_version }}
package: ${{ inputs.package }}
test:
needs: build
uses: ./.github/workflows/reusable-test.yml
with:
runs_on: ${{ inputs.runs_on }}
container: ${{ inputs.container }}
conan_profile: ${{ inputs.conan_profile }}
build_type: ${{ inputs.build_type }}
run_unit_tests: ${{ inputs.run_unit_tests }}
run_integration_tests: ${{ inputs.run_integration_tests }}

View File

@@ -1,245 +0,0 @@
name: Reusable build
on:
workflow_call:
inputs:
runs_on:
description: Runner to run the job on
required: true
type: string
container:
description: "The container object as a JSON string (leave empty to run natively)"
required: true
type: string
conan_profile:
description: Conan profile to use
required: true
type: string
build_type:
description: Build type
required: true
type: string
download_ccache:
description: Whether to download ccache from the cache
required: false
type: boolean
default: true
upload_ccache:
description: Whether to upload ccache to the cache
required: false
type: boolean
default: false
code_coverage:
description: Whether to enable code coverage
required: true
type: boolean
static:
description: Whether to build static binaries
required: true
type: boolean
upload_clio_server:
description: Whether to upload clio_server
required: true
type: boolean
targets:
description: Space-separated build target names
required: true
type: string
analyze_build_time:
description: Whether to enable build time analysis
required: true
type: boolean
expected_version:
description: Expected version of the clio_server binary
required: false
type: string
default: ""
package:
description: Whether to generate Debian package
required: false
type: boolean
secrets:
CODECOV_TOKEN:
required: false
jobs:
build:
name: Build
runs-on: ${{ inputs.runs_on }}
container: ${{ inputs.container != '' && fromJson(inputs.container) || null }}
steps:
- name: Cleanup workspace
if: ${{ runner.os == 'macOS' }}
uses: XRPLF/actions/.github/actions/cleanup-workspace@ea9970b7c211b18f4c8bcdb28c29f5711752029f
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
# We need to fetch tags to have correct version in the release
# The workaround is based on https://github.com/actions/checkout/issues/1467
fetch-tags: true
ref: ${{ github.ref }}
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: ${{ !inputs.download_ccache }}
- name: Setup conan on macOS
if: ${{ runner.os == 'macOS' }}
shell: bash
run: ./.github/scripts/conan/init.sh
- name: Restore cache
if: ${{ inputs.download_ccache }}
uses: ./.github/actions/restore-cache
id: restore_cache
with:
conan_profile: ${{ inputs.conan_profile }}
ccache_dir: ${{ env.CCACHE_DIR }}
build_type: ${{ inputs.build_type }}
code_coverage: ${{ inputs.code_coverage }}
- name: Run conan
uses: ./.github/actions/conan
with:
conan_profile: ${{ inputs.conan_profile }}
build_type: ${{ inputs.build_type }}
- name: Run CMake
uses: ./.github/actions/cmake
with:
conan_profile: ${{ inputs.conan_profile }}
build_type: ${{ inputs.build_type }}
code_coverage: ${{ inputs.code_coverage }}
static: ${{ inputs.static }}
time_trace: ${{ inputs.analyze_build_time }}
package: ${{ inputs.package }}
- name: Build Clio
uses: ./.github/actions/build-clio
with:
targets: ${{ inputs.targets }}
- name: Show build time analyze report
if: ${{ inputs.analyze_build_time }}
run: |
ClangBuildAnalyzer --all build/ build_time_report.bin
ClangBuildAnalyzer --analyze build_time_report.bin > build_time_report.txt
cat build_time_report.txt
shell: bash
- name: Upload build time analyze report
if: ${{ inputs.analyze_build_time }}
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: build_time_report_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build_time_report.txt
- name: Show ccache's statistics
if: ${{ inputs.download_ccache }}
shell: bash
id: ccache_stats
run: |
ccache -s > /tmp/ccache.stats
miss_rate=$(cat /tmp/ccache.stats | grep 'Misses' | head -n1 | sed 's/.*(\(.*\)%).*/\1/')
echo "miss_rate=${miss_rate}" >> $GITHUB_OUTPUT
cat /tmp/ccache.stats
- name: Strip unit_tests
if: ${{ !endsWith(inputs.conan_profile, 'san') && !inputs.code_coverage && !inputs.analyze_build_time }}
run: strip build/clio_tests
- name: Strip integration_tests
if: ${{ !endsWith(inputs.conan_profile, 'san') && !inputs.code_coverage && !inputs.analyze_build_time }}
run: strip build/clio_integration_tests
- name: Upload clio_server
if: ${{ inputs.upload_clio_server && !inputs.code_coverage && !inputs.analyze_build_time }}
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: clio_server_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/clio_server
- name: Upload clio_tests
if: ${{ !inputs.code_coverage && !inputs.analyze_build_time && !inputs.package }}
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: clio_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/clio_tests
- name: Upload clio_integration_tests
if: ${{ !inputs.code_coverage && !inputs.analyze_build_time && !inputs.package }}
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: clio_integration_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/clio_integration_tests
- name: Upload Clio Linux package
if: ${{ inputs.package }}
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: clio_deb_package_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: build/*.deb
- name: Save cache
if: ${{ inputs.upload_ccache && github.ref == 'refs/heads/develop' }}
uses: ./.github/actions/save-cache
with:
conan_profile: ${{ inputs.conan_profile }}
ccache_dir: ${{ env.CCACHE_DIR }}
build_type: ${{ inputs.build_type }}
code_coverage: ${{ inputs.code_coverage }}
ccache_cache_hit: ${{ steps.restore_cache.outputs.ccache_cache_hit }}
ccache_cache_miss_rate: ${{ steps.ccache_stats.outputs.miss_rate }}
# This is run as part of the build job, because it requires the following:
# - source code
# - conan packages
# - .gcno files in build directory
#
# It's all available in the build job, but not in the test job
- name: Run code coverage
if: ${{ inputs.code_coverage }}
uses: ./.github/actions/code-coverage
- name: Verify expected version
if: ${{ inputs.expected_version != '' }}
shell: bash
env:
INPUT_EXPECTED_VERSION: ${{ inputs.expected_version }}
run: |
set -e
EXPECTED_VERSION="clio-${INPUT_EXPECTED_VERSION}"
actual_version=$(./build/clio_server --version)
if [[ "$actual_version" != "$EXPECTED_VERSION" ]]; then
echo "Expected version '${EXPECTED_VERSION}', but got '${actual_version}'"
exit 1
fi
# `codecov/codecov-action` will rerun `gcov` if it's available and build directory is present
# To prevent this from happening, we run this action in a separate workflow
#
# More info: https://github.com/XRPLF/clio/pull/2066
upload_coverage_report:
if: ${{ inputs.code_coverage }}
name: Codecov
needs: build
uses: ./.github/workflows/reusable-upload-coverage-report.yml
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
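
For the ccache statistics step in the build job above, the miss rate is scraped from the first "Misses" line of ccache -s by taking whatever sits between the parentheses and the percent sign. A minimal sketch with an assumed output format (the exact layout varies between ccache versions):
#!/usr/bin/env bash
# Assumed ccache -s excerpt; real output differs between ccache versions
cat > /tmp/ccache.stats <<'EOF'
Cacheable calls:   1200 / 1300 (92.31%)
  Hits:            1000 / 1200 (83.33%)
  Misses:           200 / 1200 (16.67%)
EOF
# Same extraction as the workflow step
miss_rate=$(cat /tmp/ccache.stats | grep 'Misses' | head -n1 | sed 's/.*(\(.*\)%).*/\1/')
echo "miss_rate=${miss_rate}"   # miss_rate=16.67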

View File

@@ -1,121 +0,0 @@
name: Make release
on:
workflow_call:
inputs:
overwrite_release:
description: "Overwrite the current release and tag"
required: true
type: boolean
prerelease:
description: "Create a prerelease"
required: true
type: boolean
title:
description: "Release title"
required: true
type: string
version:
description: "Release version"
required: true
type: string
header:
description: "Release notes header"
required: true
type: string
generate_changelog:
description: "Generate changelog"
required: true
type: boolean
draft:
description: "Create a draft release"
required: true
type: boolean
jobs:
release:
runs-on: heavy
container:
image: ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d
env:
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ github.token }}
permissions:
contents: write
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
path: release_artifacts
pattern: clio_server_*
- name: Create release notes
shell: bash
env:
RELEASE_HEADER: ${{ inputs.header }}
run: |
echo "# Release notes" > "${RUNNER_TEMP}/release_notes.md"
echo "" >> "${RUNNER_TEMP}/release_notes.md"
printf '%s\n' "${RELEASE_HEADER}" >> "${RUNNER_TEMP}/release_notes.md"
- name: Generate changelog
shell: bash
if: ${{ inputs.generate_changelog }}
run: |
LAST_TAG="$(gh release view --json tagName -q .tagName --repo XRPLF/clio)"
LAST_TAG_COMMIT="$(git rev-parse $LAST_TAG)"
BASE_COMMIT="$(git merge-base HEAD $LAST_TAG_COMMIT)"
git-cliff "${BASE_COMMIT}..HEAD" --ignore-tags "nightly|-b|-rc"
cat CHANGELOG.md >> "${RUNNER_TEMP}/release_notes.md"
- name: Prepare release artifacts
shell: bash
run: .github/scripts/prepare-release-artifacts.sh release_artifacts
- name: Upload release notes
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: release_notes_${{ inputs.version }}
path: "${RUNNER_TEMP}/release_notes.md"
- name: Remove current release and tag
if: ${{ github.event_name != 'pull_request' && inputs.overwrite_release }}
shell: bash
env:
RELEASE_VERSION: ${{ inputs.version }}
run: |
gh release delete "${RELEASE_VERSION}" --yes || true
git push origin :"${RELEASE_VERSION}" || true
- name: Publish release
if: ${{ github.event_name != 'pull_request' }}
shell: bash
env:
RELEASE_VERSION: ${{ inputs.version }}
PRERELEASE_OPTION: ${{ inputs.prerelease && '--prerelease' || '' }}
RELEASE_TITLE: ${{ inputs.title }}
DRAFT_OPTION: ${{ inputs.draft && '--draft' || '' }}
run: |
gh release create "${RELEASE_VERSION}" \
${PRERELEASE_OPTION} \
--title "${RELEASE_TITLE}" \
--target "${GITHUB_SHA}" \
${DRAFT_OPTION} \
--notes-file "${RUNNER_TEMP}/release_notes.md" \
./release_artifacts/clio_server*
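
The changelog step above asks gh for the latest published release tag, finds the merge-base between that tag and HEAD, and feeds that commit range to git-cliff. The same logic in isolation, assuming gh and git-cliff are installed, gh is authenticated, and the repository is checked out with full history:
#!/usr/bin/env bash
set -euo pipefail
# Latest published release tag, e.g. "2.1.0"
LAST_TAG="$(gh release view --json tagName -q .tagName --repo XRPLF/clio)"
LAST_TAG_COMMIT="$(git rev-parse "${LAST_TAG}")"
# Common ancestor of the current branch and that release
BASE_COMMIT="$(git merge-base HEAD "${LAST_TAG_COMMIT}")"
# Generate CHANGELOG.md for everything since that ancestor,
# skipping nightly / beta / release-candidate tags
git-cliff "${BASE_COMMIT}..HEAD" --ignore-tags "nightly|-b|-rc"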

View File

@@ -1,160 +0,0 @@
name: Reusable test
on:
workflow_call:
inputs:
runs_on:
description: Runner to run the job on
required: true
type: string
container:
description: "The container object as a JSON string (leave empty to run natively)"
required: true
type: string
conan_profile:
description: Conan profile to use
required: true
type: string
build_type:
description: Build type
required: true
type: string
run_unit_tests:
description: Whether to run unit tests
required: true
type: boolean
run_integration_tests:
description: Whether to run integration tests
required: true
type: boolean
jobs:
unit_tests:
name: Unit testing
runs-on: ${{ inputs.runs_on }}
container: ${{ inputs.container != '' && fromJson(inputs.container) || null }}
if: ${{ inputs.run_unit_tests }}
env:
# TODO: remove completely when we have fixed all currently existing issues with sanitizers
SANITIZER_IGNORE_ERRORS: ${{ endsWith(inputs.conan_profile, '.tsan') || (inputs.conan_profile == 'gcc.asan' && inputs.build_type == 'Release') }}
steps:
- name: Cleanup workspace
if: ${{ runner.os == 'macOS' }}
uses: XRPLF/actions/.github/actions/cleanup-workspace@ea9970b7c211b18f4c8bcdb28c29f5711752029f
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: clio_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
- name: Make clio_tests executable
shell: bash
run: chmod +x ./clio_tests
- name: Run clio_tests (regular)
if: ${{ env.SANITIZER_IGNORE_ERRORS == 'false' }}
run: ./clio_tests
- name: Run clio_tests (sanitizer errors ignored)
if: ${{ env.SANITIZER_IGNORE_ERRORS == 'true' }}
run: ./.github/scripts/execute-tests-under-sanitizer.sh ./clio_tests
- name: Check for sanitizer report
if: ${{ env.SANITIZER_IGNORE_ERRORS == 'true' }}
shell: bash
id: check_report
run: |
if ls .sanitizer-report/* 1> /dev/null 2>&1; then
echo "found_report=true" >> $GITHUB_OUTPUT
else
echo "found_report=false" >> $GITHUB_OUTPUT
fi
- name: Upload sanitizer report
if: ${{ env.SANITIZER_IGNORE_ERRORS == 'true' && steps.check_report.outputs.found_report == 'true' }}
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: sanitizer_report_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
path: .sanitizer-report/*
include-hidden-files: true
- name: Create an issue
if: ${{ false && env.SANITIZER_IGNORE_ERRORS == 'true' && steps.check_report.outputs.found_report == 'true' }}
uses: ./.github/actions/create-issue
env:
GH_TOKEN: ${{ github.token }}
with:
labels: "bug"
title: "[${{ inputs.conan_profile }}] reported issues"
body: >
Clio tests failed one or more sanitizer checks when built with `${{ inputs.conan_profile }}`.
Workflow: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/
Reports are available as artifacts.
integration_tests:
name: Integration testing
runs-on: ${{ inputs.runs_on }}
container: ${{ inputs.container != '' && fromJson(inputs.container) || null }}
if: ${{ inputs.run_integration_tests }}
services:
scylladb:
image: ${{ inputs.container != '' && 'scylladb/scylla' || '' }}
options: >-
--health-cmd "cqlsh -e 'describe cluster'"
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- name: Cleanup workspace
if: ${{ runner.os == 'macOS' }}
uses: XRPLF/actions/.github/actions/cleanup-workspace@ea9970b7c211b18f4c8bcdb28c29f5711752029f
- name: Spin up scylladb
if: ${{ runner.os == 'macOS' }}
timeout-minutes: 3
run: |
docker rm --force scylladb || true
docker run \
--detach \
--name scylladb \
--health-cmd "cqlsh -e 'describe cluster'" \
--health-interval 10s \
--health-timeout 5s \
--health-retries 5 \
--publish 9042:9042 \
--memory 16G \
scylladb/scylla
until [ "$(docker inspect -f '{{.State.Health.Status}}' scylladb)" == "healthy" ]; do
sleep 5
done
- uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: clio_integration_tests_${{ runner.os }}_${{ inputs.build_type }}_${{ inputs.conan_profile }}
- name: Run clio_integration_tests
run: |
chmod +x ./clio_integration_tests
./clio_integration_tests ${{ runner.os != 'macOS' && '--backend_host=scylladb' || '' }}
- name: Show docker logs and stop scylladb
if: ${{ always() && runner.os == 'macOS' }}
run: |
docker logs scylladb
docker rm --force scylladb || true
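
The SANITIZER_IGNORE_ERRORS flag in the unit-test job above is derived from the Conan profile and build type. A small sketch, mirroring that expression, of how a few hypothetical combinations evaluate:
#!/usr/bin/env bash
# Mirrors the expression in the unit_tests job: errors are ignored for any
# *.tsan profile, and for gcc.asan Release builds
ignore_errors() {
  local profile="$1" build_type="$2"
  if [[ "${profile}" == *.tsan || ( "${profile}" == "gcc.asan" && "${build_type}" == "Release" ) ]]; then
    echo "true"
  else
    echo "false"
  fi
}
echo "gcc.tsan    Debug   -> $(ignore_errors gcc.tsan Debug)"       # true
echo "gcc.asan    Release -> $(ignore_errors gcc.asan Release)"     # true
echo "gcc.asan    Debug   -> $(ignore_errors gcc.asan Debug)"       # false
echo "clang.ubsan Release -> $(ignore_errors clang.ubsan Release)"  # false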

View File

@@ -1,32 +0,0 @@
name: Upload report
on:
workflow_call:
secrets:
CODECOV_TOKEN:
required: true
jobs:
upload_report:
name: Upload report
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Download report artifact
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: coverage-report.xml
path: build
- name: Upload coverage report
if: ${{ hashFiles('build/coverage_report.xml') != '' }}
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
with:
files: build/coverage_report.xml
fail_ci_if_error: true
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}

View File

@@ -1,57 +0,0 @@
name: Run tests with sanitizers
on:
schedule:
- cron: "0 4 * * 1-5"
workflow_dispatch:
pull_request:
paths:
- .github/workflows/sanitizers.yml
- .github/workflows/reusable-build-test.yml
- .github/workflows/reusable-build.yml
- .github/workflows/reusable-test.yml
- ".github/actions/**"
- "!.github/actions/build-docker-image/**"
- "!.github/actions/create-issue/**"
- .github/scripts/execute-tests-under-sanitizer.sh
- CMakeLists.txt
- conanfile.py
- conan.lock
- "cmake/**"
# We don't run sanitizer on code change, because it takes too long
# - "src/**"
# - "tests/**"
concurrency:
# Only cancel in-progress jobs or runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
build-and-test:
name: Build and Test
strategy:
fail-fast: false
matrix:
compiler: [gcc, clang]
sanitizer_ext: [.asan, .tsan, .ubsan]
build_type: [Release, Debug]
uses: ./.github/workflows/reusable-build-test.yml
with:
runs_on: heavy
container: '{ "image": "ghcr.io/xrplf/clio-ci:b2be4b51d1d81548ca48e2f2b8f67356b880c96d" }'
download_ccache: false
upload_ccache: false
conan_profile: ${{ matrix.compiler }}${{ matrix.sanitizer_ext }}
build_type: ${{ matrix.build_type }}
static: false
# Currently, both gcc.tsan and clang.tsan unit tests hang
run_unit_tests: ${{ matrix.sanitizer_ext != '.tsan' }}
run_integration_tests: false
upload_clio_server: false
targets: clio_tests clio_integration_tests
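
The matrix above expands to twelve profile/build-type combinations, with unit tests skipped for the .tsan profiles. Enumerated for clarity:
#!/usr/bin/env bash
# List the combinations the matrix produces and whether unit tests run for each
for compiler in gcc clang; do
  for sanitizer_ext in .asan .tsan .ubsan; do
    for build_type in Release Debug; do
      run_unit_tests=$([[ "${sanitizer_ext}" != ".tsan" ]] && echo true || echo false)
      echo "${compiler}${sanitizer_ext} ${build_type}: run_unit_tests=${run_unit_tests}"
    done
  done
done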

View File

@@ -1,360 +0,0 @@
name: Update CI docker image
on:
pull_request:
paths:
- .github/workflows/update-docker-ci.yml
- ".github/actions/build-docker-image/**"
- "docker/**"
- "!docker/clio/**"
- "!docker/develop/**"
push:
branches: [develop]
paths:
- .github/workflows/update-docker-ci.yml
- ".github/actions/build-docker-image/**"
- "docker/**"
- "!docker/clio/**"
- "!docker/develop/**"
workflow_dispatch:
concurrency:
# Only matches runs for the current workflow - matches against branch & tags
group: ${{ github.workflow }}-${{ github.ref }}
# We want to execute all builds sequentially in develop
cancel-in-progress: false
env:
CLANG_MAJOR_VERSION: 19
GCC_MAJOR_VERSION: 15
GCC_VERSION: 15.2.0
jobs:
repo:
name: Calculate repo name
runs-on: ubuntu-latest
outputs:
GHCR_REPO: ${{ steps.set-ghcr-repo.outputs.GHCR_REPO }}
steps:
- name: Set GHCR_REPO
id: set-ghcr-repo
run: |
echo "GHCR_REPO=$(echo ghcr.io/${{ github.repository_owner }} | tr '[:upper:]' '[:lower:]')" >> ${GITHUB_OUTPUT}
gcc-amd64:
name: Build and push GCC docker image (amd64)
runs-on: heavy
needs: repo
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/compilers/gcc/**"
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_PW: ${{ secrets.DOCKERHUB_PW }}
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-gcc
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_gcc' || '' }}
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/compilers/gcc
tags: |
type=raw,value=amd64-latest
type=raw,value=amd64-${{ env.GCC_MAJOR_VERSION }}
type=raw,value=amd64-${{ env.GCC_VERSION }}
type=raw,value=amd64-${{ github.sha }}
platforms: linux/amd64
build_args: |
GCC_MAJOR_VERSION=${{ env.GCC_MAJOR_VERSION }}
GCC_VERSION=${{ env.GCC_VERSION }}
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_gcc' || '' }}
dockerhub_description: GCC compiler for XRPLF/clio.
gcc-arm64:
name: Build and push GCC docker image (arm64)
runs-on: heavy-arm64
needs: repo
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: "docker/compilers/gcc/**"
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_PW: ${{ secrets.DOCKERHUB_PW }}
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-gcc
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_gcc' || '' }}
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/compilers/gcc
tags: |
type=raw,value=arm64-latest
type=raw,value=arm64-${{ env.GCC_MAJOR_VERSION }}
type=raw,value=arm64-${{ env.GCC_VERSION }}
type=raw,value=arm64-${{ github.sha }}
platforms: linux/arm64
build_args: |
GCC_MAJOR_VERSION=${{ env.GCC_MAJOR_VERSION }}
GCC_VERSION=${{ env.GCC_VERSION }}
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_gcc' || '' }}
dockerhub_description: GCC compiler for XRPLF/clio.
gcc-merge:
name: Merge and push multi-arch GCC docker image
runs-on: heavy
needs: [repo, gcc-amd64, gcc-arm64]
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/compilers/gcc/**"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Login to GitHub Container Registry
if: ${{ github.event_name != 'pull_request' }}
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Login to DockerHub
if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' }}
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USER }}
password: ${{ secrets.DOCKERHUB_PW }}
- name: Create and push multi-arch manifest
if: ${{ github.event_name != 'pull_request' && steps.changed-files.outputs.any_changed == 'true' }}
run: |
push_image() {
image=$1
docker buildx imagetools create \
-t $image:latest \
-t $image:${{ env.GCC_MAJOR_VERSION }} \
-t $image:${{ env.GCC_VERSION }} \
-t $image:${{ github.sha }} \
$image:arm64-latest \
$image:amd64-latest
}
push_image ${{ needs.repo.outputs.GHCR_REPO }}/clio-gcc
if [[ ${{ github.repository_owner }} == 'XRPLF' ]]; then
push_image rippleci/clio_gcc
fi
clang:
name: Build and push Clang docker image
runs-on: heavy
needs: repo
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/compilers/clang/**"
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_PW: ${{ secrets.DOCKERHUB_PW }}
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-clang
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_clang' || '' }}
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/compilers/clang
tags: |
type=raw,value=latest
type=raw,value=${{ env.CLANG_MAJOR_VERSION }}
type=raw,value=${{ github.sha }}
platforms: linux/amd64,linux/arm64
build_args: |
CLANG_MAJOR_VERSION=${{ env.CLANG_MAJOR_VERSION }}
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_clang' || '' }}
dockerhub_description: Clang compiler for XRPLF/clio.
tools-amd64:
name: Build and push tools docker image (amd64)
runs-on: heavy
needs: [repo, gcc-merge]
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/tools/**"
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-tools
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/tools
tags: |
type=raw,value=amd64-latest
type=raw,value=amd64-${{ github.sha }}
platforms: linux/amd64
build_args: |
GHCR_REPO=${{ needs.repo.outputs.GHCR_REPO }}
GCC_VERSION=${{ env.GCC_VERSION }}
tools-arm64:
name: Build and push tools docker image (arm64)
runs-on: heavy-arm64
needs: [repo, gcc-merge]
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: "docker/tools/**"
- uses: ./.github/actions/build-docker-image
if: ${{ steps.changed-files.outputs.any_changed == 'true' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-tools
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/tools
tags: |
type=raw,value=arm64-latest
type=raw,value=arm64-${{ github.sha }}
platforms: linux/arm64
build_args: |
GHCR_REPO=${{ needs.repo.outputs.GHCR_REPO }}
GCC_VERSION=${{ env.GCC_VERSION }}
tools-merge:
name: Merge and push multi-arch tools docker image
runs-on: heavy
needs: [repo, tools-amd64, tools-arm64]
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
with:
files: "docker/tools/**"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Login to GitHub Container Registry
if: ${{ github.event_name != 'pull_request' }}
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Create and push multi-arch manifest
if: ${{ github.event_name != 'pull_request' && steps.changed-files.outputs.any_changed == 'true' }}
run: |
image=${{ needs.repo.outputs.GHCR_REPO }}/clio-tools
docker buildx imagetools create \
-t $image:latest \
-t $image:${{ github.sha }} \
$image:arm64-latest \
$image:amd64-latest
pre-commit:
name: Build and push pre-commit docker image
runs-on: heavy
needs: [repo, tools-merge]
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: ./.github/actions/build-docker-image
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-pre-commit
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/pre-commit
tags: |
type=raw,value=latest
type=raw,value=${{ github.sha }}
platforms: linux/amd64,linux/arm64
build_args: |
GHCR_REPO=${{ needs.repo.outputs.GHCR_REPO }}
ci:
name: Build and push CI docker image
runs-on: heavy
needs: [repo, gcc-merge, clang, tools-merge]
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: ./.github/actions/build-docker-image
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_PW: ${{ secrets.DOCKERHUB_PW }}
with:
images: |
${{ needs.repo.outputs.GHCR_REPO }}/clio-ci
${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_ci' || '' }}
push_image: ${{ github.event_name != 'pull_request' }}
directory: docker/ci
tags: |
type=raw,value=latest
type=raw,value=gcc_${{ env.GCC_MAJOR_VERSION }}_clang_${{ env.CLANG_MAJOR_VERSION }}
type=raw,value=${{ github.sha }}
platforms: linux/amd64,linux/arm64
build_args: |
GHCR_REPO=${{ needs.repo.outputs.GHCR_REPO }}
CLANG_MAJOR_VERSION=${{ env.CLANG_MAJOR_VERSION }}
GCC_MAJOR_VERSION=${{ env.GCC_MAJOR_VERSION }}
GCC_VERSION=${{ env.GCC_VERSION }}
dockerhub_repo: ${{ github.repository_owner == 'XRPLF' && 'rippleci/clio_ci' || '' }}
dockerhub_description: CI image for XRPLF/clio.
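
The merge jobs above combine the per-architecture images pushed earlier into one multi-arch manifest with docker buildx imagetools create. Conceptually (the image name below is a placeholder, and the version tags come from the workflow's GCC_MAJOR_VERSION/GCC_VERSION env values):
#!/usr/bin/env bash
# Placeholder image name; the workflow uses ghcr.io/<owner>/clio-gcc etc.
image="ghcr.io/example/clio-gcc"
# Publish a single multi-arch manifest that references the amd64 and arm64 images
docker buildx imagetools create \
  -t "${image}:latest" \
  -t "${image}:15" \
  -t "${image}:15.2.0" \
  "${image}:arm64-latest" \
  "${image}:amd64-latest"
# Inspect what the merged tag now points at
docker buildx imagetools inspect "${image}:latest"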

View File

@@ -1,104 +0,0 @@
name: Upload Conan Dependencies
on:
schedule:
- cron: "0 9 * * 1-5"
workflow_dispatch:
inputs:
force_source_build:
description: "Force source build of all dependencies"
required: false
default: false
type: boolean
force_upload:
description: "Force upload of all dependencies"
required: false
default: false
type: boolean
pull_request:
branches: [develop]
paths:
- .github/workflows/upload-conan-deps.yml
- .github/actions/conan/action.yml
- ".github/scripts/conan/**"
- conanfile.py
- conan.lock
push:
branches: [develop]
paths:
- .github/workflows/upload-conan-deps.yml
- .github/actions/conan/action.yml
- ".github/scripts/conan/**"
- conanfile.py
- conan.lock
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
generate-matrix:
runs-on: ubuntu-latest
outputs:
matrix: ${{ steps.set-matrix.outputs.matrix }}
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Calculate conan matrix
id: set-matrix
run: .github/scripts/conan/generate_matrix.py >> "${GITHUB_OUTPUT}"
upload-conan-deps:
name: Build ${{ matrix.compiler }}${{ matrix.sanitizer_ext }} ${{ matrix.build_type }}
needs: generate-matrix
strategy:
fail-fast: false
matrix: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
max-parallel: 10
runs-on: ${{ matrix.os }}
container: ${{ matrix.container != '' && fromJson(matrix.container) || null }}
env:
CONAN_PROFILE: ${{ matrix.compiler }}${{ matrix.sanitizer_ext }}
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@7951b682e5a2973b28b0719a72f01fc4b0d0c34f
with:
disable_ccache: true
- name: Setup conan on macOS
if: ${{ runner.os == 'macOS' }}
shell: bash
run: ./.github/scripts/conan/init.sh
- name: Show conan profile
run: conan profile show --profile:all ${{ env.CONAN_PROFILE }}
- name: Run conan
uses: ./.github/actions/conan
with:
conan_profile: ${{ env.CONAN_PROFILE }}
# We check that everything builds fine from source on scheduled runs
# But we do build and upload packages with build=missing by default
force_conan_source_build: ${{ github.event_name == 'schedule' || github.event.inputs.force_source_build == 'true' }}
build_type: ${{ matrix.build_type }}
- name: Login to Conan
if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' }}
run: conan remote login -p ${{ secrets.CONAN_PASSWORD }} xrplf ${{ secrets.CONAN_USERNAME }}
- name: Upload Conan packages
if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' && github.event_name != 'schedule' }}
env:
FORCE_OPTION: ${{ github.event.inputs.force_upload == 'true' && '--force' || '' }}
run: conan upload "*" -r=xrplf --confirm ${FORCE_OPTION}

8
.gitignore vendored
View File

@@ -1,11 +1,5 @@
*clio*.log
/build*/
.devcontainer
.build
.cache
build/
.vscode
.python-version
.DS_Store
.sanitizer-report
CMakeUserPresets.json
config.json

View File

@@ -1,6 +0,0 @@
---
ignored:
- DL3003
- DL3007
- DL3008
- DL3013

View File

@@ -1,6 +0,0 @@
# Default state for all rules
default: true
# MD013/line-length - Line length
MD013:
line_length: 1000

View File

@@ -1,123 +0,0 @@
---
# pre-commit is a tool to perform a predefined set of tasks manually and/or
# automatically before git commits are made.
#
# Config reference: https://pre-commit.com/#pre-commit-configyaml---top-level
#
# Common tasks
#
# - Run on all files: pre-commit run --all-files
# - Register git hooks: pre-commit install --hook-type pre-commit --hook-type pre-push
#
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
exclude: ^(docs/doxygen-awesome-theme/|conan\.lock$)
repos:
# `pre-commit sample-config` default hooks
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: 3e8a8703264a2f4a69428a0aa4dcb512790b2c8c # frozen: v6.0.0
hooks:
- id: check-added-large-files
- id: check-executables-have-shebangs
- id: check-shebang-scripts-are-executable
- id: end-of-file-fixer
- id: trailing-whitespace
# Autoformat: YAML, JSON, Markdown, etc.
- repo: https://github.com/rbubley/mirrors-prettier
rev: 5ba47274f9b181bce26a5150a725577f3c336011 # frozen: v3.6.2
hooks:
- id: prettier
- repo: https://github.com/igorshubovych/markdownlint-cli
rev: 192ad822316c3a22fb3d3cc8aa6eafa0b8488360 # frozen: v0.45.0
hooks:
- id: markdownlint-fix
exclude: LICENSE.md
- repo: https://github.com/hadolint/hadolint
rev: 4e697ba704fd23b2409b947a319c19c3ee54d24f # frozen: v2.14.0
hooks:
- id: hadolint-docker
# hadolint-docker is a special hook that runs hadolint in a Docker container
# Docker is not installed in the environment where pre-commit is run
stages: [manual]
entry: hadolint/hadolint:v2.14.0 hadolint
- repo: https://github.com/codespell-project/codespell
rev: 63c8f8312b7559622c0d82815639671ae42132ac # frozen: v2.4.1
hooks:
- id: codespell
args:
[
--write-changes,
--ignore-words=pre-commit-hooks/codespell_ignore.txt,
]
# Running some C++ hooks before clang-format
# to ensure that the style is consistent.
- repo: local
hooks:
- id: json-in-cpp
name: Fix JSON style in C++
entry: pre-commit-hooks/json_in_cpp.py
types: [c++]
language: python
exclude: |
(?x)^(
tests/unit/etl/SubscriptionSourceTests.cpp|
tests/unit/web/ServerTests.cpp|
tests/unit/web/impl/ErrorHandlingTests.cpp|
tests/unit/web/ng/ServerTests.cpp|
tests/unit/web/ng/impl/ErrorHandlingTests.cpp
)$
- id: fix-local-includes
name: Fix Local Includes
entry: pre-commit-hooks/fix-local-includes.sh
types: [c++]
language: script
- repo: https://github.com/pre-commit/mirrors-clang-format
rev: 719856d56a62953b8d2839fb9e851f25c3cfeef8 # frozen: v21.1.2
hooks:
- id: clang-format
args: [--style=file]
types: [c++]
- repo: https://github.com/cheshirekow/cmake-format-precommit
rev: e2c2116d86a80e72e7146a06e68b7c228afc6319 # frozen: v0.6.13
hooks:
- id: cmake-format
additional_dependencies: [PyYAML]
- repo: local
hooks:
- id: check-no-h-files
name: No .h files
entry: There should be no .h files in this repository
language: fail
files: \.h$
- repo: local
hooks:
- id: gofmt
name: Go Format
entry: pre-commit-hooks/run-go-fmt.sh
types: [go]
language: golang
description: "Runs `gofmt`, requires golang"
- id: check-docs
name: Check Doxygen Documentation
entry: pre-commit-hooks/check-doxygen-docs.sh
types: [text]
language: script
pass_filenames: false
- id: verify-commits
name: Verify Commits
entry: pre-commit-hooks/verify-commits.sh
always_run: true
stages: [pre-push]
language: script
pass_filenames: false

15
CMake/ClioVersion.cmake Normal file
View File

@@ -0,0 +1,15 @@
#[===================================================================[
read version from source
#]===================================================================]
file (STRINGS src/main/impl/Build.cpp BUILD_INFO)
foreach (line_ ${BUILD_INFO})
if (line_ MATCHES "versionString[ ]*=[ ]*\"(.+)\"")
set (clio_version ${CMAKE_MATCH_1})
endif ()
endforeach ()
if (clio_version)
message (STATUS "clio version: ${clio_version}")
else ()
message (FATAL_ERROR "unable to determine clio version")
endif ()
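
The CMake snippet above scans src/main/impl/Build.cpp for a versionString assignment and aborts the configure step if none is found. A shell equivalent of the same extraction, using a hypothetical Build.cpp line (the real declaration in the source tree may be spelled differently):
#!/usr/bin/env bash
# Hypothetical line standing in for src/main/impl/Build.cpp
echo 'static constexpr char const* versionString = "1.0.4";' > /tmp/Build.cpp
# Pull out whatever sits between the quotes after "versionString ="
clio_version=$(sed -n 's/.*versionString[ ]*=[ ]*"\(.*\)".*/\1/p' /tmp/Build.cpp | head -n1)
if [[ -n "${clio_version}" ]]; then
  echo "clio version: ${clio_version}"
else
  echo "unable to determine clio version" >&2
  exit 1
fi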

6
CMake/deps/Boost.cmake Normal file
View File

@@ -0,0 +1,6 @@
set(Boost_USE_STATIC_LIBS ON)
set(Boost_USE_STATIC_RUNTIME ON)
find_package(Boost 1.75 COMPONENTS filesystem log_setup log thread system REQUIRED)
target_link_libraries(clio PUBLIC ${Boost_LIBRARIES})

View File

@@ -0,0 +1,24 @@
From 5cd9d09d960fa489a0c4379880cd7615b1c16e55 Mon Sep 17 00:00:00 2001
From: CJ Cobb <ccobb@ripple.com>
Date: Wed, 10 Aug 2022 12:30:01 -0400
Subject: [PATCH] Remove bitset operator !=
---
src/ripple/protocol/Feature.h | 1 -
1 file changed, 1 deletion(-)
diff --git a/src/ripple/protocol/Feature.h b/src/ripple/protocol/Feature.h
index b3ecb099b..6424be411 100644
--- a/src/ripple/protocol/Feature.h
+++ b/src/ripple/protocol/Feature.h
@@ -126,7 +126,6 @@ class FeatureBitset : private std::bitset<detail::numFeatures>
public:
using base::bitset;
using base::operator==;
- using base::operator!=;
using base::all;
using base::any;
--
2.32.0

View File

@@ -0,0 +1,11 @@
include(CheckIncludeFileCXX)
check_include_file_cxx("source_location" SOURCE_LOCATION_AVAILABLE)
if(SOURCE_LOCATION_AVAILABLE)
target_compile_definitions(clio PUBLIC "HAS_SOURCE_LOCATION")
endif()
check_include_file_cxx("experimental/source_location" EXPERIMENTAL_SOURCE_LOCATION_AVAILABLE)
if(EXPERIMENTAL_SOURCE_LOCATION_AVAILABLE)
target_compile_definitions(clio PUBLIC "HAS_EXPERIMENTAL_SOURCE_LOCATION")
endif()

151
CMake/deps/cassandra.cmake Normal file
View File

@@ -0,0 +1,151 @@
find_package(ZLIB REQUIRED)
find_library(cassandra NAMES cassandra)
if(NOT cassandra)
message("System installed Cassandra cpp driver not found. Will build")
find_library(zlib NAMES zlib1g-dev zlib-devel zlib z)
if(NOT zlib)
message("zlib not found. will build")
add_library(zlib STATIC IMPORTED GLOBAL)
ExternalProject_Add(zlib_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/madler/zlib.git
GIT_TAG v1.2.12
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <BINARY_DIR>/${CMAKE_STATIC_LIBRARY_PREFIX}z.a
)
ExternalProject_Get_Property (zlib_src SOURCE_DIR)
ExternalProject_Get_Property (zlib_src BINARY_DIR)
set (zlib_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${zlib_src_SOURCE_DIR}/include)
set_target_properties (zlib PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/${CMAKE_STATIC_LIBRARY_PREFIX}z.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(zlib zlib_src)
file(TO_CMAKE_PATH "${zlib_src_SOURCE_DIR}" zlib_src_SOURCE_DIR)
endif()
find_library(krb5 NAMES krb5-dev libkrb5-dev)
if(NOT krb5)
message("krb5 not found. will build")
add_library(krb5 STATIC IMPORTED GLOBAL)
ExternalProject_Add(krb5_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/krb5/krb5.git
GIT_TAG krb5-1.20
UPDATE_COMMAND ""
CONFIGURE_COMMAND autoreconf src && CFLAGS=-fcommon ./src/configure --enable-static --disable-shared
BUILD_IN_SOURCE 1
BUILD_COMMAND make
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <SOURCE_DIR>/lib/${CMAKE_STATIC_LIBRARY_PREFIX}krb5.a
)
message(${ep_lib_prefix}/krb5.a)
message(${CMAKE_STATIC_LIBRARY_PREFIX}krb5.a)
ExternalProject_Get_Property (krb5_src SOURCE_DIR)
ExternalProject_Get_Property (krb5_src BINARY_DIR)
set (krb5_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${krb5_src_SOURCE_DIR}/include)
set_target_properties (krb5 PROPERTIES
IMPORTED_LOCATION
${SOURCE_DIR}/lib/${CMAKE_STATIC_LIBRARY_PREFIX}krb5.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(krb5 krb5_src)
file(TO_CMAKE_PATH "${krb5_src_SOURCE_DIR}" krb5_src_SOURCE_DIR)
endif()
find_library(libuv1 NAMES uv1 libuv1 libuv1-dev libuv1:amd64)
if(NOT libuv1)
message("libuv1 not found, will build")
add_library(libuv1 STATIC IMPORTED GLOBAL)
ExternalProject_Add(libuv_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/libuv/libuv.git
GIT_TAG v1.44.1
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <BINARY_DIR>/${CMAKE_STATIC_LIBRARY_PREFIX}uv_a.a
)
ExternalProject_Get_Property (libuv_src SOURCE_DIR)
ExternalProject_Get_Property (libuv_src BINARY_DIR)
set (libuv_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${libuv_src_SOURCE_DIR}/include)
set_target_properties (libuv1 PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/${CMAKE_STATIC_LIBRARY_PREFIX}uv_a.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(libuv1 libuv_src)
file(TO_CMAKE_PATH "${libuv_src_SOURCE_DIR}" libuv_src_SOURCE_DIR)
endif()
add_library (cassandra STATIC IMPORTED GLOBAL)
ExternalProject_Add(cassandra_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/datastax/cpp-driver.git
GIT_TAG 2.16.2
CMAKE_ARGS
-DLIBUV_ROOT_DIR=${BINARY_DIR}
-DLIBUV_INCLUDE_DIR=${SOURCE_DIR}/include
-DCASS_BUILD_STATIC=ON
-DCASS_BUILD_SHARED=OFF
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <BINARY_DIR>/${CMAKE_STATIC_LIBRARY_PREFIX}cassandra_static.a
)
ExternalProject_Get_Property (cassandra_src SOURCE_DIR)
ExternalProject_Get_Property (cassandra_src BINARY_DIR)
set (cassandra_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${cassandra_src_SOURCE_DIR}/include)
set_target_properties (cassandra PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/${CMAKE_STATIC_LIBRARY_PREFIX}cassandra_static.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
message("cass dirs")
message(${BINARY_DIR})
message(${SOURCE_DIR})
message(${BINARY_DIR}/${CMAKE_STATIC_LIBRARY_PREFIX}cassandra_static.a)
add_dependencies(cassandra cassandra_src)
if(NOT libuv1)
ExternalProject_Add_StepDependencies(cassandra_src build libuv1)
target_link_libraries(cassandra INTERFACE libuv1)
else()
target_link_libraries(cassandra INTERFACE ${libuv1})
endif()
if(NOT krb5)
ExternalProject_Add_StepDependencies(cassandra_src build krb5)
target_link_libraries(cassandra INTERFACE krb5)
else()
target_link_libraries(cassandra INTERFACE ${krb5})
endif()
if(NOT zlib)
ExternalProject_Add_StepDependencies(cassandra_src build zlib)
target_link_libraries(cassandra INTERFACE zlib)
else()
target_link_libraries(cassandra INTERFACE ${zlib})
endif()
set(OPENSSL_USE_STATIC_LIBS TRUE)
find_package(OpenSSL REQUIRED)
target_link_libraries(cassandra INTERFACE OpenSSL::SSL)
file(TO_CMAKE_PATH "${cassandra_src_SOURCE_DIR}" cassandra_src_SOURCE_DIR)
target_link_libraries(clio PUBLIC cassandra)
else()
message("Found system installed cassandra cpp driver")
message(${cassandra})
find_path(cassandra_includes NAMES cassandra.h REQUIRED)
target_link_libraries (clio PUBLIC ${cassandra})
target_include_directories(clio INTERFACE ${cassandra_includes})
endif()

20
CMake/deps/gtest.cmake Normal file
View File

@@ -0,0 +1,20 @@
FetchContent_Declare(
googletest
URL https://github.com/google/googletest/archive/609281088cfefc76f9d0ce82e1ff6c30cc3591e5.zip
)
FetchContent_GetProperties(googletest)
if(NOT googletest_POPULATED)
FetchContent_Populate(googletest)
add_subdirectory(${googletest_SOURCE_DIR} ${googletest_BINARY_DIR} EXCLUDE_FROM_ALL)
endif()
target_link_libraries(clio_tests PUBLIC clio gmock_main)
target_include_directories(clio_tests PRIVATE unittests)
enable_testing()
include(GoogleTest)
gtest_discover_tests(clio_tests)

20
CMake/deps/rippled.cmake Normal file
View File

@@ -0,0 +1,20 @@
set(RIPPLED_REPO "https://github.com/ripple/rippled.git")
set(RIPPLED_BRANCH "1.9.2")
set(NIH_CACHE_ROOT "${CMAKE_CURRENT_BINARY_DIR}" CACHE INTERNAL "")
set(patch_command ! grep operator!= src/ripple/protocol/Feature.h || git apply < ${CMAKE_CURRENT_SOURCE_DIR}/CMake/deps/Remove-bitset-operator.patch)
message(STATUS "Cloning ${RIPPLED_REPO} branch ${RIPPLED_BRANCH}")
FetchContent_Declare(rippled
GIT_REPOSITORY "${RIPPLED_REPO}"
GIT_TAG "${RIPPLED_BRANCH}"
GIT_SHALLOW ON
PATCH_COMMAND "${patch_command}"
)
FetchContent_GetProperties(rippled)
if(NOT rippled_POPULATED)
FetchContent_Populate(rippled)
add_subdirectory(${rippled_SOURCE_DIR} ${rippled_BINARY_DIR} EXCLUDE_FROM_ALL)
endif()
target_link_libraries(clio PUBLIC xrpl_core grpc_pbufs)
target_include_directories(clio PUBLIC ${rippled_SOURCE_DIR}/src ) # TODO: Seems like this shouldn't be needed?
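
The PATCH_COMMAND above uses a "! grep ... || git apply ..." guard so the patch is applied only while Feature.h still contains the bitset operator!=; once the tree is patched, re-running the configure step skips the apply instead of failing. The same idiom in isolation (GNU sed assumed, files kept in /tmp):
#!/usr/bin/env bash
# Stand-alone demonstration of the guard used in PATCH_COMMAND above
apply_patch() {
  # grep exits 0 if the pattern is still present; '!' inverts that,
  # so the right-hand side of '||' runs only when the pattern is found
  ! grep -q 'operator!=' /tmp/Feature.h || echo "patch would be applied here"
}
printf 'using base::operator!=;\n' > /tmp/Feature.h
apply_patch    # prints: patch would be applied here
sed -i '/operator!=/d' /tmp/Feature.h
apply_patch    # prints nothing: already patched, so the guard skips the apply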

View File

@@ -0,0 +1,17 @@
[Unit]
Description=Clio XRPL API server
Documentation=https://github.com/XRPLF/clio.git
After=network-online.target
Wants=network-online.target
[Service]
Type=simple
ExecStart=@CLIO_INSTALL_DIR@/bin/clio_server @CLIO_INSTALL_DIR@/etc/config.json
Restart=on-failure
User=clio
Group=clio
LimitNOFILE=65536
[Install]
WantedBy=multi-user.target

View File

@@ -0,0 +1,16 @@
set(CLIO_INSTALL_DIR "/opt/clio")
set(CMAKE_INSTALL_PREFIX ${CLIO_INSTALL_DIR})
install(TARGETS clio_server DESTINATION bin)
# install(TARGETS clio_tests DESTINATION bin) # NOTE: Do we want to install the tests?
#install(FILES example-config.json DESTINATION etc RENAME config.json)
file(READ example-config.json config)
string(REGEX REPLACE "./clio_log" "/var/log/clio/" config "${config}")
file(WRITE ${CMAKE_BINARY_DIR}/install-config.json "${config}")
install(FILES ${CMAKE_BINARY_DIR}/install-config.json DESTINATION etc RENAME config.json)
configure_file("${CMAKE_SOURCE_DIR}/CMake/install/clio.service.in" "${CMAKE_BINARY_DIR}/clio.service")
install(FILES "${CMAKE_BINARY_DIR}/clio.service" DESTINATION /lib/systemd/system)

6
CMake/settings.cmake Normal file
View File

@@ -0,0 +1,6 @@
target_compile_options(clio
PUBLIC -Wall
-Werror
-Wno-narrowing
-Wno-deprecated-declarations
-Wno-dangling-else)

View File

@@ -1,97 +1,125 @@
cmake_minimum_required(VERSION 3.20)
cmake_minimum_required(VERSION 3.16.3)
project(clio VERSION ${CLIO_VERSION} HOMEPAGE_URL "https://github.com/XRPLF/clio"
DESCRIPTION "An XRP Ledger API Server"
)
project(clio)
# =========================== Options ====================================== #
option(verbose "Verbose build" FALSE)
option(tests "Build unit tests" FALSE)
option(integration_tests "Build integration tests" FALSE)
option(benchmark "Build benchmarks" FALSE)
option(docs "Generate doxygen docs" FALSE)
option(coverage "Build test coverage report" FALSE)
option(package "Create distribution packages" FALSE)
option(lint "Run clang-tidy checks during compilation" FALSE)
option(static "Statically linked Clio" FALSE)
option(snapshot "Build snapshot tool" FALSE)
option(time_trace "Build using -ftime-trace to create compiler trace reports" FALSE)
if(CMAKE_CXX_COMPILER_VERSION VERSION_LESS 11)
message(FATAL_ERROR "GCC 11+ required for building clio")
endif()
# ========================================================================== #
set(san "" CACHE STRING "Add sanitizer instrumentation")
set(CMAKE_EXPORT_COMPILE_COMMANDS TRUE)
set_property(CACHE san PROPERTY STRINGS ";undefined;memory;address;thread")
# ========================================================================== #
option(BUILD_TESTS "Build tests" TRUE)
set(CMAKE_MODULE_PATH "${PROJECT_SOURCE_DIR}/cmake" ${CMAKE_MODULE_PATH})
# Include required modules
include(Ccache)
include(CheckCXXCompilerFlag)
include(ClangTidy)
include(Linker)
add_library(clio_options INTERFACE)
target_compile_features(clio_options INTERFACE cxx_std_23) # Clio needs c++23 but deps can remain c++20 for now
target_include_directories(clio_options INTERFACE ${CMAKE_SOURCE_DIR}/src)
if (verbose)
option(VERBOSE "Verbose build" TRUE)
if(VERBOSE)
set(CMAKE_VERBOSE_MAKEFILE TRUE)
endif ()
set(FETCHCONTENT_QUIET FALSE CACHE STRING "Verbose FetchContent()")
endif()
# Clio tweaks and checks
include(CheckCompiler)
include(Settings)
include(SourceLocation)
if(NOT GIT_COMMIT_HASH)
if(VERBOSE)
message("GIT_COMMIT_HASH not provided...looking for git")
endif()
find_package(Git)
if(Git_FOUND)
execute_process(COMMAND ${GIT_EXECUTABLE} rev-parse --short HEAD
OUTPUT_STRIP_TRAILING_WHITESPACE OUTPUT_VARIABLE git-ref)
if(git-ref)
set(BUILD "${git-ref}")
message(STATUS "Build version: ${BUILD}")
add_definitions(-DCLIO_BUILD="${BUILD}")
endif()
endif()
endif() #git
if(PACKAGING)
add_definitions(-DPKG=1)
endif()
# Clio deps
include(deps/libxrpl)
include(deps/Boost)
include(deps/OpenSSL)
include(deps/Threads)
include(deps/libfmt)
include(deps/cassandra)
include(deps/libbacktrace)
include(deps/spdlog)
add_library(clio)
target_compile_features(clio PUBLIC cxx_std_20)
target_include_directories(clio PUBLIC src)
add_subdirectory(src)
add_subdirectory(tests)
include(FetchContent)
include(ExternalProject)
include(CMake/settings.cmake)
include(CMake/ClioVersion.cmake)
include(CMake/deps/rippled.cmake)
include(CMake/deps/Boost.cmake)
include(CMake/deps/cassandra.cmake)
include(CMake/deps/SourceLocation.cmake)
if (benchmark)
add_subdirectory(benchmarks)
endif ()
target_sources(clio PRIVATE
## Main
src/main/impl/Build.cpp
## Backend
src/backend/BackendInterface.cpp
src/backend/CassandraBackend.cpp
src/backend/SimpleCache.cpp
## ETL
src/etl/ETLSource.cpp
src/etl/ProbingETLSource.cpp
src/etl/NFTHelpers.cpp
src/etl/ReportingETL.cpp
## Subscriptions
src/subscriptions/SubscriptionManager.cpp
## RPC
src/rpc/Errors.cpp
src/rpc/RPC.cpp
src/rpc/RPCHelpers.cpp
src/rpc/Counters.cpp
src/rpc/WorkQueue.cpp
## RPC Methods
# Account
src/rpc/handlers/AccountChannels.cpp
src/rpc/handlers/AccountCurrencies.cpp
src/rpc/handlers/AccountInfo.cpp
src/rpc/handlers/AccountLines.cpp
src/rpc/handlers/AccountOffers.cpp
src/rpc/handlers/AccountObjects.cpp
src/rpc/handlers/GatewayBalances.cpp
src/rpc/handlers/NoRippleCheck.cpp
# NFT
src/rpc/handlers/NFTHistory.cpp
src/rpc/handlers/NFTInfo.cpp
src/rpc/handlers/NFTOffers.cpp
# Ledger
src/rpc/handlers/Ledger.cpp
src/rpc/handlers/LedgerData.cpp
src/rpc/handlers/LedgerEntry.cpp
src/rpc/handlers/LedgerRange.cpp
# Transaction
src/rpc/handlers/Tx.cpp
src/rpc/handlers/TransactionEntry.cpp
src/rpc/handlers/AccountTx.cpp
# Dex
src/rpc/handlers/BookChanges.cpp
src/rpc/handlers/BookOffers.cpp
# Payment Channel
src/rpc/handlers/ChannelAuthorize.cpp
src/rpc/handlers/ChannelVerify.cpp
# Subscribe
src/rpc/handlers/Subscribe.cpp
# Server
src/rpc/handlers/ServerInfo.cpp
# Utilities
src/rpc/handlers/Random.cpp
src/config/Config.cpp
src/log/Logger.cpp
src/util/Taggable.cpp)
# Enable selected sanitizer if enabled via `san`
if (san)
set(SUPPORTED_SANITIZERS "address" "thread" "memory" "undefined")
if (NOT san IN_LIST SUPPORTED_SANITIZERS)
message(FATAL_ERROR "Error: Unsupported sanitizer '${san}'. Supported values are: ${SUPPORTED_SANITIZERS}.")
endif ()
add_executable(clio_server src/main/main.cpp)
target_link_libraries(clio_server PUBLIC clio)
# Sanitizers recommend minimum of -O1 for reasonable performance so we enable it for debug builds
set(SAN_OPTIMIZATION_FLAG "")
if (CMAKE_BUILD_TYPE STREQUAL "Debug")
set(SAN_OPTIMIZATION_FLAG -O1)
endif ()
target_compile_options(clio_options INTERFACE ${SAN_OPTIMIZATION_FLAG} ${SAN_FLAG} -fno-omit-frame-pointer)
if(BUILD_TESTS)
add_executable(clio_tests
unittests/RPCErrors.cpp
unittests/Backend.cpp
unittests/Logger.cpp
unittests/Config.cpp
unittests/ProfilerTest.cpp
unittests/DOSGuard.cpp)
include(CMake/deps/gtest.cmake)
endif()
target_compile_definitions(
clio_options INTERFACE $<$<STREQUAL:${san},address>:SANITIZER=ASAN> $<$<STREQUAL:${san},thread>:SANITIZER=TSAN>
$<$<STREQUAL:${san},memory>:SANITIZER=MSAN> $<$<STREQUAL:${san},undefined>:SANITIZER=UBSAN>
)
target_link_libraries(clio_options INTERFACE ${SAN_FLAG} ${SAN_LIB})
endif ()
# Generate `docs` target for doxygen documentation if enabled. Note: use `make docs` to generate the documentation
if (docs)
add_subdirectory(docs)
endif ()
include(install/install)
if (package)
include(ClioPackage)
endif ()
if (snapshot)
add_subdirectory(tools/snapshot)
endif ()
include(CMake/install/install.cmake)
if(PACKAGING)
include(CMake/packaging.cmake)
endif()


@@ -1,57 +1,33 @@
# Contributing
Thank you for your interest in contributing to the `clio` project 🙏
## Workflow
To contribute, please:
1. Fork the repository under your own user.
2. Create a new branch on which to commit/push your changes.
2. Create a new branch on which to write your changes.
3. Write and test your code.
4. Ensure that your code compiles with the provided build engine, and update it as part of your PR where needed and appropriate.
5. Where applicable, write test cases for your code and include those in the relevant subfolder under `tests`.
6. Ensure your code passes [automated checks](#pre-commit-hooks)
7. Squash your commits (i.e. rebase) into as few commits as is reasonable to describe your changes at a high level (typically a single commit for a small change). See below for more details.
5. Where applicable, write test cases for your code and include those in `unittests`.
6. Ensure your code passes automated checks (e.g. clang-format)
7. Squash your commits (i.e. rebase) into as few commits as is reasonable to describe your changes at a high level (typically a single commit for a small change.). See below for more details.
8. Open a PR to the main repository onto the _develop_ branch, and follow the provided template.
> **Note:** Please read the [Style guide](#style-guide).
> **Note:** Please make sure you read the [Style guide](#style-guide).
### `git lfs` hooks
## Install git hooks
Please make sure to run the following command in order to use git hooks that are helpful for `clio` development.
Install `git lfs` hooks using the following command:
```bash
git lfs install
``` bash
git config --local core.hooksPath .githooks
```
> **Note:** You need to install Git LFS hooks before installing `pre-commit` hooks.
### `pre-commit` hooks
To ensure code quality and style, we use [`pre-commit`](https://pre-commit.com/).
Run the following command to enable `pre-commit` hooks that help with Clio development:
```bash
pip3 install pre-commit
pre-commit install --hook-type pre-commit --hook-type pre-push
```
`pre-commit` takes care of running each tool in [`.pre-commit-config.yaml`](https://github.com/XRPLF/clio/blob/develop/.pre-commit-config.yaml) in a separate environment.
`pre-commit` also attempts to automatically use Doxygen to verify that everything public in the codebase has doc comments.
If Doxygen is not installed, the hook issues a warning and recommends installing Doxygen for future commits.
### Git commands
This section offers a detailed look at the git commands you will need to use to get your PR submitted.
Please note that there is more than one way to do this and these commands are provided for your convenience.
## Git commands
This section offers a detailed look at the git commands you will need to use to get your PR submitted.
Please note that there is more than one way to do this and these commands are only provided for your convenience.
At this point it's assumed that you have already finished working on your feature/bug.
> **Important:** Before you issue any of the commands below, please hit the `Sync fork` button and make sure your fork's `develop` branch is up-to-date with the main `clio` repository.
> **Important:** Before you issue any of the commands below, please hit the `Sync fork` button and make sure your fork's `develop` branch is up to date with the main `clio` repository.
```bash
``` bash
# Create a backup of your branch
git branch <your feature branch>_bk
@@ -61,128 +37,98 @@ git pull origin develop
git checkout <your feature branch>
git rebase -i develop
```
For each commit in the list other than the first one please select `s` to squash.
After this is done you will have the opportunity to write a message for the squashed commit.
For each commit in the list other than the first one, enter `s` to squash.
After this is done, you will have the opportunity to write a message for the squashed commit.
> **Hint:** Please use the **imperative mood** in the commit message, capitalizing the first word of the subject.
> **Hint:** Please use **imperative mood** in the commit message, and capitalize the first word.
```bash
``` bash
# You should now have a single commit on top of a commit in `develop`
git log
```
> **Todo:** In case there are merge conflicts, please resolve them now
> **Note:** If there are merge conflicts, please resolve them now.
```bash
``` bash
# Use the same commit message as you did above
git commit -m 'Your message'
git rebase --continue
```
> **Important:** If you have no GPG keys set up, please follow [this tutorial](https://docs.github.com/en/authentication/managing-commit-signature-verification/adding-a-gpg-key-to-your-github-account)
> **Important:** If you have no GPG keys setup please follow [this tutorial](https://docs.github.com/en/authentication/managing-commit-signature-verification/adding-a-gpg-key-to-your-github-account)
```bash
# Sign the commit with your GPG key, and push your changes
``` bash
# Sign the commit with your GPG key and finally push your changes to the repo
git commit --amend -S
git push --force
```
### Use ccache (optional)
Clio uses `ccache` to speed up compilation. If you want to use it, please make sure it is installed on your machine.
CMake will automatically detect it and use it if it is available.
### Opening a pull request
When a pull request is opened, CI will perform checks on the new code.
The title of the pull request and of the squashed commit should follow the [conventional commits specification](https://www.conventionalcommits.org/en/v1.0.0/).
### Fixing issues found during code review
While your code is in review, it's possible that some changes will be requested by reviewer(s).
## Fixing issues found during code review
While your code is in review, it's possible that some changes will be requested by the reviewer.
This section describes the process of adding your fixes.
We assume that you already made the required changes on your feature branch.
```bash
``` bash
# Add the changed code
git add <paths to add>
# Add a [FOLD] commit message (so you remember to squash it later)
# Add a folded commit message (so you can squash them later)
# while also signing it with your GPG key
git commit -S -m "[FOLD] Your commit message"
# And finally push your changes
git push
```
## After code review
Last but not least, when your PR is approved you still have to `Squash and merge` your code.
Luckily there is a button for that towards the bottom of the PR's page on github.
### After code review
When your PR is approved and ready to merge, use `Squash and merge`.
The button for that is near the bottom of the PR's page on GitHub.
> **Important:** Please leave the automatically-generated mention/link to the PR in the subject line **and** in the description field add `"Fix #ISSUE_ID"` (replacing `ISSUE_ID` with yours) if the PR fixes an issue.
> **Important:** Please leave the automatically generated link to PR in the subject line **and** in the description field please add `"Fixes #ISSUE_ID"` (replacing `ISSUE_ID` with yours).
> **Note:** See [issues](https://github.com/XRPLF/clio/issues) to find the `ISSUE_ID` for the feature/bug you were working on.
## Style guide
# Style guide
This is a non-exhaustive list of recommended style guidelines. These are not always strictly enforced and serve as a way to keep the codebase coherent rather than a set of _thou shalt not_ commandments.
This is a non-exhaustive list of recommended style guidelines. These are not always strictly enforced and serve as a way to keep the codebase coherent.
## Formatting
All code must conform to `clang-format` version 10, unless the result would be unreasonably difficult to read or maintain.
To change your code to conform use `clang-format -i <your changed files>`.
### Formatting
## Avoid
* Proliferation of nearly identical code.
* Proliferation of new files and classes unless it improves readability and/or compilation time.
* Unmanaged memory allocation and raw pointers.
* Macros (unless they add significant value.)
* Lambda patterns (unless these add significant value.)
* CPU or architecture-specific code unless there is a good reason to include it, and where it is used guard it with macros and provide explanatory comments.
* Importing new libraries unless there is a very good reason to do so.
Code must conform to `clang-format`, unless the result is unreasonably difficult to read or maintain.
In most cases the `pre-commit` hook takes care of formatting and fixes any issues automatically.
To manually format your code, run `pre-commit run clang-format --files <your changed files>` for C++ files, and `pre-commit run cmake-format --files <your changed files>` for CMake files.
### Documentation
All public namespaces, classes and functions must be covered by doc (`doxygen`) comments. Everything that is not within a nested `impl` namespace is considered public.
> **Note:** Keep in mind that this is enforced by Clio's CI and your build will fail if newly added public code lacks documentation.
### Avoid
- Proliferation of nearly identical code.
- Proliferation of new files and classes unless it improves readability and/or compilation time.
- Unmanaged memory allocation and raw pointers.
- Macros (unless they add significant value.)
- Lambda patterns (unless these add significant value.)
- CPU or architecture-specific code unless there is a good reason to include it, and where it is used guard it with macros and provide explanatory comments.
- Importing new libraries unless there is a very good reason to do so.
### Seek to
- Extend functionality of existing code rather than creating new code.
- Prefer readability over terseness where important logic is concerned.
- Inline functions that are not used or are not likely to be used elsewhere in the codebase.
- Use clear and self-explanatory names for functions, variables, structs and classes.
- Use TitleCase for classes, structs and filenames, camelCase for function and variable names, lower case for namespaces and folders.
- Provide as many comments as you feel that a competent programmer would need to understand what your code does.
## Maintainers
## Seek to
* Extend functionality of existing code rather than creating new code.
* Prefer readability over terseness where important logic is concerned.
* Inline functions that are not used or are not likely to be used elsewhere in the codebase.
* Use clear and self-explanatory names for functions, variables, structs and classes.
* Use TitleCase for classes, structs and filenames, camelCase for function and variable names, lower case for namespaces and folders.
* Provide as many comments as you feel that a competent programmer would need to understand what your code does.
# Maintainers
Maintainers are ecosystem participants with elevated access to the repository. They are able to push new code, make decisions on when a release should be made, etc.
### Code Review
A PR must be reviewed and approved by at least one of the maintainers before it can be merged.
### Adding and Removing
## Code Review
PRs must be reviewed by at least one of the maintainers.
## Adding and Removing
New maintainers can be proposed by two existing maintainers, subject to a vote by a quorum of the existing maintainers. A minimum of 50% support and a 50% participation is required. In the event of a tie vote, the addition of the new maintainer will be rejected.
Existing maintainers can resign, or be subject to a vote for removal at the behest of two existing maintainers. A minimum of 60% agreement and 50% participation are required. The XRP Ledger Foundation will have the ability, for cause, to remove an existing maintainer without a vote.
### Existing Maintainers
## Existing Maintainers
- [godexsoft](https://github.com/godexsoft) (Ripple)
- [kuznetsss](https://github.com/kuznetsss) (Ripple)
- [legleux](https://github.com/legleux) (Ripple)
- [PeterChen13579](https://github.com/PeterChen13579) (Ripple)
* [cjcobb23](https://github.com/cjcobb23) (Ripple)
* [legleux](https://github.com/legleux) (Ripple)
* [undertome](https://github.com/undertome) (Ripple)
* [godexsoft](https://github.com/godexsoft) (Ripple)
* [officialfrancismendoza](https://github.com/officialfrancismendoza) (Ripple)
### Honorable ex-Maintainers
## Honorable ex-Maintainers
- [cindyyan317](https://github.com/cindyyan317) (ex-Ripple)
- [cjcobb23](https://github.com/cjcobb23) (ex-Ripple)
- [natenichols](https://github.com/natenichols) (ex-Ripple)
* [natenichols](https://github.com/natenichols) (ex-Ripple)

3
Doxyfile Normal file

@@ -0,0 +1,3 @@
PROJECT_NAME = "Clio"
INPUT = src
RECURSIVE = YES


@@ -1,7 +1,8 @@
ISC License
Copyright (c) 2022, the clio developers
Copyright (c) 2022, the clio developers
Permission to use, copy, modify, and distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

250
README.md

@@ -1,48 +1,236 @@
# <img src='./docs/img/xrpl-logo.svg' width='40' valign="top" /> Clio <!-- markdownlint-disable-line MD033 MD045 -->
# Clio
Clio is an XRP Ledger API server. Clio is optimized for RPC calls, over WebSocket or JSON-RPC. Validated
historical ledger and transaction data are stored in a more space-efficient format,
using up to 4 times less space than rippled. Clio can be configured to store data in Apache Cassandra or ScyllaDB,
allowing for scalable read throughput. Multiple Clio nodes can share
access to the same dataset, allowing for a highly available cluster of Clio nodes,
without the need for redundant data storage or computation.
[![Build status](https://github.com/XRPLF/clio/actions/workflows/build.yml/badge.svg?branch=develop)](https://github.com/XRPLF/clio/actions/workflows/build.yml?query=branch%3Adevelop)
[![Nightly release status](https://github.com/XRPLF/clio/actions/workflows/nightly.yml/badge.svg?branch=develop)](https://github.com/XRPLF/clio/actions/workflows/nightly.yml?query=branch%3Adevelop)
[![Clang-tidy checks status](https://github.com/XRPLF/clio/actions/workflows/clang-tidy.yml/badge.svg?branch=develop)](https://github.com/XRPLF/clio/actions/workflows/clang-tidy.yml?query=branch%3Adevelop)
[![Code coverage develop branch](https://codecov.io/gh/XRPLF/clio/branch/develop/graph/badge.svg?)](https://app.codecov.io/gh/XRPLF/clio)
Clio offers the full rippled API, with the caveat that Clio by default only returns validated data.
This means that `ledger_index` defaults to `validated` instead of `current` for all requests.
Other non-validated data is also not returned, such as information about queued transactions.
For requests that require access to the p2p network, such as `fee` or `submit`, Clio automatically forwards the request to a rippled node and propagates the response back to the client. To access non-validated data for *any* request, simply add `ledger_index: "current"` to the request, and Clio will forward the request to rippled.
Clio is an XRP Ledger API server optimized for RPC calls over WebSocket or JSON-RPC.
It stores validated historical ledger and transaction data in a more space efficient format, and uses up to 4 times less space than [rippled](https://github.com/XRPLF/rippled).
Clio does not connect to the peer-to-peer network. Instead, Clio extracts data from a group of specified rippled nodes. Running Clio requires access to at least one rippled node
from which data can be extracted. The rippled node does not need to be running on the same machine as Clio.
Clio can be configured to store data in [Apache Cassandra](https://cassandra.apache.org/_/index.html) or [ScyllaDB](https://www.scylladb.com/), enabling scalable read throughput.
Multiple Clio nodes can share access to the same dataset, which allows for a highly available cluster of Clio nodes without the need for redundant data storage or computation.
## 📡 Clio and `rippled`
## Requirements
1. Access to a Cassandra cluster or ScyllaDB cluster. Can be local or remote.
Clio offers the full `rippled` API, with the caveat that Clio by default only returns validated data. This means that `ledger_index` defaults to `validated` instead of `current` for all requests. Other non-validated data, such as information about queued transactions, is also not returned.
2. Access to one or more rippled nodes. Can be local or remote.
Clio retrieves data from a designated group of `rippled` nodes instead of connecting to the peer-to-peer network.
For requests that require access to the peer-to-peer network, such as `fee` or `submit`, Clio automatically forwards the request to a `rippled` node and propagates the response back to the client. To access non-validated data for _any_ request, simply add `ledger_index: "current"` to the request, and Clio will forward the request to `rippled`.
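As a rough illustration of the latter (the account address below is just a placeholder), a WebSocket request that opts into non-validated data could look like this:
```json
{
  "command": "account_info",
  "account": "rXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
  "ledger_index": "current"
}
```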
## Building
> [!NOTE]
> Clio requires access to at least one `rippled` node, which can run on the same machine as Clio or separately.
Clio is built with CMake. Clio requires at least GCC-11/clang-14.0.0 (C++20), and Boost 1.75.0.
## 📚 Learn more about Clio
Use these instructions to build a Clio executable from the source. These instructions were tested on Ubuntu 20.04 LTS.
Below are some useful docs to learn more about Clio.
```sh
# Install dependencies
sudo apt-get -y install git pkg-config protobuf-compiler libprotobuf-dev libssl-dev wget build-essential bison flex autoconf cmake clang-format
**For Developers**:
# Compile Boost
wget -O $HOME/boost_1_75_0.tar.gz https://boostorg.jfrog.io/artifactory/main/release/1.75.0/source/boost_1_75_0.tar.gz
tar xvzf $HOME/boost_1_75_0.tar.gz
cd $HOME/boost_1_75_0
./bootstrap.sh
./b2 -j$(nproc)
echo "export BOOST_ROOT=$HOME/boost_1_75_0" >> $HOME/.profile && source $HOME/.profile
- [How to build Clio](./docs/build-clio.md)
- [Coverage report](./docs/coverage-report.md)
# Clone the Clio Git repository & build Clio
cd $HOME
git clone https://github.com/XRPLF/clio.git
cd $HOME/clio
cmake -B build && cmake --build build --parallel $(nproc)
```
**For Operators**:
## Running
```sh
./clio_server config.json
```
- [How to configure Clio and rippled](./docs/configure-clio.md)
- [How to run Clio](./docs/run-clio.md)
- [Troubleshooting guide](./docs/trouble_shooting.md)
Clio needs access to a rippled server. The config files of rippled and Clio need to be consistent with each other.
Clio needs to know:
- the IP of rippled
- the port on which rippled is accepting unencrypted WebSocket connections
- the port on which rippled is handling gRPC requests
**General reference material:**
rippled needs to open:
- a port to accept unencrypted websocket connections
- a port to handle gRPC requests, with the IP(s) of Clio specified in the `secure_gateway` entry
- [API reference](https://xrpl.org/http-websocket-apis.html)
- [Developer docs](https://xrplf.github.io/clio)
- [Clio documentation](https://xrpl.org/the-clio-server.html#the-clio-server)
The example configs of rippled and Clio are set up so that minimal changes are
required. When running locally, the only change needed is to uncomment the `port_grpc`
section of the rippled config. When running Clio and rippled on separate machines,
in addition to uncommenting the `port_grpc` section, a few other steps must be taken (see the sketch after this list):
1. change the `ip` of the first entry of `etl_sources` to the IP where your rippled
server is running
2. open a public, unencrypted WebSocket port on your rippled server
3. change the IP specified in `secure_gateway` of `port_grpc` section of the rippled config
to the IP of your Clio server. This entry can take the form of a comma-separated list if
you are running multiple Clio nodes.
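For example (the IP address below is just a placeholder), after step 1 the first entry of `etl_sources` in the Clio config would look something like this, while the `secure_gateway` entry in the rippled config would list your Clio server's IP:
```json
"etl_sources": [
  {
    "ip": "10.0.0.5",
    "ws_port": "6006",
    "grpc_port": "50051"
  }
]
```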
## 🆘 Help
Feel free to open an [issue](https://github.com/XRPLF/clio/issues) if you have a feature request or something doesn't work as expected.
If you have any questions about building, running, contributing, using Clio, or anything else, you can always start a new [discussion](https://github.com/XRPLF/clio/discussions).
In addition, the parameter `start_sequence` can be included and configured within the top level of the config file. This parameter specifies the sequence of the first ledger to extract if the database is empty. Note that ETL extracts ledgers in order and that no backfilling functionality currently exists, meaning Clio will not retroactively learn ledgers older than the one you specify. Choosing to specify this or not will yield the following behavior:
- If this setting is absent and the database is empty, ETL will start with the next ledger validated by the network.
- If this setting is present and the database is not empty, an exception is thrown.
In addition, the optional parameter `finish_sequence` can be added to the JSON file as well, specifying the ledger at which extraction stops.
To add `start_sequence` and/or `finish_sequence` to the config.json file, place them at the same top level as other parameters (such as `database`, `etl_sources`, `read_only`, etc.) and specify them as integers. Here is an example snippet from the config file:
```json
"start_sequence": 12345,
"finish_sequence": 54321
```
The parameters `ssl_cert_file` and `ssl_key_file` can also be added at the top level of the Clio config. `ssl_cert_file` specifies the filepath for your SSL cert, while `ssl_key_file` specifies the filepath for your SSL key. It is up to you how to make these files readable by your designated Clio user. Your options include:
- Copying the two files as root somewhere that's accessible by the Clio user, then running `sudo chown` to your user
- Changing the permissions directly so it's readable by your Clio user
- Running Clio as root (strongly discouraged)
An example of how to specify `ssl_cert_file` and `ssl_key_file` in the config:
```json
"server":{
"ip": "0.0.0.0",
"port": 51233
},
"ssl_cert_file" : "/full/path/to/cert.file",
"ssl_key_file" : "/full/path/to/key.file"
```
Once your config files are ready, start rippled and Clio. It doesn't matter which you
start first, and it's fine to stop one or the other and restart at any given time.
Clio will wait for rippled to sync before extracting any ledgers. If there is already
data in Clio's database, Clio will begin extraction with the ledger whose sequence
is one greater than the greatest sequence currently in the database. Clio will wait
for this ledger to be available. Be aware that the behavior of rippled is to sync to
the most recent ledger on the network, and then backfill. If Clio is extracting ledgers
from rippled, and then rippled is stopped for a significant amount of time and then restarted, rippled
will take time to backfill to the next ledger that Clio wants. The time it takes is proportional
to the amount of time rippled was offline. Also be aware that the amount rippled backfills
is dependent on the online_delete and ledger_history config values; if these values
are small, and rippled is stopped for a significant amount of time, rippled may never backfill
to the ledger that Clio wants. To avoid this situation, it is advised to keep history
proportional to the amount of time that you expect rippled to be offline. For example, if you
expect rippled to be offline for a few days from time to time, you should keep at least
a few days of history. If you expect rippled to never be offline, then you can keep a very small
amount of history.
Clio can use multiple rippled servers as a data source. Simply add more entries to
the `etl_sources` section. Clio will load balance requests across the servers specified
in this list. As long as one rippled server is up and synced, Clio will continue
extracting ledgers.
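For instance (placeholder IPs), a two-source setup is just a longer `etl_sources` array:
```json
"etl_sources": [
  { "ip": "10.0.0.5", "ws_port": "6006", "grpc_port": "50051" },
  { "ip": "10.0.0.6", "ws_port": "6006", "grpc_port": "50051" }
]
```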
In contrast to rippled, Clio will answer RPC requests for the data already in the
database as soon as the server starts. Clio doesn't wait to sync to the network, or
for rippled to sync.
When starting Clio with a fresh database, Clio needs to download a ledger in full.
This can take some time, and depends on database throughput. With a moderately fast
database, this should take less than 10 minutes. If you did not properly set `secure_gateway`
in the `port_grpc` section of rippled, this step will fail. Once the first ledger
is fully downloaded, Clio only needs to extract the changed data for each ledger,
so extraction is much faster and Clio can keep up with rippled in real-time. Even under
intense load, Clio should not lag behind the network, as Clio is not processing the data,
and is simply writing to a database. The throughput of Clio is dependent on the throughput
of your database, but a standard Cassandra or Scylla deployment can handle
the write load of the XRP Ledger without any trouble. Generally the performance considerations
come on the read side, and depend on the number of RPC requests your Clio nodes
are serving. Be aware that very heavy read traffic can impact write throughput. Again, this
is on the database side, so if you are seeing this, upgrade your database.
It is possible to run multiple Clio nodes that share access to the same database.
The Clio nodes don't need to know about each other. You can simply spin up more Clio
nodes pointing to the same database as you wish, and shut them down as you wish.
On startup, each Clio node queries the database for the latest ledger. If this latest
ledger does not change for some time, the Clio node begins extracting ledgers
and writing to the database. If the Clio node detects a ledger that it is trying to
write has already been written, the Clio node will back off and stop writing. If later
the Clio node sees no ledger written for some time, it will start writing again.
This algorithm ensures that at any given time, one and only one Clio node is writing
to the database.
It is possible to force Clio to only read data, and to never become a writer.
To do this, set `read_only: true` in the config. One common setup is to have a
small number of writer nodes that are inaccessible to clients, with several
read only nodes handling client requests. The number of read only nodes can be scaled
up or down in response to request volume.
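A read-only node's config is otherwise identical to a writer's (same `database` and `etl_sources` settings); the only difference is the top-level flag:
```json
"read_only": true
```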
When using multiple rippled servers as data sources and multiple Clio nodes,
each Clio node should use the same set of rippled servers as sources. The order doesn't matter.
The only reason not to do this is if you are running servers in different regions, and
you want the Clio nodes to extract from servers in their region. However, if you
are doing this, be aware that database traffic will be flowing across regions,
which can cause high latencies. A possible alternative to this is to just deploy
a database in each region, and the Clio nodes in each region use their region's database.
This is effectively two systems.
## Developing against `rippled` in standalone mode
If you wish to develop against a `rippled` instance running in standalone
mode, there are a few quirks of both clio and rippled you need to keep in mind.
You must:
1. Advance the `rippled` ledger to at least ledger 256 (one way to do this is sketched after this list)
2. Wait 10 minutes before first starting clio against this standalone node.
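One way to advance the ledger (point 1 above) is to repeatedly call rippled's admin `ledger_accept` command, which standalone mode provides for manually closing ledgers. Over JSON-RPC the request body is simply the following (a sketch; consult your rippled version's admin API docs):
```json
{
  "method": "ledger_accept",
  "params": [{}]
}
```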
## Logging
Clio provides several logging options, all of which are configurable via the config file and detailed below.
`log_level`: The minimum level of severity at which log messages are output by default.
Severity options are `trace`, `debug`, `info`, `warning`, `error`, `fatal`. Defaults to `info`.
`log_format`: The format of log lines produced by clio. Defaults to `"%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%"`.
Each of the variables expands like so
- `TimeStamp`: The full date and time of the log entry
- `SourceLocation`: A partial path to the c++ file and the line number in said file (`source/file/path:linenumber`)
- `ThreadID`: The ID of the thread the log entry is written from
- `Channel`: The channel that this log entry was sent to
- `Severity`: The severity (aka log level) the entry was sent at
- `Message`: The actual log message
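For example, a config fragment that keeps the default format but raises verbosity to `debug` could look like this (a sketch):
```json
"log_level": "debug",
"log_format": "%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%"
```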
`log_channels`: An array of json objects, each overriding properties for a logging `channel`.
At the moment of writing, only `log_level` can be overridden using this mechanism.
Each object is of this format:
```json
{
"channel": "Backend",
"log_level": "fatal"
}
```
If no override is present for a given channel, that channel will log at the severity specified by the global `log_level`.
Overridable log channels: `Backend`, `WebServer`, `Subscriptions`, `RPC`, `ETL` and `Performance`.
> **Note:** See `example-config.json` for more details.
`log_to_console`: Enable/disable log output to console. Options are `true`/`false`. Defaults to true.
`log_directory`: Path to the directory where log files are stored. If such directory doesn't exist, Clio will create it. If not specified, logs are not written to a file.
`log_rotation_size`: The max size of the log file in **megabytes** before it rotates into a new file. Defaults to 2GB.
`log_directory_max_size`: The max size of the log directory in **megabytes** before old log files will be
deleted to free up space. Defaults to 50GB.
`log_rotation_hour_interval`: The time interval in **hours** after the last log rotation to automatically
rotate the current log file. Defaults to 12 hours.
Note that time-based log rotation interacts with size-based log rotation: if a size-based rotation occurs, the timer for the time-based rotation resets.
`log_tag_style`: Tag implementation to use. Must be one of:
- `uint`: Lock free and threadsafe but outputs just a simple unsigned integer
- `uuid`: Threadsafe and outputs a UUID tag
- `none`: Don't use tagging at all
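Putting the file-logging options together, a sketch of the relevant config fragment could be (the path is illustrative; the sizes and interval shown match the documented defaults):
```json
"log_to_console": false,
"log_directory": "/var/log/clio",
"log_rotation_size": 2048,
"log_directory_max_size": 51200,
"log_rotation_hour_interval": 12,
"log_tag_style": "uint"
```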
## Cassandra / Scylla Administration
Since Clio relies on either Cassandra or Scylla for its database backend, here are some important considerations:
- Scylla, by default, will reserve all free RAM on a machine for itself. If you are running `rippled` or other services on the same machine, restrict its memory usage using the `--memory` argument: https://docs.scylladb.com/getting-started/scylla-in-a-shared-environment/

23
RELEASENOTES.md Normal file

@@ -0,0 +1,23 @@
# Release Notes
This document contains the release notes for `clio_server`, an XRP Ledger API Server.
To build and run `clio_server`, follow the instructions in [README.md](https://github.com/XRPLF/clio).
If you find issues or have a new idea, please open [an issue](https://github.com/XRPLF/clio/issues).
# Releases
## 0.1.0
Clio is an XRP Ledger API server. Clio is optimized for RPC calls, over websocket or JSON-RPC. Validated historical ledger and transaction data is stored in a more space efficient format, using up to 4 times less space than rippled.
Clio uses Cassandra or ScyllaDB, allowing for scalable read throughput. Multiple clio nodes can share access to the same dataset, allowing for a highly available cluster of clio nodes, without the need for redundant data storage or computation.
**0.1.0** is the first beta of Project Clio. It contains:
- `./src/backend` is the BackendInterface. This provides an abstraction for reading and writing information to a database.
- `./src/etl` is the ReportingETL. The classes in this folder are used to extract information from the P2P network and write it to a database, either locally or over the network.
- `./src/rpc` contains RPC handlers that are called by clients. These handlers should expose the same API as rippled.
- `./src/subscriptions` contains the SubscriptionManager. This manages publishing to clients subscribing to streams or accounts.
- `./src/webserver` contains a flex server that handles both http/s and ws/s traffic on a single port.
- `./unittests` simple unit tests that write to and read from a database to verify that the ETL works.


@@ -1,18 +0,0 @@
add_executable(clio_benchmark)
target_sources(
clio_benchmark
PRIVATE # Common
Main.cpp
Playground.cpp
# ExecutionContext
util/async/ExecutionContextBenchmarks.cpp
# Logger
util/log/LoggerBenchmark.cpp
)
include(deps/gbench)
target_include_directories(clio_benchmark PRIVATE .)
target_link_libraries(clio_benchmark PUBLIC clio_util benchmark::benchmark_main spdlog::spdlog)
set_target_properties(clio_benchmark PROPERTIES RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR})


@@ -1,22 +0,0 @@
//------------------------------------------------------------------------------
/*
This file is part of clio: https://github.com/XRPLF/clio
Copyright (c) 2023, the clio developers.
Permission to use, copy, modify, and distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
*/
//==============================================================================
#include <benchmark/benchmark.h>
BENCHMARK_MAIN();


@@ -1,45 +0,0 @@
//------------------------------------------------------------------------------
/*
This file is part of clio: https://github.com/XRPLF/clio
Copyright (c) 2023, the clio developers.
Permission to use, copy, modify, and distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
*/
//==============================================================================
/*
* Use this file for temporary benchmarks and implementations.
* Usage example:
* ```
* ./clio_benchmarks
* --benchmark_time_unit=ms
* --benchmark_repetitions=10
* --benchmark_display_aggregates_only=true
* --benchmark_min_time=1x
* --benchmark_filter="Playground"
* ```
*
* Note: Please don't push your temporary work to the repo.
*/
// #include <benchmark/benchmark.h>
// static void
// benchmarkPlaygroundTest1(benchmark::State& state)
// {
// for (auto _ : state) {
// // ...
// }
// }
// BENCHMARK(benchmarkPlaygroundTest1);


@@ -1,268 +0,0 @@
//------------------------------------------------------------------------------
/*
This file is part of clio: https://github.com/XRPLF/clio
Copyright (c) 2024, the clio developers.
Permission to use, copy, modify, and distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
*/
//==============================================================================
#include "etl/ETLHelpers.hpp"
#include "util/Random.hpp"
#include "util/async/AnyExecutionContext.hpp"
#include "util/async/AnyOperation.hpp"
#include "util/async/context/BasicExecutionContext.hpp"
#include "util/async/context/SyncExecutionContext.hpp"
#include <benchmark/benchmark.h>
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <latch>
#include <optional>
#include <thread>
#include <vector>
using namespace util;
using namespace util::async;
class TestThread {
std::vector<std::thread> threads_;
etl::ThreadSafeQueue<std::optional<uint64_t>> q_;
etl::ThreadSafeQueue<uint64_t> res_;
public:
TestThread(std::vector<uint64_t> const& data) : q_(data.size()), res_(data.size())
{
for (auto el : data)
q_.push(el);
}
~TestThread()
{
for (auto& t : threads_) {
if (t.joinable())
t.join();
}
}
void
run(std::size_t numThreads)
{
std::latch completion{numThreads};
for (std::size_t i = 0; i < numThreads; ++i) {
q_.push(std::nullopt);
threads_.emplace_back([this, &completion]() { process(completion); });
}
completion.wait();
}
private:
void
process(std::latch& completion)
{
while (auto v = q_.pop()) {
if (not v.has_value())
break;
res_.push(v.value() * v.value());
}
completion.count_down(1);
}
};
template <typename CtxType>
class TestExecutionContextBatched {
etl::ThreadSafeQueue<std::optional<uint64_t>> q_;
etl::ThreadSafeQueue<uint64_t> res_;
std::size_t batchSize_;
public:
TestExecutionContextBatched(std::vector<uint64_t> const& data, std::size_t batchSize = 5000u)
: q_(data.size()), res_(data.size()), batchSize_(batchSize)
{
for (auto el : data)
q_.push(el);
}
void
run(std::size_t numThreads)
{
using OpType = typename CtxType::template StoppableOperation<void>;
CtxType ctx{numThreads};
std::vector<OpType> operations;
for (std::size_t i = 0; i < numThreads; ++i) {
q_.push(std::nullopt);
operations.push_back(ctx.execute(
[this](auto stopRequested) {
bool hasMore = true;
auto doOne = [this] {
auto v = q_.pop();
if (not v.has_value())
return false;
res_.push(v.value() * v.value());
return true;
};
while (not stopRequested and hasMore) {
for (std::size_t i = 0; i < batchSize_ and hasMore; ++i)
hasMore = doOne();
}
},
std::chrono::seconds{5}
));
}
for (auto& op : operations)
op.wait();
}
};
template <typename CtxType>
class TestAnyExecutionContextBatched {
etl::ThreadSafeQueue<std::optional<uint64_t>> q_;
etl::ThreadSafeQueue<uint64_t> res_;
std::size_t batchSize_;
public:
TestAnyExecutionContextBatched(std::vector<uint64_t> const& data, std::size_t batchSize = 5000u)
: q_(data.size()), res_(data.size()), batchSize_(batchSize)
{
for (auto el : data)
q_.push(el);
}
void
run(std::size_t numThreads)
{
CtxType ctx{numThreads};
AnyExecutionContext anyCtx{ctx};
std::vector<AnyOperation<void>> operations;
for (std::size_t i = 0; i < numThreads; ++i) {
q_.push(std::nullopt);
operations.push_back(anyCtx.execute(
[this](auto stopRequested) {
bool hasMore = true;
auto doOne = [this] {
auto v = q_.pop();
if (not v.has_value())
return false;
res_.push(v.value() * v.value());
return true;
};
while (not stopRequested and hasMore) {
for (std::size_t i = 0; i < batchSize_ and hasMore; ++i)
hasMore = doOne();
}
},
std::chrono::seconds{5}
));
}
for (auto& op : operations)
op.wait();
}
};
static auto
generateData()
{
constexpr auto kTOTAL = 10'000;
std::vector<uint64_t> data;
data.reserve(kTOTAL);
util::MTRandomGenerator randomGenerator;
for (auto i = 0; i < kTOTAL; ++i)
data.push_back(randomGenerator.uniform(1, 100'000'000));
return data;
}
static void
benchmarkThreads(benchmark::State& state)
{
auto data = generateData();
for (auto _ : state) {
TestThread t{data};
t.run(state.range(0));
}
}
template <typename CtxType>
static void
benchmarkExecutionContextBatched(benchmark::State& state)
{
auto data = generateData();
for (auto _ : state) {
TestExecutionContextBatched<CtxType> t{data, state.range(1)};
t.run(state.range(0));
}
}
template <typename CtxType>
static void
benchmarkAnyExecutionContextBatched(benchmark::State& state)
{
auto data = generateData();
for (auto _ : state) {
TestAnyExecutionContextBatched<CtxType> t{data, state.range(1)};
t.run(state.range(0));
}
}
// Simplest implementation using async queues and std::thread
BENCHMARK(benchmarkThreads)->Arg(1)->Arg(2)->Arg(4)->Arg(8);
// Same implementation using each of the available execution contexts
BENCHMARK(benchmarkExecutionContextBatched<PoolExecutionContext>)
->ArgsProduct({
{1, 2, 4, 8}, // threads
{500, 1000, 5000, 10000} // batch size
});
BENCHMARK(benchmarkExecutionContextBatched<CoroExecutionContext>)
->ArgsProduct({
{1, 2, 4, 8}, // threads
{500, 1000, 5000, 10000} // batch size
});
BENCHMARK(benchmarkExecutionContextBatched<SyncExecutionContext>)
->ArgsProduct({
{1, 2, 4, 8}, // threads
{500, 1000, 5000, 10000} // batch size
});
// Same implementations going thru AnyExecutionContext
BENCHMARK(benchmarkAnyExecutionContextBatched<PoolExecutionContext>)
->ArgsProduct({
{1, 2, 4, 8}, // threads
{500, 1000, 5000, 10000} // batch size
});
BENCHMARK(benchmarkAnyExecutionContextBatched<CoroExecutionContext>)
->ArgsProduct({
{1, 2, 4, 8}, // threads
{500, 1000, 5000, 10000} // batch size
});
BENCHMARK(benchmarkAnyExecutionContextBatched<SyncExecutionContext>)
->ArgsProduct({
{1, 2, 4, 8}, // threads
{500, 1000, 5000, 10000} // batch size
});


@@ -1,149 +0,0 @@
//------------------------------------------------------------------------------
/*
This file is part of clio: https://github.com/XRPLF/clio
Copyright (c) 2025, the clio developers.
Permission to use, copy, modify, and distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
*/
//==============================================================================
#include "util/config/ConfigDefinition.hpp"
#include "util/log/Logger.hpp"
#include "util/prometheus/Prometheus.hpp"
#include <benchmark/benchmark.h>
#include <fmt/format.h>
#include <spdlog/async.h>
#include <spdlog/async_logger.h>
#include <spdlog/spdlog.h>
#include <barrier>
#include <chrono>
#include <cstddef>
#include <filesystem>
#include <memory>
#include <string>
#include <thread>
#include <utility>
#include <vector>
using namespace util;
static constexpr auto kLOG_FORMAT = "%Y-%m-%d %H:%M:%S.%f %^%3!l:%n%$ - %v";
struct BenchmarkLoggingInitializer {
[[nodiscard]] static std::shared_ptr<spdlog::sinks::sink>
createFileSink(LogService::FileLoggingParams const& params)
{
return LogService::createFileSink(params, kLOG_FORMAT);
}
static Logger
getLogger(std::shared_ptr<spdlog::logger> logger)
{
return Logger(std::move(logger));
}
};
namespace {
std::string
uniqueLogDir()
{
auto const epochTime = std::chrono::high_resolution_clock::now().time_since_epoch();
auto const tmpDir = std::filesystem::temp_directory_path();
std::string const dirName =
fmt::format("logs_{}", std::chrono::duration_cast<std::chrono::microseconds>(epochTime).count());
return tmpDir / "clio_benchmark" / dirName;
}
} // anonymous namespace
static void
benchmarkConcurrentFileLogging(benchmark::State& state)
{
auto const numThreads = static_cast<size_t>(state.range(0));
auto const messagesPerThread = static_cast<size_t>(state.range(1));
PrometheusService::init(config::getClioConfig());
auto const logDir = uniqueLogDir();
for (auto _ : state) {
state.PauseTiming();
std::filesystem::create_directories(logDir);
static constexpr size_t kQUEUE_SIZE = 8192;
static constexpr size_t kTHREAD_COUNT = 1;
spdlog::init_thread_pool(kQUEUE_SIZE, kTHREAD_COUNT);
auto fileSink = BenchmarkLoggingInitializer::createFileSink({
.logDir = logDir,
.rotationSizeMB = 5,
.dirMaxFiles = 25,
});
std::vector<std::thread> threads;
threads.reserve(numThreads);
std::chrono::high_resolution_clock::time_point start;
std::barrier barrier(numThreads, [&state, &start]() {
state.ResumeTiming();
start = std::chrono::high_resolution_clock::now();
});
for (size_t threadNum = 0; threadNum < numThreads; ++threadNum) {
threads.emplace_back([threadNum, messagesPerThread, fileSink, &barrier]() {
std::string const channel = fmt::format("Thread_{}", threadNum);
auto logger = std::make_shared<spdlog::async_logger>(
channel, fileSink, spdlog::thread_pool(), spdlog::async_overflow_policy::block
);
spdlog::register_logger(logger);
Logger const threadLogger = BenchmarkLoggingInitializer::getLogger(std::move(logger));
barrier.arrive_and_wait();
for (size_t messageNum = 0; messageNum < messagesPerThread; ++messageNum) {
LOG(threadLogger.info()) << "Test log message #" << messageNum;
}
});
}
for (auto& thread : threads) {
thread.join();
}
spdlog::shutdown();
auto const end = std::chrono::high_resolution_clock::now();
state.SetIterationTime(std::chrono::duration_cast<std::chrono::duration<double>>(end - start).count());
std::filesystem::remove_all(logDir);
}
auto const totalMessages = numThreads * messagesPerThread;
state.counters["TotalMessagesRate"] = benchmark::Counter(totalMessages, benchmark::Counter::kIsRate);
state.counters["Threads"] = numThreads;
state.counters["MessagesPerThread"] = messagesPerThread;
}
// One line of log message is around 110 bytes
// So, 100K messages is around 10.5MB
BENCHMARK(benchmarkConcurrentFileLogging)
->ArgsProduct({
// Number of threads
{1, 2, 4, 8},
// Messages per thread
{10'000, 100'000, 500'000, 1'000'000, 10'000'000},
})
->UseManualTime()
->Unit(benchmark::kMillisecond);


@@ -1,101 +0,0 @@
# git-cliff ~ default configuration file
# https://git-cliff.org/docs/configuration
#
# Lines starting with "#" are comments.
# Configuration options are organized into tables and keys.
# See documentation for more information on available options.
[changelog]
# template for the changelog header
header = """
"""
# template for the changelog body
# https://keats.github.io/tera/docs/#introduction
body = """
{% if version %}\
Version {{ version | trim_start_matches(pat="v") }} of Clio, an XRP Ledger API server optimized for HTTP and WebSocket API calls, is now available.
{% else %}\
Clio, an XRP Ledger API server optimized for HTTP and WebSocket API calls, is under active development.
{% endif %}\
<!-- Please, remove one of the 2 following lines -->
This release adds new features and bug fixes.
This release adds bug fixes.
\
{% if version %}
## [{{ version | trim_start_matches(pat="v") }}] - {{ timestamp | date(format="%Y-%m-%d") }}
{% else %}
## [unreleased]
{% endif %}\
{% for group, commits in commits | filter(attribute="merge_commit", value=false) | group_by(attribute="group") %}
### {{ group | striptags | trim | upper_first }}
{% for commit in commits %}
- {% if commit.scope %}*({{ commit.scope }})* {% endif %}\
{% if commit.breaking %}[**breaking**] {% endif %}\
{{ commit.message | upper_first }}{% if commit.remote.username %} by @{{ commit.remote.username }}{% endif %}\
{% endfor %}
{% endfor %}\n
"""
# template for the changelog footer
footer = """
<!-- generated by git-cliff -->
"""
# remove the leading and trailing whitespace from the templates
trim = true
# postprocessors
postprocessors = [
# { pattern = '<REPO>', replace = "https://github.com/orhun/git-cliff" }, # replace repository URL
]
# render body even when there are no releases to process
# render_always = true
# output file path
output = "CHANGELOG.md"
[git]
# parse the commits based on https://www.conventionalcommits.org
conventional_commits = true
# filter out the commits that are not conventional
filter_unconventional = true
# process each line of a commit as an individual commit
split_commits = false
# regex for preprocessing the commit messages
commit_preprocessors = [
# Replace issue numbers
#{ pattern = '\((\w+\s)?#([0-9]+)\)', replace = "([#${2}](<REPO>/issues/${2}))"},
# Check spelling of the commit with https://github.com/crate-ci/typos
# If the spelling is incorrect, it will be automatically fixed.
#{ pattern = '.*', replace_command = 'typos --write-changes -' },
]
# regex for parsing and grouping commits
commit_parsers = [
{ message = "^feat", group = "<!-- 0 -->🚀 Features" },
{ message = "^fix", group = "<!-- 1 -->🐛 Bug Fixes" },
{ message = "^doc", group = "<!-- 3 -->📚 Documentation" },
{ message = "^perf", group = "<!-- 4 -->⚡ Performance" },
{ message = "^refactor", group = "<!-- 2 -->🚜 Refactor" },
{ message = "^style.*[Cc]lang-tidy auto fixes", skip = true },
{ message = "^style", group = "<!-- 5 -->🎨 Styling" },
{ message = "^test", group = "<!-- 6 -->🧪 Testing" },
{ message = "^chore\\(release\\): prepare for", skip = true },
{ message = "^chore: Commits", skip = true },
{ message = "^chore\\(deps.*\\)", skip = true },
{ message = "^chore\\(pr\\)", skip = true },
{ message = "^chore\\(pull\\)", skip = true },
{ message = "^chore|^ci", group = "<!-- 7 -->⚙️ Miscellaneous Tasks" },
{ body = ".*security", group = "<!-- 8 -->🛡️ Security" },
{ message = "^revert", group = "<!-- 9 -->◀️ Revert" },
{ message = ".*", group = "<!-- 10 -->💼 Other" },
]
# filter out the commits that are not matched by commit parsers
filter_commits = false
# sort the tags topologically
topo_order = false
# sort the commits inside sections by oldest/newest order
sort_commits = "oldest"
ignore_tags = "^.*-[b|rc].*"
[remote.github]
owner = "XRPLF"
repo = "clio"

38
cloud-example-config.json Normal file

@@ -0,0 +1,38 @@
{
"database":
{
"type":"cassandra",
"cassandra":
{
"secure_connect_bundle":"[path/to/zip. ignore if using contact_points]",
"contact_points":"[ip. ignore if using secure_connect_bundle]",
"port":"[port. ignore if using_secure_connect_bundle]",
"keyspace":"clio",
"username":"[username, if any]",
"password":"[password, if any]",
"max_requests_outstanding":25000,
"threads":8
}
},
"etl_sources":
[
{
"ip":"[rippled ip]",
"ws_port":"6006",
"grpc_port":"50051"
}
],
"dos_guard":
{
"whitelist":["127.0.0.1"]
},
"server":{
"ip":"0.0.0.0",
"port":8080
},
"log_level":"debug",
"log_file":"./clio.log",
"online_delete":0,
"extractor_threads":8,
"read_only":false
}


@@ -1,5 +0,0 @@
find_program(CCACHE_PATH "ccache")
if (CCACHE_PATH)
set(CMAKE_CXX_COMPILER_LAUNCHER "${CCACHE_PATH}")
message(STATUS "Using ccache: ${CCACHE_PATH}")
endif ()


@@ -1,42 +0,0 @@
if (CMAKE_CXX_COMPILER_ID STREQUAL "Clang")
if (CMAKE_CXX_COMPILER_VERSION VERSION_LESS 16)
message(FATAL_ERROR "Clang 16+ required for building clio")
endif ()
set(is_clang TRUE)
elseif (CMAKE_CXX_COMPILER_ID STREQUAL "AppleClang")
if (CMAKE_CXX_COMPILER_VERSION VERSION_LESS 15)
message(FATAL_ERROR "AppleClang 15+ required for building clio")
endif ()
set(is_appleclang TRUE)
elseif (CMAKE_CXX_COMPILER_ID STREQUAL "GNU")
if (CMAKE_CXX_COMPILER_VERSION VERSION_LESS 12)
message(FATAL_ERROR "GCC 12+ required for building clio")
endif ()
set(is_gcc TRUE)
else ()
message(FATAL_ERROR "Supported compilers: AppleClang 15+, Clang 16+, GCC 12+")
endif ()
if (san)
string(TOLOWER ${san} san)
set(SAN_FLAG "-fsanitize=${san}")
set(SAN_LIB "")
if (is_gcc)
if (san STREQUAL "address")
set(SAN_LIB "asan")
elseif (san STREQUAL "thread")
set(SAN_LIB "tsan")
elseif (san STREQUAL "memory")
set(SAN_LIB "msan")
elseif (san STREQUAL "undefined")
set(SAN_LIB "ubsan")
endif ()
endif ()
set(_saved_CRL ${CMAKE_REQUIRED_LIBRARIES})
set(CMAKE_REQUIRED_LIBRARIES "${SAN_FLAG};${SAN_LIB}")
check_cxx_compiler_flag(${SAN_FLAG} COMPILER_SUPPORTS_SAN)
set(CMAKE_REQUIRED_LIBRARIES ${_saved_CRL})
if (NOT COMPILER_SUPPORTS_SAN)
message(FATAL_ERROR "${san} sanitizer does not seem to be supported by your compiler")
endif ()
endif ()

View File

@@ -1,33 +0,0 @@
if (lint)
# Find clang-tidy binary
if (DEFINED ENV{CLIO_CLANG_TIDY_BIN})
set(_CLANG_TIDY_BIN $ENV{CLIO_CLANG_TIDY_BIN})
if ((NOT EXISTS ${_CLANG_TIDY_BIN}) OR IS_DIRECTORY ${_CLANG_TIDY_BIN})
message(FATAL_ERROR "$ENV{CLIO_CLANG_TIDY_BIN} no such file. Check CLIO_CLANG_TIDY_BIN env variable")
endif ()
message(STATUS "Using clang-tidy from CLIO_CLANG_TIDY_BIN")
else ()
find_program(_CLANG_TIDY_BIN NAMES "clang-tidy-20" "clang-tidy" REQUIRED)
endif ()
if (NOT _CLANG_TIDY_BIN)
message(
FATAL_ERROR
"clang-tidy binary not found. Please set the CLIO_CLANG_TIDY_BIN environment variable or install clang-tidy."
)
endif ()
# Support for https://github.com/matus-chochlik/ctcache
find_program(CLANG_TIDY_CACHE_PATH NAMES "clang-tidy-cache")
if (CLANG_TIDY_CACHE_PATH)
set(_CLANG_TIDY_CMD "${CLANG_TIDY_CACHE_PATH};${_CLANG_TIDY_BIN}"
CACHE STRING "A combined command to run clang-tidy with caching wrapper"
)
else ()
set(_CLANG_TIDY_CMD "${_CLANG_TIDY_BIN}")
endif ()
set(CMAKE_CXX_CLANG_TIDY "${_CLANG_TIDY_CMD};--quiet")
message(STATUS "Using clang-tidy: ${CMAKE_CXX_CLANG_TIDY}")
endif ()

View File

@@ -1,8 +0,0 @@
include("${CMAKE_CURRENT_LIST_DIR}/ClioVersion.cmake")
set(CPACK_PACKAGING_INSTALL_PREFIX "/opt/clio")
set(CPACK_PACKAGE_VERSION "${CLIO_VERSION}")
set(CPACK_STRIP_FILES TRUE)
include(pkg/deb)
include(CPack)

View File

@@ -1,47 +0,0 @@
find_package(Git REQUIRED)
set(GIT_COMMAND describe --tags --exact-match)
execute_process(
COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND}
WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}
OUTPUT_VARIABLE TAG
RESULT_VARIABLE RC
ERROR_VARIABLE ERR
OUTPUT_STRIP_TRAILING_WHITESPACE ERROR_STRIP_TRAILING_WHITESPACE
)
if (RC EQUAL 0)
message(STATUS "Found tag '${TAG}' in git. Will use it as Clio version")
set(CLIO_VERSION "${TAG}")
set(DOC_CLIO_VERSION "${TAG}")
else ()
message(STATUS "Error finding tag in git: ${ERR}")
message(STATUS "Will use 'YYYYMMDDHMS-<branch>-<git-rev>' as Clio version")
set(GIT_COMMAND show -s --date=format:%Y%m%d%H%M%S --format=%cd)
execute_process(
COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND} WORKING_DIRECTORY ${CMAKE_SOURCE_DIR} OUTPUT_VARIABLE DATE
OUTPUT_STRIP_TRAILING_WHITESPACE COMMAND_ERROR_IS_FATAL ANY
)
set(GIT_COMMAND branch --show-current)
execute_process(
COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND} WORKING_DIRECTORY ${CMAKE_SOURCE_DIR} OUTPUT_VARIABLE BRANCH
OUTPUT_STRIP_TRAILING_WHITESPACE COMMAND_ERROR_IS_FATAL ANY
)
set(GIT_COMMAND rev-parse --short HEAD)
execute_process(
COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND} WORKING_DIRECTORY ${CMAKE_SOURCE_DIR} OUTPUT_VARIABLE REV
OUTPUT_STRIP_TRAILING_WHITESPACE COMMAND_ERROR_IS_FATAL ANY
)
set(CLIO_VERSION "${DATE}-${BRANCH}-${REV}")
set(DOC_CLIO_VERSION "develop")
endif ()
if (CMAKE_BUILD_TYPE MATCHES Debug)
set(CLIO_VERSION "${CLIO_VERSION}+DEBUG")
endif ()
message(STATUS "Build version: ${CLIO_VERSION}")

View File

@@ -1,361 +0,0 @@
# Copyright (c) 2012 - 2017, Lars Bilke All rights reserved.
#
# Redistribution and use in source and binary forms, with or without modification, are permitted provided that the
# following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following
# disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following
# disclaimer in the documentation and/or other materials provided with the distribution.
#
# 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote
# products derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES,
# INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
# WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
# CHANGES:
#
# 2012-01-31, Lars Bilke - Enable Code Coverage
#
# 2013-09-17, Joakim Söderberg - Added support for Clang. - Some additional usage instructions.
#
# 2016-02-03, Lars Bilke - Refactored functions to use named parameters
#
# 2017-06-02, Lars Bilke - Merged with modified version from github.com/ufz/ogs
#
# 2019-05-06, Anatolii Kurotych - Remove unnecessary --coverage flag
#
# 2019-12-13, FeRD (Frank Dana) - Deprecate COVERAGE_LCOVR_EXCLUDES and COVERAGE_GCOVR_EXCLUDES lists in favor of
# tool-agnostic COVERAGE_EXCLUDES variable, or EXCLUDE setup arguments. - CMake 3.4+: All excludes can be specified
# relative to BASE_DIRECTORY - All setup functions: accept BASE_DIRECTORY, EXCLUDE list - Set lcov basedir with -b
# argument - Add automatic --demangle-cpp in lcovr, if 'c++filt' is available (can be overridden with NO_DEMANGLE option
# in setup_target_for_coverage_lcovr().) - Delete output dir, .info file on 'make clean' - Remove Python detection,
# since version mismatches will break gcovr - Minor cleanup (lowercase function names, update examples...)
#
# 2019-12-19, FeRD (Frank Dana) - Rename Lcov outputs, make filtered file canonical, fix cleanup for targets
#
# 2020-01-19, Bob Apthorpe - Added gfortran support
#
# 2020-02-17, FeRD (Frank Dana) - Make all add_custom_target()s VERBATIM to auto-escape wildcard characters in EXCLUDEs,
# and remove manual escaping from gcovr targets
#
# 2021-01-19, Robin Mueller - Add CODE_COVERAGE_VERBOSE option which will allow to print out commands which are run -
# Added the option for users to set the GCOVR_ADDITIONAL_ARGS variable to supply additional flags to the gcovr command
#
# 2020-05-04, Mihchael Davis - Add -fprofile-abs-path to make gcno files contain absolute paths - Fix BASE_DIRECTORY not
# working when defined - Change BYPRODUCT from folder to index.html to stop ninja from complaining about double defines
#
# 2021-05-10, Martin Stump - Check if the generator is multi-config before warning about non-Debug builds
#
# 2022-02-22, Marko Wehle - Change gcovr output from -o <filename> for --xml <filename> and --html <filename> output
# respectively. This will allow for Multiple Output Formats at the same time by making use of GCOVR_ADDITIONAL_ARGS,
# e.g. GCOVR_ADDITIONAL_ARGS "--txt".
#
# 2022-09-28, Sebastian Mueller - fix append_coverage_compiler_flags_to_target to correctly add flags - replace
# "-fprofile-arcs -ftest-coverage" with "--coverage" (equivalent)
#
# 2023-12-15, Bronek Kozicki - remove setup_target_for_coverage_lcov (slow) and setup_target_for_coverage_fastcov (no
# support for Clang) - fix Clang support by adding find_program( ... llvm-cov ) - add Apple Clang support by adding
# execute_process( COMMAND xcrun -f llvm-cov ... ) - add CODE_COVERAGE_GCOV_TOOL to explicitly select gcov tool and
# disable find_program - replace both functions setup_target_for_coverage_gcovr_* with single
# setup_target_for_coverage_gcovr - add support for all gcovr output formats
#
# USAGE:
#
# 1. Copy this file into your cmake modules path.
#
# 2. Add the following line to your CMakeLists.txt (best inside an if-condition using a CMake option() to enable it just
# optionally): include(CodeCoverage)
#
# 3. Append necessary compiler flags for all supported source files: append_coverage_compiler_flags() Or for specific
# target: append_coverage_compiler_flags_to_target(YOUR_TARGET_NAME)
#
# 3.a (OPTIONAL) Set appropriate optimization flags, e.g. -O0, -O1 or -Og
#
# 4. If you need to exclude additional directories from the report, specify them using full paths in the
# COVERAGE_EXCLUDES variable before calling setup_target_for_coverage_*(). Example: set(COVERAGE_EXCLUDES
# '${PROJECT_SOURCE_DIR}/src/dir1/*'
# '/path/to/my/src/dir2/*') Or, use the EXCLUDE argument to setup_target_for_coverage_*(). Example:
# setup_target_for_coverage_gcovr( NAME coverage EXECUTABLE testrunner EXCLUDE "${PROJECT_SOURCE_DIR}/src/dir1/*"
# "/path/to/my/src/dir2/*")
#
# 4.a NOTE: With CMake 3.4+, COVERAGE_EXCLUDES or EXCLUDE can also be set relative to the BASE_DIRECTORY (default:
# PROJECT_SOURCE_DIR) Example: set(COVERAGE_EXCLUDES "dir1/*") setup_target_for_coverage_gcovr( NAME coverage EXECUTABLE
# testrunner FORMAT html-details BASE_DIRECTORY "${PROJECT_SOURCE_DIR}/src" EXCLUDE "dir2/*")
#
# 4.b If you need to pass specific options to gcovr, specify them in GCOVR_ADDITIONAL_ARGS variable. Example: set
# (GCOVR_ADDITIONAL_ARGS --exclude-throw-branches --exclude-noncode-lines -s) setup_target_for_coverage_gcovr( NAME
# coverage EXECUTABLE testrunner EXCLUDE "src/dir1" "src/dir2")
#
# 5. Use the functions described below to create a custom make target which runs your test executable and produces a code
# coverage report.
#
# 6. Build a Debug build: cmake -DCMAKE_BUILD_TYPE=Debug .. && make && make my_coverage_target
include(CMakeParseArguments)
option(CODE_COVERAGE_VERBOSE "Verbose information" FALSE)
# Check prereqs
find_program(GCOVR_PATH gcovr PATHS ${CMAKE_SOURCE_DIR}/scripts/test)
if (DEFINED CODE_COVERAGE_GCOV_TOOL)
set(GCOV_TOOL "${CODE_COVERAGE_GCOV_TOOL}")
elseif (DEFINED ENV{CODE_COVERAGE_GCOV_TOOL})
set(GCOV_TOOL "$ENV{CODE_COVERAGE_GCOV_TOOL}")
elseif ("${CMAKE_CXX_COMPILER_ID}" MATCHES "(Apple)?[Cc]lang")
if (APPLE)
execute_process(COMMAND xcrun -f llvm-cov OUTPUT_VARIABLE LLVMCOV_PATH OUTPUT_STRIP_TRAILING_WHITESPACE)
else ()
find_program(LLVMCOV_PATH llvm-cov)
endif ()
if (LLVMCOV_PATH)
set(GCOV_TOOL "${LLVMCOV_PATH} gcov")
endif ()
elseif ("${CMAKE_CXX_COMPILER_ID}" MATCHES "GNU")
find_program(GCOV_PATH gcov)
set(GCOV_TOOL "${GCOV_PATH}")
endif ()
# Check supported compiler (Clang, GNU and Flang)
get_property(LANGUAGES GLOBAL PROPERTY ENABLED_LANGUAGES)
foreach (LANG ${LANGUAGES})
if ("${CMAKE_${LANG}_COMPILER_ID}" MATCHES "(Apple)?[Cc]lang")
if ("${CMAKE_${LANG}_COMPILER_VERSION}" VERSION_LESS 3)
message(FATAL_ERROR "Clang version must be 3.0.0 or greater! Aborting...")
endif ()
elseif (NOT "${CMAKE_${LANG}_COMPILER_ID}" MATCHES "GNU" AND NOT "${CMAKE_${LANG}_COMPILER_ID}" MATCHES
"(LLVM)?[Ff]lang"
)
message(FATAL_ERROR "Compiler is not GNU or Flang! Aborting...")
endif ()
endforeach ()
set(COVERAGE_COMPILER_FLAGS "-g --coverage" CACHE INTERNAL "")
if (CMAKE_CXX_COMPILER_ID MATCHES "(GNU|Clang)")
include(CheckCXXCompilerFlag)
check_cxx_compiler_flag(-fprofile-abs-path HAVE_cxx_fprofile_abs_path)
if (HAVE_cxx_fprofile_abs_path)
set(COVERAGE_CXX_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-abs-path")
endif ()
include(CheckCCompilerFlag)
check_c_compiler_flag(-fprofile-abs-path HAVE_c_fprofile_abs_path)
if (HAVE_c_fprofile_abs_path)
set(COVERAGE_C_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-abs-path")
endif ()
endif ()
set(CMAKE_Fortran_FLAGS_COVERAGE ${COVERAGE_COMPILER_FLAGS}
CACHE STRING "Flags used by the Fortran compiler during coverage builds." FORCE
)
set(CMAKE_CXX_FLAGS_COVERAGE ${COVERAGE_COMPILER_FLAGS}
CACHE STRING "Flags used by the C++ compiler during coverage builds." FORCE
)
set(CMAKE_C_FLAGS_COVERAGE ${COVERAGE_COMPILER_FLAGS}
CACHE STRING "Flags used by the C compiler during coverage builds." FORCE
)
set(CMAKE_EXE_LINKER_FLAGS_COVERAGE "" CACHE STRING "Flags used for linking binaries during coverage builds." FORCE)
set(CMAKE_SHARED_LINKER_FLAGS_COVERAGE ""
CACHE STRING "Flags used by the shared libraries linker during coverage builds." FORCE
)
mark_as_advanced(
CMAKE_Fortran_FLAGS_COVERAGE CMAKE_CXX_FLAGS_COVERAGE CMAKE_C_FLAGS_COVERAGE CMAKE_EXE_LINKER_FLAGS_COVERAGE
CMAKE_SHARED_LINKER_FLAGS_COVERAGE
)
get_property(GENERATOR_IS_MULTI_CONFIG GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
if (NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG))
message(WARNING "Code coverage results with an optimised (non-Debug) build may be misleading")
endif () # NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG)
if (CMAKE_C_COMPILER_ID STREQUAL "GNU" OR CMAKE_Fortran_COMPILER_ID STREQUAL "GNU")
link_libraries(gcov)
endif ()
# Defines a target for running and collecting code coverage information. Builds dependencies, runs the given executable
# and outputs reports. NOTE! The executable should always have a ZERO as exit code, otherwise the coverage generation
# will not complete.
#
# setup_target_for_coverage_gcovr(
#     NAME ctest_coverage                    # New target name
#     EXECUTABLE ctest -j ${PROCESSOR_COUNT} # Executable in PROJECT_BINARY_DIR
#     DEPENDENCIES executable_target         # Dependencies to build first
#     BASE_DIRECTORY "../"                   # Base directory for report (defaults to PROJECT_SOURCE_DIR)
#     FORMAT "cobertura"                     # Output format, one of: xml cobertura sonarqube json-summary
#                                            # json-details coveralls csv txt html-single html-nested html-details
#                                            # (xml is an alias to cobertura; if no format is set, defaults to xml)
#     EXCLUDE "src/dir1/*" "src/dir2/*"      # Patterns to exclude (can be relative to BASE_DIRECTORY, with CMake 3.4+)
# )
# The user can set the variable GCOVR_ADDITIONAL_ARGS to supply additional flags to the GCOVR command.
function (setup_target_for_coverage_gcovr)
set(options NONE)
set(oneValueArgs BASE_DIRECTORY NAME FORMAT)
set(multiValueArgs EXCLUDE EXECUTABLE EXECUTABLE_ARGS DEPENDENCIES)
cmake_parse_arguments(Coverage "${options}" "${oneValueArgs}" "${multiValueArgs}" ${ARGN})
if (NOT GCOV_TOOL)
message(FATAL_ERROR "Could not find gcov or llvm-cov tool! Aborting...")
endif ()
if (NOT GCOVR_PATH)
message(FATAL_ERROR "Could not find gcovr tool! Aborting...")
endif ()
# Set base directory (as absolute path), or default to PROJECT_SOURCE_DIR
if (DEFINED Coverage_BASE_DIRECTORY)
get_filename_component(BASEDIR ${Coverage_BASE_DIRECTORY} ABSOLUTE)
else ()
set(BASEDIR ${PROJECT_SOURCE_DIR})
endif ()
if (NOT DEFINED Coverage_FORMAT)
set(Coverage_FORMAT xml)
endif ()
if ("--output" IN_LIST GCOVR_ADDITIONAL_ARGS)
message(FATAL_ERROR "Unsupported --output option detected in GCOVR_ADDITIONAL_ARGS! Aborting...")
else ()
if ((Coverage_FORMAT STREQUAL "html-details") OR (Coverage_FORMAT STREQUAL "html-nested"))
set(GCOVR_OUTPUT_FILE ${PROJECT_BINARY_DIR}/${Coverage_NAME}/index.html)
set(GCOVR_CREATE_FOLDER ${PROJECT_BINARY_DIR}/${Coverage_NAME})
elseif (Coverage_FORMAT STREQUAL "html-single")
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.html)
elseif ((Coverage_FORMAT STREQUAL "json-summary") OR (Coverage_FORMAT STREQUAL "json-details")
OR (Coverage_FORMAT STREQUAL "coveralls")
)
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.json)
elseif (Coverage_FORMAT STREQUAL "txt")
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.txt)
elseif (Coverage_FORMAT STREQUAL "csv")
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.csv)
else ()
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.xml)
endif ()
endif ()
if ((Coverage_FORMAT STREQUAL "cobertura") OR (Coverage_FORMAT STREQUAL "xml"))
list(APPEND GCOVR_ADDITIONAL_ARGS --cobertura "${GCOVR_OUTPUT_FILE}")
list(APPEND GCOVR_ADDITIONAL_ARGS --cobertura-pretty)
set(Coverage_FORMAT cobertura) # overwrite xml
elseif (Coverage_FORMAT STREQUAL "sonarqube")
list(APPEND GCOVR_ADDITIONAL_ARGS --sonarqube "${GCOVR_OUTPUT_FILE}")
elseif (Coverage_FORMAT STREQUAL "json-summary")
list(APPEND GCOVR_ADDITIONAL_ARGS --json-summary "${GCOVR_OUTPUT_FILE}")
list(APPEND GCOVR_ADDITIONAL_ARGS --json-summary-pretty)
elseif (Coverage_FORMAT STREQUAL "json-details")
list(APPEND GCOVR_ADDITIONAL_ARGS --json "${GCOVR_OUTPUT_FILE}")
list(APPEND GCOVR_ADDITIONAL_ARGS --json-pretty)
elseif (Coverage_FORMAT STREQUAL "coveralls")
list(APPEND GCOVR_ADDITIONAL_ARGS --coveralls "${GCOVR_OUTPUT_FILE}")
list(APPEND GCOVR_ADDITIONAL_ARGS --coveralls-pretty)
elseif (Coverage_FORMAT STREQUAL "csv")
list(APPEND GCOVR_ADDITIONAL_ARGS --csv "${GCOVR_OUTPUT_FILE}")
elseif (Coverage_FORMAT STREQUAL "txt")
list(APPEND GCOVR_ADDITIONAL_ARGS --txt "${GCOVR_OUTPUT_FILE}")
elseif (Coverage_FORMAT STREQUAL "html-single")
list(APPEND GCOVR_ADDITIONAL_ARGS --html "${GCOVR_OUTPUT_FILE}")
list(APPEND GCOVR_ADDITIONAL_ARGS --html-self-contained)
elseif (Coverage_FORMAT STREQUAL "html-nested")
list(APPEND GCOVR_ADDITIONAL_ARGS --html-nested "${GCOVR_OUTPUT_FILE}")
elseif (Coverage_FORMAT STREQUAL "html-details")
list(APPEND GCOVR_ADDITIONAL_ARGS --html-details "${GCOVR_OUTPUT_FILE}")
else ()
message(FATAL_ERROR "Unsupported output style ${Coverage_FORMAT}! Aborting...")
endif ()
# Collect excludes (CMake 3.4+: Also compute absolute paths)
set(GCOVR_EXCLUDES "")
foreach (EXCLUDE ${Coverage_EXCLUDE} ${COVERAGE_EXCLUDES} ${COVERAGE_GCOVR_EXCLUDES})
if (CMAKE_VERSION VERSION_GREATER 3.4)
get_filename_component(EXCLUDE ${EXCLUDE} ABSOLUTE BASE_DIR ${BASEDIR})
endif ()
list(APPEND GCOVR_EXCLUDES "${EXCLUDE}")
endforeach ()
list(REMOVE_DUPLICATES GCOVR_EXCLUDES)
# Combine excludes to several -e arguments
set(GCOVR_EXCLUDE_ARGS "")
foreach (EXCLUDE ${GCOVR_EXCLUDES})
list(APPEND GCOVR_EXCLUDE_ARGS "-e")
list(APPEND GCOVR_EXCLUDE_ARGS "${EXCLUDE}")
endforeach ()
# Set up the commands which will be run to generate coverage data. Run the tests:
set(GCOVR_EXEC_TESTS_CMD ${Coverage_EXECUTABLE} ${Coverage_EXECUTABLE_ARGS})
# Create folder
if (DEFINED GCOVR_CREATE_FOLDER)
set(GCOVR_FOLDER_CMD ${CMAKE_COMMAND} -E make_directory ${GCOVR_CREATE_FOLDER})
else ()
set(GCOVR_FOLDER_CMD echo) # dummy
endif ()
# Running gcovr
set(GCOVR_CMD
${GCOVR_PATH}
--gcov-executable
${GCOV_TOOL}
--gcov-ignore-parse-errors=negative_hits.warn_once_per_file
-r
${BASEDIR}
${GCOVR_ADDITIONAL_ARGS}
${GCOVR_EXCLUDE_ARGS}
--object-directory=${PROJECT_BINARY_DIR}
)
if (CODE_COVERAGE_VERBOSE)
message(STATUS "Executed command report")
message(STATUS "Command to run tests: ")
string(REPLACE ";" " " GCOVR_EXEC_TESTS_CMD_SPACED "${GCOVR_EXEC_TESTS_CMD}")
message(STATUS "${GCOVR_EXEC_TESTS_CMD_SPACED}")
if (NOT GCOVR_FOLDER_CMD STREQUAL "echo")
message(STATUS "Command to create a folder: ")
string(REPLACE ";" " " GCOVR_FOLDER_CMD_SPACED "${GCOVR_FOLDER_CMD}")
message(STATUS "${GCOVR_FOLDER_CMD_SPACED}")
endif ()
message(STATUS "Command to generate gcovr coverage data: ")
string(REPLACE ";" " " GCOVR_CMD_SPACED "${GCOVR_CMD}")
message(STATUS "${GCOVR_CMD_SPACED}")
endif ()
add_custom_target(
${Coverage_NAME}
COMMAND ${GCOVR_EXEC_TESTS_CMD}
COMMAND ${GCOVR_FOLDER_CMD}
COMMAND ${GCOVR_CMD}
BYPRODUCTS ${GCOVR_OUTPUT_FILE}
WORKING_DIRECTORY ${PROJECT_BINARY_DIR}
DEPENDS ${Coverage_DEPENDENCIES}
VERBATIM # Protect arguments to commands
COMMENT "Running gcovr to produce code coverage report."
)
# Show info where to find the report
add_custom_command(
TARGET ${Coverage_NAME} POST_BUILD COMMAND ;
COMMENT "Code coverage report saved in ${GCOVR_OUTPUT_FILE} formatted as ${Coverage_FORMAT}"
)
endfunction () # setup_target_for_coverage_gcovr
function (append_coverage_compiler_flags)
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
set(CMAKE_Fortran_FLAGS "${CMAKE_Fortran_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
message(STATUS "Appending code coverage compiler flags: ${COVERAGE_COMPILER_FLAGS}")
endfunction () # append_coverage_compiler_flags
# Setup coverage for specific library
function (append_coverage_compiler_flags_to_target name mode)
separate_arguments(_flag_list NATIVE_COMMAND "${COVERAGE_COMPILER_FLAGS}")
target_compile_options(${name} ${mode} ${_flag_list})
if (CMAKE_C_COMPILER_ID STREQUAL "GNU" OR CMAKE_Fortran_COMPILER_ID STREQUAL "GNU")
target_link_libraries(${name} ${mode} gcov)
endif ()
endfunction ()
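
For readers unfamiliar with gcovr, the custom target above boils down to running the tests and then a single gcovr invocation. The following Python sketch (illustrative only; the tool name, paths, and exclude pattern are assumptions) shows roughly the command line that `GCOVR_CMD` assembles for the default `cobertura` format:

```python
import shlex


def gcovr_command(
    gcov_tool: str = "gcov",
    base_dir: str = ".",
    build_dir: str = "build",
    output_file: str = "coverage.xml",
    excludes: tuple = ("tests/*",),
) -> str:
    """Approximate GCOVR_CMD from setup_target_for_coverage_gcovr for FORMAT cobertura."""
    cmd = [
        "gcovr",
        "--gcov-executable", gcov_tool,
        "--gcov-ignore-parse-errors=negative_hits.warn_once_per_file",
        "-r", base_dir,
        "--cobertura", output_file,
        "--cobertura-pretty",
    ]
    for pattern in excludes:  # one "-e <pattern>" per exclude, like GCOVR_EXCLUDE_ARGS
        cmd += ["-e", pattern]
    cmd.append(f"--object-directory={build_dir}")
    return shlex.join(cmd)


if __name__ == "__main__":
    print(gcovr_command())
```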

View File

@@ -1,20 +0,0 @@
find_package(Doxygen REQUIRED)
# See Doxyfile for these settings:
set(SOURCE ${CMAKE_CURRENT_SOURCE_DIR}/..)
set(USE_DOT "YES")
set(LINT "NO")
set(EXCLUDES "")
# ---
set(DOXYGEN_IN ${CMAKE_CURRENT_SOURCE_DIR}/Doxyfile)
set(DOXYGEN_OUT ${CMAKE_CURRENT_BINARY_DIR}/Doxyfile)
configure_file(${DOXYGEN_IN} ${DOXYGEN_OUT})
add_custom_target(
docs
COMMAND ${DOXYGEN_EXECUTABLE} ${DOXYGEN_OUT}
WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}
COMMENT "Generating API documentation with Doxygen"
VERBATIM
)

View File

@@ -1,11 +0,0 @@
if (DEFINED CMAKE_LINKER_TYPE)
message(STATUS "Custom linker is already set: ${CMAKE_LINKER_TYPE}")
return()
endif ()
find_program(MOLD_PATH mold)
if (MOLD_PATH AND CMAKE_SYSTEM_NAME STREQUAL "Linux")
message(STATUS "Using Mold linker: ${MOLD_PATH}")
set(CMAKE_LINKER_TYPE MOLD)
endif ()

View File

@@ -1,82 +0,0 @@
set(COMPILER_FLAGS
-pedantic
-Wall
-Wcast-align
-Wdouble-promotion
-Werror
-Wextra
-Wformat=2
-Wimplicit-fallthrough
-Wmisleading-indentation
-Wno-dangling-else
-Wno-deprecated-declarations
-Wno-narrowing
-Wno-unused-but-set-variable
-Wnon-virtual-dtor
-Wnull-dereference
-Wold-style-cast
-Wpedantic
-Wunreachable-code
-Wunused
# FIXME: The following bunch are needed for gcc12 atm.
-Wno-missing-requires
-Wno-restrict
-Wno-null-dereference
-Wno-maybe-uninitialized
-Wno-unknown-warning-option # and this to work with clang
# TODO: Address these and others in https://github.com/XRPLF/clio/issues/1273
)
# TODO: re-enable when we change CI #884 if (is_gcc AND NOT lint) list(APPEND COMPILER_FLAGS -Wduplicated-branches
# -Wduplicated-cond -Wlogical-op -Wuseless-cast ) endif ()
if (is_clang)
list(APPEND COMPILER_FLAGS -Wshadow # gcc is too aggressive with shadowing
# https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78147
)
endif ()
if (is_appleclang)
list(APPEND COMPILER_FLAGS -Wreorder-init-list)
endif ()
if (san)
# When building with sanitizers some compilers will actually produce extra warnings/errors. We don't want this yet, at
# least not until we have fixed all runtime issues reported by the sanitizers. Once that is done we can start removing
# some of these and trying to fix it in our codebase. We can never remove all of below because most of them are
# reported from deep inside libraries like boost or libxrpl.
#
# TODO: Address in https://github.com/XRPLF/clio/issues/1885
list(
APPEND
COMPILER_FLAGS
-Wno-error=tsan # Disables treating TSAN warnings as errors
-Wno-tsan # Disables TSAN warnings (thread-safety analysis)
-Wno-uninitialized # Disables warnings about uninitialized variables (AddressSanitizer, UndefinedBehaviorSanitizer,
# etc.)
-Wno-stringop-overflow # Disables warnings about potential string operation overflows (AddressSanitizer)
-Wno-unsafe-buffer-usage # Disables warnings about unsafe memory operations (AddressSanitizer)
-Wno-frame-larger-than # Disables warnings about stack frame size being too large (AddressSanitizer)
-Wno-unused-function # Disables warnings about unused functions (LeakSanitizer, memory-related issues)
-Wno-unused-but-set-variable # Disables warnings about unused variables (MemorySanitizer)
-Wno-thread-safety-analysis # Disables warnings related to thread safety usage (ThreadSanitizer)
-Wno-thread-safety # Disables warnings related to thread safety usage (ThreadSanitizer)
-Wno-sign-compare # Disables warnings about signed/unsigned comparison (UndefinedBehaviorSanitizer)
-Wno-nonnull # Disables warnings related to null pointer dereferencing (UndefinedBehaviorSanitizer)
-Wno-address # Disables warnings about address-related issues (UndefinedBehaviorSanitizer)
-Wno-array-bounds # Disables array bounds checks (UndefinedBehaviorSanitizer)
)
endif ()
# See https://github.com/cpp-best-practices/cppbestpractices/blob/master/02-Use_the_Tools_Available.md#gcc--clang for
# the flags description
if (time_trace)
if (is_clang OR is_appleclang)
list(APPEND COMPILER_FLAGS -ftime-trace)
else ()
message(FATAL_ERROR "Clang or AppleClang is required to use `-ftime-trace`")
endif ()
endif ()
target_compile_options(clio_options INTERFACE ${COMPILER_FLAGS})

View File

@@ -1,11 +0,0 @@
include(CheckIncludeFileCXX)
check_include_file_cxx("source_location" SOURCE_LOCATION_AVAILABLE)
if (SOURCE_LOCATION_AVAILABLE)
target_compile_definitions(clio_options INTERFACE "HAS_SOURCE_LOCATION")
endif ()
check_include_file_cxx("experimental/source_location" EXPERIMENTAL_SOURCE_LOCATION_AVAILABLE)
if (EXPERIMENTAL_SOURCE_LOCATION_AVAILABLE)
target_compile_definitions(clio_options INTERFACE "HAS_EXPERIMENTAL_SOURCE_LOCATION")
endif ()

View File

@@ -1,4 +0,0 @@
set(Boost_USE_STATIC_LIBS ON)
set(Boost_USE_STATIC_RUNTIME ON)
find_package(Boost 1.82 REQUIRED CONFIG COMPONENTS program_options coroutine system log log_setup)

View File

@@ -1,3 +0,0 @@
find_package(OpenSSL 1.1.1 REQUIRED CONFIG)
set_target_properties(OpenSSL::SSL PROPERTIES INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2)

View File

@@ -1,2 +0,0 @@
set(THREADS_PREFER_PTHREAD_FLAG ON)
find_package(Threads REQUIRED)

View File

@@ -1 +0,0 @@
find_package(cassandra-cpp-driver REQUIRED CONFIG)

View File

@@ -1 +0,0 @@
find_package(benchmark REQUIRED CONFIG)

View File

@@ -1,4 +0,0 @@
find_package(GTest REQUIRED)
enable_testing()
include(GoogleTest)

View File

@@ -1,11 +0,0 @@
if ("${san}" STREQUAL "")
target_compile_definitions(clio_options INTERFACE BOOST_STACKTRACE_LINK)
target_compile_definitions(clio_options INTERFACE BOOST_STACKTRACE_USE_BACKTRACE)
find_package(libbacktrace REQUIRED CONFIG)
else ()
# Some sanitizers (TSAN and ASAN for sure) can't be used with libbacktrace because they have their own backtracing
# capabilities and there are conflicts. In any case, this makes sure Clio code knows that backtrace is not available.
# See relevant conan profiles for sanitizers where we disable stacktrace in Boost explicitly.
target_compile_definitions(clio_options INTERFACE CLIO_WITHOUT_STACKTRACE)
message(STATUS "Sanitizer enabled, disabling stacktrace")
endif ()

View File

@@ -1 +0,0 @@
find_package(fmt REQUIRED CONFIG)

View File

@@ -1 +0,0 @@
find_package(xrpl REQUIRED CONFIG)

View File

@@ -1,5 +0,0 @@
find_package(spdlog REQUIRED)
if (NOT TARGET spdlog::spdlog)
message(FATAL_ERROR "spdlog::spdlog target not found")
endif ()

View File

@@ -1,13 +0,0 @@
set(CLIO_INSTALL_DIR "/opt/clio")
set(CMAKE_INSTALL_PREFIX "${CLIO_INSTALL_DIR}" CACHE PATH "Install prefix" FORCE)
set(CPACK_PACKAGING_INSTALL_PREFIX "${CMAKE_INSTALL_PREFIX}")
include(GNUInstallDirs)
install(TARGETS clio_server DESTINATION "${CMAKE_INSTALL_BINDIR}")
file(READ docs/examples/config/example-config.json config)
string(REGEX REPLACE "./clio_log" "/var/log/clio/" config "${config}")
file(WRITE ${CMAKE_BINARY_DIR}/install-config.json "${config}")
install(FILES ${CMAKE_BINARY_DIR}/install-config.json DESTINATION etc RENAME config.json)
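
The install step above rewrites the example config so that logs land in the system location rather than the in-tree `./clio_log` directory. The same transformation, sketched in Python purely for illustration (paths taken from the CMake above; a plain string replace is used where the CMake uses REGEX REPLACE):

```python
from pathlib import Path

# Read the shipped example config, repoint the log directory at /var/log/clio/,
# and write the result out as install-config.json.
config = Path("docs/examples/config/example-config.json").read_text()
Path("install-config.json").write_text(config.replace("./clio_log", "/var/log/clio/"))
```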

View File

@@ -1,12 +0,0 @@
set(CPACK_GENERATOR "DEB")
set(CPACK_DEBIAN_PACKAGE_HOMEPAGE "https://github.com/XRPLF/clio")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "Ripple Labs Inc. <support@ripple.com>")
set(CPACK_DEBIAN_FILE_NAME DEB-DEFAULT)
set(CPACK_DEBIAN_PACKAGE_SHLIBDEPS ON)
set(CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA ${CMAKE_SOURCE_DIR}/cmake/pkg/postinst)
# We must replace "-" with "~" otherwise dpkg will sort "X.Y.Z-b1" as greater than "X.Y.Z"
string(REPLACE "-" "~" git "${CPACK_PACKAGE_VERSION}")
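
The "~" substitution matters because of Debian version ordering: "~" sorts before anything, even the end of the string, so a pre-release package stays below the final release. A tiny illustrative snippet (not part of the packaging):

```python
# Under dpkg ordering "1.0.4-rc1" would sort *above* "1.0.4", which is wrong for a
# release candidate. After the substitution, "1.0.4~rc1" sorts *below* "1.0.4",
# because "~" compares lower than the end of the version string.
version = "1.0.4-rc1"
deb_version = version.replace("-", "~")
print(deb_version)  # 1.0.4~rc1
```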

View File

@@ -1,46 +0,0 @@
#!/bin/sh
set -e
USER_NAME=clio
GROUP_NAME="${USER_NAME}"
CLIO_EXECUTABLE="clio_server"
CLIO_PREFIX="/opt/clio"
CLIO_BIN="$CLIO_PREFIX/bin/${CLIO_EXECUTABLE}"
CLIO_CONFIG="$CLIO_PREFIX/etc/config.json"
case "$1" in
configure)
if ! id -u "$USER_NAME" >/dev/null 2>&1; then
# Users who should not have a home directory should have their home directory set to /nonexistent
# https://www.debian.org/doc/debian-policy/ch-opersys.html#non-existent-home-directories
useradd \
--system \
--home-dir /nonexistent \
--no-create-home \
--shell /usr/sbin/nologin \
--comment "system user for ${CLIO_EXECUTABLE}" \
--user-group \
${USER_NAME}
fi
install -d -o "$USER_NAME" -g "$GROUP_NAME" /var/log/clio
if [ -f "$CLIO_CONFIG" ]; then
chown "$USER_NAME:$GROUP_NAME" "$CLIO_CONFIG"
fi
chown -R "$USER_NAME:$GROUP_NAME" "$CLIO_PREFIX"
ln -sf "$CLIO_BIN" "/usr/bin/${CLIO_EXECUTABLE}"
;;
abort-upgrade|abort-remove|abort-deconfigure)
;;
*)
echo "postinst called with unknown argument \`$1'" >&2
exit 1
;;
esac
exit 0

View File

@@ -1,58 +0,0 @@
{
"version": "0.5",
"requires": [
"zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1756234269.497",
"xxhash/0.8.3#681d36a0a6111fc56e5e45ea182c19cc%1756234289.683",
"xrpl/2.6.1#973af2bf9631f239941dd9f5a100bb84%1759275059.342",
"sqlite3/3.49.1#8631739a4c9b93bd3d6b753bac548a63%1756234266.869",
"spdlog/1.15.3#3ca0e9e6b83af4d0151e26541d140c86%1754401846.61",
"soci/4.0.3#a9f8d773cd33e356b5879a4b0564f287%1756234262.318",
"re2/20230301#dfd6e2bf050eb90ddd8729cfb4c844a4%1756234257.976",
"rapidjson/cci.20220822#1b9d8c2256876a154172dc5cfbe447c6%1754325007.656",
"protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1756234251.614",
"openssl/1.1.1w#a8f0792d7c5121b954578a7149d23e03%1756223730.729",
"nudb/2.0.9#c62cfd501e57055a7e0d8ee3d5e5427d%1756234237.107",
"minizip/1.2.13#9e87d57804bd372d6d1e32b1871517a3%1754325004.374",
"lz4/1.10.0#59fc63cac7f10fbe8e05c7e62c2f3504%1756234228.999",
"libuv/1.46.0#dc28c1f653fa197f00db5b577a6f6011%1754325003.592",
"libiconv/1.17#1e65319e945f2d31941a9d28cc13c058%1756223727.64",
"libbacktrace/cci.20210118#a7691bfccd8caaf66309df196790a5a1%1756230911.03",
"libarchive/3.8.1#5cf685686322e906cb42706ab7e099a8%1756234256.696",
"http_parser/2.9.4#98d91690d6fd021e9e624218a85d9d97%1754325001.385",
"gtest/1.14.0#f8f0757a574a8dd747d16af62d6eb1b7%1754325000.842",
"grpc/1.50.1#02291451d1e17200293a409410d1c4e1%1756234248.958",
"fmt/11.2.0#579bb2cdf4a7607621beea4eb4651e0f%1754324999.086",
"doctest/2.4.11#a4211dfc329a16ba9f280f9574025659%1756234220.819",
"date/3.0.4#f74bbba5a08fa388256688743136cb6f%1756234217.493",
"cassandra-cpp-driver/2.17.0#e50919efac8418c26be6671fd702540a%1754324997.363",
"c-ares/1.34.5#b78b91e7cfb1f11ce777a285bbf169c6%1756234217.915",
"bzip2/1.0.8#00b4a4658791c1f06914e087f0e792f5%1756234261.716",
"boost/1.83.0#5d975011d65b51abb2d2f6eb8386b368%1754325043.336",
"benchmark/1.9.4#ce4403f7a24d3e1f907cd9da4b678be4%1754578869.672",
"abseil/20230802.1#f0f91485b111dc9837a68972cb19ca7b%1756234220.907"
],
"build_requires": [
"zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1756234269.497",
"protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1756234251.614",
"cmake/3.31.8#dde3bde00bb843687e55aea5afa0e220%1756234232.89",
"b2/5.3.3#107c15377719889654eb9a162a673975%1756234226.28"
],
"python_requires": [],
"overrides": {
"boost/1.83.0": [
null,
"boost/1.83.0#5d975011d65b51abb2d2f6eb8386b368"
],
"protobuf/3.21.12": [
null,
"protobuf/3.21.12"
],
"lz4/1.9.4": [
"lz4/1.10.0"
],
"sqlite3/3.44.2": [
"sqlite3/3.49.1"
]
},
"config_requires": []
}

View File

@@ -1,74 +0,0 @@
from conan import ConanFile
from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout
class ClioConan(ConanFile):
name = 'clio'
license = 'ISC'
author = 'Alex Kremer <akremer@ripple.com>, John Freeman <jfreeman@ripple.com>, Ayaz Salikhov <asalikhov@ripple.com>'
url = 'https://github.com/xrplf/clio'
description = 'Clio RPC server'
settings = 'os', 'compiler', 'build_type', 'arch'
options = {}
requires = [
'boost/1.83.0',
'cassandra-cpp-driver/2.17.0',
'fmt/11.2.0',
'protobuf/3.21.12',
'grpc/1.50.1',
'openssl/1.1.1w',
'xrpl/2.6.1',
'zlib/1.3.1',
'libbacktrace/cci.20210118',
'spdlog/1.15.3',
]
default_options = {
'xrpl/*:tests': False,
'xrpl/*:rocksdb': False,
'cassandra-cpp-driver/*:shared': False,
'date/*:header_only': True,
'grpc/*:shared': False,
'grpc/*:secure': True,
'libpq/*:shared': False,
'lz4/*:shared': False,
'openssl/*:shared': False,
'protobuf/*:shared': False,
'protobuf/*:with_zlib': True,
'snappy/*:shared': False,
'gtest/*:no_main': True,
}
exports_sources = (
'CMakeLists.txt', 'cmake/*', 'src/*'
)
def requirements(self):
self.requires('gtest/1.14.0')
self.requires('benchmark/1.9.4')
def configure(self):
if self.settings.compiler == 'apple-clang':
self.options['boost'].visibility = 'global'
def layout(self):
cmake_layout(self)
# Fix this setting to follow the default introduced in Conan 1.48
# to align with our build instructions.
self.folders.generators = 'build/generators'
generators = 'CMakeDeps'
def generate(self):
tc = CMakeToolchain(self)
tc.generate()
def build(self):
cmake = CMake(self)
cmake.configure()
cmake.build()
def package(self):
cmake = CMake(self)
cmake.install()

Some files were not shown because too many files have changed in this diff.