Compare commits


21 Commits

Author SHA1 Message Date
Ed Hennis
e5aa605742 Set version to 2.3.0-b2 2024-07-31 17:14:04 -04:00
Bronek Kozicki
8b181ed818 Merge branch 'master' into develop 2024-07-31 15:12:15 +01:00
Ed Hennis
f5a349558e docs: Document the process for merging pull requests (#5010) 2024-07-30 20:19:03 -04:00
Scott Schurr
b9b75ddcf5 Remove unused constants from resource/Fees.h (#4856) 2024-07-30 12:47:04 -04:00
Mayukha Vadari
a39720e94a fix: change error for invalid feature param in feature RPC (#5063)
* Returns an "Invalid parameters" error if the `feature` parameter is provided and is not a string.
2024-07-30 11:18:25 -04:00
Ed Hennis
2820feb02a Ensure levelization sorting is ASCII-order across platforms (#5072) 2024-07-29 19:02:12 -04:00
Ed Hennis
8fc805d2e2 fix: Fix NuDB build error via Conan patch (#5061)
* Includes updated instructions in BUILD.md.
2024-07-29 18:14:41 -04:00
yinyiqian1
d54151e7c4 Disallow filtering account_objects by unsupported types (#5056)
* `account_objects` returns an invalid field error if `type` is not supported.
  This includes objects an account can't own, or which are unsupported by `account_objects`
* Includes:
  * Amendments
  * Directory Node
  * Fee Settings
  * Ledger Hashes
  * Negative UNL
2024-07-29 16:30:02 -04:00
Scott Schurr
21a0a64648 chore: Add comments to SignerEntries.h (#5059) 2024-07-25 17:12:59 -04:00
Scott Schurr
20707fac4a chore: Rename two files from Directory* to Dir*: (#5058)
The names of the files should reflect the name of the Dir class.

Co-authored-by: Zack Brunson <Zshooter@gmail.com>
Co-authored-by: Ed Hennis <ed@ripple.com>
2024-07-25 14:50:27 -04:00
Ed Hennis
e6ef0fc26c Set version to 2.2.1 2024-07-24 19:37:40 -04:00
John Freeman
c157816017 Use error codes throughout fast Base58 implementation 2024-07-24 19:37:34 -04:00
Mayukha Vadari
eba5d19377 Improve error handling in some RPC commands 2024-07-24 19:25:40 -04:00
Denis Angell
ad14d09a2b Update BUILD.md after PR #5052 (#5067)
* Document the need to specify "xrpld" and "tests" to build and test rippled.
2024-07-23 16:49:16 -04:00
John Freeman
f3bcc651c7 Add xrpld build option and Conan package test (#5052)
* Make xrpld target optional

* Add job to test Conan recipe

* [fold] address review comments

* [fold] Enable tests in workflows

* [fold] Rename with_xrpld option

* [fold] Fix grep expression
2024-07-11 15:04:30 -07:00
dashangcun
e8602b81fa chore: remove repeat words (#5053)
Signed-off-by: dashangcun <jchaodaohang@foxmail.com>
Co-authored-by: dashangcun <jchaodaohang@foxmail.com>
Co-authored-by: Zack Brunson <Zshooter@gmail.com>
2024-07-09 14:05:14 -07:00
yinyiqian1
0f32109993 fix CTID in tx command returns invalidParams on lowercase hex (#5049)
* fix CTID in tx command returns invalidParams on lowercase hex

* test mixed case and change auto to explicit type

* add header cctype because std::tolower is called

* remove unused local variable

* change test case comment from 'lowercase' to 'mixed case'

---------

Co-authored-by: Zack Brunson <Zshooter@gmail.com>
2024-07-05 11:10:54 -07:00
Ed Hennis
a17ccca615 Invariant: prevent a deleted account from leaving (most) artifacts on the ledger. (#4663)
* Add feature / amendment "InvariantsV1_1"

* Adds invariant AccountRootsDeletedClean:

* Checks that a deleted account doesn't leave any directly
  accessible artifacts behind.
* Always tests, but only changes the transaction result if
  featureInvariantsV1_1 is enabled.
* Unit tests.

* Resolves #4638

* [FOLD] Review feedback from @gregtatcam:

* Fix unused variable warning
* Improve Invariant test const correctness

* [FOLD] Review feedback from @mvadari:

* Centralize the account keylet function list, and some optimization

* [FOLD] Some structured binding doesn't work in clang

* [FOLD] Review feedback 2 from @mvadari:

* Clean up and clarify some comments.

* [FOLD] Change InvariantsV1_1 to unsupported

* Will allow multiple PRs to be merged over time using the same amendment.

* fixup! [FOLD] Change InvariantsV1_1 to unsupported

* [FOLD] Update and clarify some comments. No code changes.

* Move CMake directory

* Rearrange sources

* Rewrite includes

* Recompute loops

* Fix merge issue and formatting

---------

Co-authored-by: Pretty Printer <cpp@ripple.com>
2024-07-05 10:27:15 -07:00
Bronek Kozicki
7a1b238035 Bump codecov plugin version to version 4.5.0 (#5055)
This version includes fix https://github.com/codecov/codecov-action/pull/1471
which should end the codecov upload errors due to throttling.
2024-07-02 12:42:56 -07:00
yinyiqian1
e1534a3200 fix "account_nfts" with unassociated marker returning issue (#5045)
* fix "account_nfts" with unassociated marker returning issue

* create unit test for fixing nft page invalid marker not returning error

add more test

change test name

create unit test

* fix "account_nfts" with unassociated marker returning issue

* fix "account_nfts" with unassociated marker returning issue

* fix "account_nfts" with unassociated marker returning issue

* fix "account_nfts" with unassociated marker returning issue

* fix "account_nfts" with unassociated marker returning issue

* fix "account_nfts" with unassociated marker returning issue

* fix "account_nfts" with unassociated marker returning issue

* fix "account_nfts" with unassociated marker returning issue

* [FOLD] accumulated review suggestions

* move BEAST check out of lambda function

---------

Authored-by: Scott Schurr <scott@ripple.com>
2024-07-02 11:58:03 -07:00
Scott Schurr
9fec615dca fixInnerObjTemplate2 amendment (#5047)
* fixInnerObjTemplate2 amendment:

Apply inner object templates to all remaining (non-AMM)
inner objects.

Adds a unit test for applying the template to sfMajorities.
Other remaining inner objects showed no problems having
templates applied.

* Move CMake directory

* Rearrange sources

* Rewrite includes

* Recompute loops

---------

Co-authored-by: Pretty Printer <cpp@ripple.com>
2024-06-27 11:52:02 -07:00
76 changed files with 1914 additions and 457 deletions


@@ -20,6 +20,8 @@ runs:
${{ inputs.generator && format('-G "{0}"', inputs.generator) || '' }} \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${{ inputs.configuration }} \
-Dtests=TRUE \
-Dxrpld=TRUE \
${{ inputs.cmake-args }} \
..
- name: build


@@ -50,6 +50,8 @@ runs:
conan install \
--output-folder . \
--build missing \
--options tests=True \
--options xrpld=True \
--settings build_type=${{ inputs.configuration }} \
..
- name: upload dependencies to remote


@@ -222,7 +222,7 @@ jobs:
- name: upload coverage report
uses: wandalen/wretry.action@v1.4.10
with:
action: codecov/codecov-action@v4.3.0
action: codecov/codecov-action@v4.5.0
with: |
files: coverage.xml
fail_ci_if_error: true
@@ -232,3 +232,53 @@ jobs:
token: ${{ secrets.CODECOV_TOKEN }}
attempt_limit: 5
attempt_delay: 210000 # in milliseconds
conan:
needs: dependencies
runs-on: [self-hosted, heavy]
container: rippleci/rippled-build-ubuntu:aaf5e3e
env:
build_dir: .build
configuration: Release
steps:
- name: download cache
uses: actions/download-artifact@v3
with:
name: linux-gcc-${{ env.configuration }}
- name: extract cache
run: |
mkdir -p ~/.conan
tar -xzf conan.tar -C ~/.conan
- name: check environment
run: |
env | sort
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
- name: checkout
uses: actions/checkout@v4
- name: dependencies
uses: ./.github/actions/dependencies
env:
CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
with:
configuration: ${{ env.configuration }}
- name: export
run: |
version=$(conan inspect --raw version .)
reference="xrpl/${version}@local/test"
conan remove -f ${reference} || true
conan export . local/test
echo "reference=${reference}" >> "${GITHUB_ENV}"
- name: build
run: |
cd examples/example
mkdir ${build_dir}
cd ${build_dir}
conan install .. --output-folder . \
--require-override ${reference} --build missing
cmake .. \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=./build/${configuration}/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${configuration}
cmake --build .
./example | grep '^[[:digit:]]\+\.[[:digit:]]\+\.[[:digit:]]\+'


@@ -98,6 +98,12 @@ Update the compiler settings:
conan profile update settings.compiler.cppstd=20 default
```
Configure Conan (1.x only) to use recipe revisions:
```
conan config set general.revisions_enabled=1
```
**Linux** developers will commonly have a default Conan [profile][] that compiles
with GCC and links with libstdc++.
If you are linking with libstdc++ (see profile setting `compiler.libcxx`),
@@ -187,6 +193,17 @@ It patches their CMake to correctly import its dependencies.
conan export --version 4.0.3 external/soci
```
Export our [Conan recipe for NuDB](./external/nudb).
It fixes some source files to add missing `#include`s.
```
# Conan 1.x
conan export external/nudb nudb/2.0.8@
# Conan 2.x
conan export --version 2.0.8 external/nudb
```
### Build and Test
1. Create a build directory and move into it.
@@ -242,7 +259,7 @@ It patches their CMake to correctly import its dependencies.
Single-config generators:
```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release -Dxrpld=ON -Dtests=ON ..
```
Pass the CMake variable [`CMAKE_BUILD_TYPE`][build_type]
@@ -252,7 +269,7 @@ It patches their CMake to correctly import its dependencies.
Multi-config generators:
```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -Dxrpld=ON -Dtests=ON ..
```
**Note:** You can pass build options for `rippled` in this step.
@@ -343,7 +360,7 @@ Example use with some cmake variables set:
```
cd .build
conan install .. --output-folder . --build missing --settings build_type=Debug
cmake -DCMAKE_BUILD_TYPE=Debug -Dcoverage=ON -Dcoverage_test_parallelism=2 -Dcoverage_format=html-details -Dcoverage_extra_args="--json coverage.json" -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
cmake -DCMAKE_BUILD_TYPE=Debug -Dcoverage=ON -Dxrpld=ON -Dtests=ON -Dcoverage_test_parallelism=2 -Dcoverage_format=html-details -Dcoverage_extra_args="--json coverage.json" -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
cmake --build . --target coverage
```


@@ -13,6 +13,9 @@ then
git clean -ix
fi
# Ensure all sorting is ASCII-order consistently across platforms.
export LANG=C
rm -rfv results
mkdir results
includes="$( pwd )/results/rawincludes.txt"


@@ -73,27 +73,235 @@ The source must be formatted according to the style guide below.
Header includes must be [levelized](./Builds/levelization).
Changes should usually be squashed down into a single commit.
Some larger or more complicated change sets make more sense,
and are easier to review, if organized into multiple logical commits.
Either way, all commits should fit the following criteria:
* Changes should be presented in a single commit or a logical
sequence of commits.
Specifically, chronological commits that simply
reflect the history of how the author implemented
the change, "warts and all", are not useful to
reviewers.
* Every commit should have a [good message](#good-commit-messages)
to explain a specific aspect of the change.
* Every commit should be signed.
* Every commit should be well-formed (builds successfully,
unit tests passing), as this helps to resolve merge
conflicts, and makes it easier to use `git bisect`
to find bugs.
### Good commit messages
Refer to
["How to Write a Git Commit Message"](https://cbea.ms/git-commit/)
for general rules on writing a good commit message.
In addition to those guidelines, please add one of the following
prefixes to the subject line if appropriate.
* `fix:` - The primary purpose is to fix an existing bug.
* `perf:` - The primary purpose is performance improvements.
* `refactor:` - The changes refactor code without affecting
functionality.
* `test:` - The changes _only_ affect unit tests.
* `docs:` - The changes _only_ affect documentation. This can
include code comments in addition to `.md` files like this one.
* `build:` - The changes _only_ affect the build process,
including CMake and/or Conan settings.
* `chore:` - Other tasks that don't affect the binary, but don't fit
any of the other cases. e.g. formatting, git settings, updating
Github Actions jobs.
Whenever possible, when updating commits after the PR is open, please
add the PR number to the end of the subject line. e.g. `test: Add
unit tests for Feature X (#1234)`.
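As a rough sketch (the helper name and subject lines here are made up, not part of any official tooling), a simple `grep` can check whether a subject line uses one of the documented prefixes:

```shell
# Hypothetical helper: succeeds if a commit subject starts with one of the
# documented prefixes (fix:, perf:, refactor:, test:, docs:, build:, chore:).
check_subject() {
  printf '%s\n' "$1" | grep -Eq '^(fix|perf|refactor|test|docs|build|chore): '
}

check_subject "test: Add unit tests for Feature X (#1234)" && echo "prefixed"
check_subject "Added some stuff" || echo "no prefix"
```

In practice you would feed it subjects from `git log --pretty=%s` on your branch.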
## Pull requests
In general, pull requests use `develop` as the base branch.
(Hotfixes are an exception.)
Changes to pull requests must be added as new commits.
Once code reviewers have started looking at your code, please avoid
force-pushing a branch in a pull request.
If your changes are not quite ready, but you want to make them easily available
for preliminary examination or review, you can create a "Draft" pull request.
While a pull request is marked as a "Draft", you can rebase or reorganize the
commits in the pull request as desired.
Github pull requests are created as "Ready" by default, or you can mark
a "Draft" pull request as "Ready".
Once a pull request is marked as "Ready",
any changes must be added as new commits. Do not
force-push to a branch in a pull request under review.
(This includes rebasing your branch onto the updated base branch.
Use a merge operation instead, or hit the "Update branch" button
at the bottom of the Github PR page.)
This preserves the ability for reviewers to filter changes since their last
review.
A pull request must obtain **approvals from at least two reviewers** before it
can be considered for merge by a Maintainer.
Maintainers retain discretion to require more approvals if they feel the
credibility of the existing approvals is insufficient.
Pull requests must be merged by [squash-and-merge][2]
to preserve a linear history for the `develop` branch.
### When and how to merge pull requests
#### "Passed"
A pull request should only have the "Passed" label added when it
meets a few criteria:
1. It must have two approving reviews [as described
above](#pull-requests). (Exception: PRs that are deemed "trivial"
only need one approval.)
2. All CI checks must be complete and passed. (One-off failures may
be acceptable if they are related to a known issue.)
3. The PR must have a [good commit message](#good-commit-messages).
* If the PR started with a good commit message, and it doesn't
need to be updated, the author can indicate that in a comment.
* Any contributor, preferably the author, can leave a comment
suggesting a commit message.
* If the author squashes and rebases the code in preparation for
merge, they should also ensure the commit message(s) are updated
as well.
4. The PR branch must be up to date with the base branch (usually
`develop`). This is usually accomplished by merging the base branch
into the feature branch, but if the other criteria are met, the
changes can be squashed and rebased on top of the base branch.
5. Finally, and most importantly, the author of the PR must
positively indicate that the PR is ready to merge. That can be
accomplished by adding the "Passed" label if their role allows,
or by leaving a comment to the effect that the PR is ready to
merge.
Once the "Passed" label is added, a maintainer may merge the PR at
any time, so don't use it lightly.
#### Instructions for maintainers
The maintainer should double-check that the PR has met all the
necessary criteria, and can request additional information from the
owner, or additional reviews, and can always feel free to remove the
"Passed" label if appropriate. The maintainer has final say on
whether a PR gets merged, and is encouraged to communicate any
issues or concerns to other maintainers.
##### Most pull requests: "Squash and merge"
Most pull requests don't need special handling, and can simply be
merged using the "Squash and merge" button on the Github UI. Update
the suggested commit message if necessary.
##### Slightly more complicated pull requests
Some pull requests need to be pushed to `develop` as more than one
commit. There are multiple ways to accomplish this. If the author
describes a process, and it is reasonable, follow it. Otherwise, do
a fast forward only merge (`--ff-only`) on the command line and push.
Either way, check that:
* The commits are based on the current tip of `develop`.
* The commits are clean: No merge commits (except when reverse
merging), no "[FOLD]" or "fixup!" messages.
* All commits are signed. If the commits are not signed by the author, use
`git commit --amend -S` to sign them yourself.
* At least one (but preferably all) of the commits has the PR number
in the commit message.
**Never use the "Create a merge commit" or "Rebase and merge"
functions!**
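Part of the "commits are clean" check can be mechanized. As an informal sketch (not an official script; the sample subjects are illustrative), a `grep` over the branch's subject lines flags leftover review fixups:

```shell
# Illustrative subjects; in practice you would feed in
# `git log --pretty=%s develop..HEAD` from the PR branch.
subjects='fix: Handle lowercase CTID hex (#5049)
[FOLD] address review comments
fixup! Change InvariantsV1_1 to unsupported'

# Print any subject that marks an unsquashed review fixup.
printf '%s\n' "$subjects" | grep -E '^\[FOLD\]|^fixup!'
```

Any output means the branch still needs to be squashed or reworded before merging.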
##### Releases, release candidates, and betas
All releases, including release candidates and betas, are handled
differently from typical PRs. Most importantly, never use
the Github UI to merge a release.
1. There are two possible conditions that the `develop` branch will
be in when preparing a release.
1. Ready or almost ready to go: There may be one or two PRs that
need to be merged, but otherwise, the only change needed is to
update the version number in `BuildInfo.cpp`. In this case,
merge those PRs as appropriate, updating the second one, and
waiting for CI to finish in between. Then update
`BuildInfo.cpp`.
2. Several pending PRs: In this case, do not use the Github UI,
because the delays waiting for CI in between each merge will be
unnecessarily onerous. Instead, create a working branch (e.g.
`develop-next`) based off of `develop`. Squash the changes
from each PR onto the branch, one commit each (unless
more are needed), being sure to sign each commit and update
the commit message to include the PR number. You may be able
to use a fast-forward merge for the first PR. The workflow may
look something like:
```
git fetch upstream
git checkout upstream/develop
git checkout -b develop-next
# Use -S on the ff-only merge if prbranch1 isn't signed.
# Or do another branch first.
git merge --ff-only user1/prbranch1
git merge --squash user2/prbranch2
git commit -S
git merge --squash user3/prbranch3
git commit -S
[...]
git push --set-upstream origin develop-next
```
2. Create the Pull Request with `release` as the base branch. If any
of the included PRs are still open,
[use closing keywords](https://docs.github.com/articles/closing-issues-using-keywords)
in the description to ensure they are closed when the code is
released. e.g. "Closes #1234"
3. Instead of the default template, reuse and update the message from
the previous release. Include the following verbiage somewhere in
the description:
```
The base branch is release. All releases (including betas) go in
release. This PR will be merged with --ff-only (not squashed or
rebased, and not using the GitHub UI) to both release and develop.
```
4. Sign-offs for the three platforms usually occur offline, but at
least one approval will be needed on the PR.
5. Once everything is ready to go, open a terminal, and do the
fast-forward merges manually. Do not push any branches until you
verify that all of them update correctly.
```
git fetch upstream
git checkout -b upstream--develop -t upstream/develop || git checkout upstream--develop
git reset --hard upstream/develop
# develop-next must be signed already!
git merge --ff-only origin/develop-next
git checkout -b upstream--release -t upstream/release || git checkout upstream--release
git reset --hard upstream/release
git merge --ff-only origin/develop-next
# Only do these 3 steps if pushing a release. No betas or RCs
git checkout -b upstream--master -t upstream/master || git checkout upstream--master
git reset --hard upstream/master
git merge --ff-only origin/develop-next
# Check that all of the branches are updated
git log -1 --oneline
# The output should look like:
# 02ec8b7962 (HEAD -> upstream--master, origin/develop-next, upstream--release, upstream--develop, develop-next) Set version to 2.2.0-rc1
# Note that all of the upstream--develop/release/master are on this commit.
# (Master will be missing for betas, etc.)
# Just to be safe, do a dry run first:
git push --dry-run upstream-push HEAD:develop
git push --dry-run upstream-push HEAD:release
# git push --dry-run upstream-push HEAD:master
# Now push
git push upstream-push HEAD:develop
git push upstream-push HEAD:release
# git push upstream-push HEAD:master
# Don't forget to tag the release, too.
git tag <version number>
git push upstream-push <version number>
```
6. Finally
[create a new release on Github](https://github.com/XRPLF/rippled/releases).
# Style guide


@@ -91,56 +91,59 @@ target_link_libraries(xrpl.libxrpl
xxHash::xxhash
)
add_executable(rippled)
if(unity)
set_target_properties(rippled PROPERTIES UNITY_BUILD ON)
endif()
if(tests)
target_compile_definitions(rippled PUBLIC ENABLE_TESTS)
endif()
target_include_directories(rippled
PRIVATE
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
)
if(xrpld)
add_executable(rippled)
if(unity)
set_target_properties(rippled PROPERTIES UNITY_BUILD ON)
endif()
if(tests)
target_compile_definitions(rippled PUBLIC ENABLE_TESTS)
endif()
target_include_directories(rippled
PRIVATE
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
)
file(GLOB_RECURSE sources CONFIGURE_DEPENDS
"${CMAKE_CURRENT_SOURCE_DIR}/src/xrpld/*.cpp"
)
target_sources(rippled PRIVATE ${sources})
if(tests)
file(GLOB_RECURSE sources CONFIGURE_DEPENDS
"${CMAKE_CURRENT_SOURCE_DIR}/src/test/*.cpp"
"${CMAKE_CURRENT_SOURCE_DIR}/src/xrpld/*.cpp"
)
target_sources(rippled PRIVATE ${sources})
endif()
target_link_libraries(rippled
Ripple::boost
Ripple::opts
Ripple::libs
xrpl.libxrpl
)
exclude_if_included(rippled)
# define a macro for tests that might need to
# be exluded or run differently in CI environment
if(is_ci)
target_compile_definitions(rippled PRIVATE RIPPLED_RUNNING_IN_CI)
endif ()
if(tests)
file(GLOB_RECURSE sources CONFIGURE_DEPENDS
"${CMAKE_CURRENT_SOURCE_DIR}/src/test/*.cpp"
)
target_sources(rippled PRIVATE ${sources})
endif()
if(reporting)
set(suffix -reporting)
set_target_properties(rippled PROPERTIES OUTPUT_NAME rippled-reporting)
get_target_property(BIN_NAME rippled OUTPUT_NAME)
message(STATUS "Reporting mode build: rippled renamed ${BIN_NAME}")
target_compile_definitions(rippled PRIVATE RIPPLED_REPORTING)
endif()
target_link_libraries(rippled
Ripple::boost
Ripple::opts
Ripple::libs
xrpl.libxrpl
)
exclude_if_included(rippled)
# define a macro for tests that might need to
# be exluded or run differently in CI environment
if(is_ci)
target_compile_definitions(rippled PRIVATE RIPPLED_RUNNING_IN_CI)
endif ()
# any files that don't play well with unity should be added here
if(tests)
set_source_files_properties(
# these two seem to produce conflicts in beast teardown template methods
src/test/rpc/ValidatorRPC_test.cpp
src/test/rpc/ShardArchiveHandler_test.cpp
PROPERTIES SKIP_UNITY_BUILD_INCLUSION TRUE)
if(reporting)
set(suffix -reporting)
set_target_properties(rippled PROPERTIES OUTPUT_NAME rippled-reporting)
get_target_property(BIN_NAME rippled OUTPUT_NAME)
message(STATUS "Reporting mode build: rippled renamed ${BIN_NAME}")
target_compile_definitions(rippled PRIVATE RIPPLED_REPORTING)
endif()
# any files that don't play well with unity should be added here
if(tests)
set_source_files_properties(
# these two seem to produce conflicts in beast teardown template methods
src/test/rpc/ValidatorRPC_test.cpp
src/test/rpc/ShardArchiveHandler_test.cpp
src/test/ledger/Invariants_test.cpp
PROPERTIES SKIP_UNITY_BUILD_INCLUSION TRUE)
endif()
endif()


@@ -21,6 +21,13 @@ install(
DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}"
)
if(NOT WIN32)
install(
CODE "file(CREATE_LINK xrpl \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}/ripple SYMBOLIC)"
)
endif()
install (EXPORT RippleExports
FILE RippleTargets.cmake
NAMESPACE Ripple::
@@ -31,14 +38,9 @@ write_basic_package_version_file (
VERSION ${rippled_version}
COMPATIBILITY SameMajorVersion)
if (is_root_project)
if (is_root_project AND TARGET rippled)
install (TARGETS rippled RUNTIME DESTINATION bin)
set_target_properties(rippled PROPERTIES INSTALL_RPATH_USE_LINK_PATH ON)
install (
FILES
${CMAKE_CURRENT_SOURCE_DIR}/cmake/RippleConfig.cmake
${CMAKE_CURRENT_BINARY_DIR}/RippleConfigVersion.cmake
DESTINATION lib/cmake/ripple)
# sample configs should not overwrite existing files
# install if-not-exists workaround as suggested by
# https://cmake.org/Bug/view.php?id=12646
@@ -53,18 +55,16 @@ if (is_root_project)
copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/rippled-example.cfg\" etc rippled.cfg)
copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/validators-example.txt\" etc validators.txt)
")
if(NOT WIN32)
install(
CODE "file(CREATE_LINK rippled${suffix} \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_BINDIR}/xrpld${suffix} SYMBOLIC)"
)
endif()
endif ()
if(NOT WIN32)
install(
CODE "file(CREATE_LINK xrpl \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}/ripple SYMBOLIC)"
)
endif()
if(NOT WIN32)
install(
CODE "file(CREATE_LINK rippled${suffix} \
\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_BINDIR}/xrpld${suffix} SYMBOLIC)"
)
endif()
install (
FILES
${CMAKE_CURRENT_SOURCE_DIR}/cmake/RippleConfig.cmake
${CMAKE_CURRENT_BINARY_DIR}/RippleConfigVersion.cmake
DESTINATION lib/cmake/ripple)


@@ -8,6 +8,8 @@ ProcessorCount(PROCESSOR_COUNT)
option(assert "Enables asserts, even in release builds" OFF)
option(xrpld "Build xrpld" ON)
option(reporting "Build rippled with reporting mode enabled" OFF)
option(tests "Build tests" ON)


@@ -21,6 +21,7 @@ class Xrpl(ConanFile):
'static': [True, False],
'tests': [True, False],
'unity': [True, False],
'xrpld': [True, False],
}
requires = [
@@ -47,8 +48,9 @@ class Xrpl(ConanFile):
'rocksdb': True,
'shared': False,
'static': True,
'tests': True,
'tests': False,
'unity': False,
'xrpld': False,
'cassandra-cpp-driver/*:shared': False,
'cassandra-cpp-driver/*:use_atomic': None,
@@ -142,6 +144,7 @@ class Xrpl(ConanFile):
tc.variables['BUILD_SHARED_LIBS'] = self.options.shared
tc.variables['static'] = self.options.static
tc.variables['unity'] = self.options.unity
tc.variables['xrpld'] = self.options.xrpld
tc.generate()
def build(self):


@@ -0,0 +1,16 @@
cmake_minimum_required(VERSION 3.21)
set(name example)
set(version 0.1.0)
project(
${name}
VERSION ${version}
LANGUAGES CXX
)
find_package(xrpl REQUIRED)
add_executable(example)
target_sources(example PRIVATE src/example.cpp)
target_link_libraries(example PRIVATE xrpl::libxrpl)


@@ -0,0 +1,59 @@
from conan import ConanFile, conan_version
from conan.tools.cmake import CMake, cmake_layout
class Example(ConanFile):
def set_name(self):
if self.name is None:
self.name = 'example'
def set_version(self):
if self.version is None:
self.version = '0.1.0'
license = 'ISC'
author = 'John Freeman <jfreeman08@gmail.com>'
settings = 'os', 'compiler', 'build_type', 'arch'
options = {'shared': [True, False], 'fPIC': [True, False]}
default_options = {
'shared': False,
'fPIC': True,
'xrpl:xrpld': False,
}
requires = ['xrpl/2.2.0-rc1@jfreeman/nodestore']
generators = ['CMakeDeps', 'CMakeToolchain']
exports_sources = [
'CMakeLists.txt',
'cmake/*',
'external/*',
'include/*',
'src/*',
]
# For out-of-source build.
# https://docs.conan.io/en/latest/reference/build_helpers/cmake.html#configure
no_copy_source = True
def layout(self):
cmake_layout(self)
def config_options(self):
if self.settings.os == 'Windows':
del self.options.fPIC
def build(self):
cmake = CMake(self)
cmake.configure(variables={'BUILD_TESTING': 'NO'})
cmake.build()
def package(self):
cmake = CMake(self)
cmake.install()
def package_info(self):
path = f'{self.package_folder}/share/{self.name}/cpp_info.py'
with open(path, 'r') as file:
exec(file.read(), {}, {'self': self.cpp_info})


@@ -0,0 +1,8 @@
#include <cstdio>
#include <xrpl/protocol/BuildInfo.h>
int main(int argc, char const** argv) {
std::printf("%s\n", ripple::BuildInfo::getVersionString().c_str());
return 0;
}

external/nudb/conandata.yml (vendored, new file, 10 lines)

@@ -0,0 +1,10 @@
sources:
"2.0.8":
url: "https://github.com/CPPAlliance/NuDB/archive/2.0.8.tar.gz"
sha256: "9b71903d8ba111cd893ab064b9a8b6ac4124ed8bd6b4f67250205bc43c7f13a8"
patches:
"2.0.8":
- patch_file: "patches/2.0.8-0001-add-include-stdexcept-for-msvc.patch"
patch_description: "Fix build for MSVC by including stdexcept"
patch_type: "portability"
patch_source: "https://github.com/cppalliance/NuDB/pull/100/files"

external/nudb/conanfile.py (vendored, new file, 72 lines)

@@ -0,0 +1,72 @@
import os
from conan import ConanFile
from conan.tools.build import check_min_cppstd
from conan.tools.files import apply_conandata_patches, copy, export_conandata_patches, get
from conan.tools.layout import basic_layout
required_conan_version = ">=1.52.0"
class NudbConan(ConanFile):
name = "nudb"
description = "A fast key/value insert-only database for SSD drives in C++11"
license = "BSL-1.0"
url = "https://github.com/conan-io/conan-center-index"
homepage = "https://github.com/CPPAlliance/NuDB"
topics = ("header-only", "KVS", "insert-only")
package_type = "header-library"
settings = "os", "arch", "compiler", "build_type"
no_copy_source = True
@property
def _min_cppstd(self):
return 11
def export_sources(self):
export_conandata_patches(self)
def layout(self):
basic_layout(self, src_folder="src")
def requirements(self):
self.requires("boost/1.83.0")
def package_id(self):
self.info.clear()
def validate(self):
if self.settings.compiler.cppstd:
check_min_cppstd(self, self._min_cppstd)
def source(self):
get(self, **self.conan_data["sources"][self.version], strip_root=True)
def build(self):
apply_conandata_patches(self)
def package(self):
copy(self, "LICENSE*",
dst=os.path.join(self.package_folder, "licenses"),
src=self.source_folder)
copy(self, "*",
dst=os.path.join(self.package_folder, "include"),
src=os.path.join(self.source_folder, "include"))
def package_info(self):
self.cpp_info.bindirs = []
self.cpp_info.libdirs = []
self.cpp_info.set_property("cmake_target_name", "NuDB")
self.cpp_info.set_property("cmake_target_aliases", ["NuDB::nudb"])
self.cpp_info.set_property("cmake_find_mode", "both")
self.cpp_info.components["core"].set_property("cmake_target_name", "nudb")
self.cpp_info.components["core"].names["cmake_find_package"] = "nudb"
self.cpp_info.components["core"].names["cmake_find_package_multi"] = "nudb"
self.cpp_info.components["core"].requires = ["boost::thread", "boost::system"]
# TODO: to remove in conan v2 once cmake_find_package_* generators removed
self.cpp_info.names["cmake_find_package"] = "NuDB"
self.cpp_info.names["cmake_find_package_multi"] = "NuDB"


@@ -0,0 +1,24 @@
diff --git a/include/nudb/detail/stream.hpp b/include/nudb/detail/stream.hpp
index 6c07bf1..e0ce8ed 100644
--- a/include/nudb/detail/stream.hpp
+++ b/include/nudb/detail/stream.hpp
@@ -14,6 +14,7 @@
#include <cstdint>
#include <cstring>
#include <memory>
+#include <stdexcept>
namespace nudb {
namespace detail {
diff --git a/include/nudb/impl/context.ipp b/include/nudb/impl/context.ipp
index beb7058..ffde0b3 100644
--- a/include/nudb/impl/context.ipp
+++ b/include/nudb/impl/context.ipp
@@ -9,6 +9,7 @@
#define NUDB_IMPL_CONTEXT_IPP
#include <nudb/detail/store_base.hpp>
+#include <stdexcept>
namespace nudb {


@@ -526,7 +526,7 @@ static int secp256k1_ecdsa_sign_inner(const secp256k1_context* ctx, secp256k1_sc
break;
}
is_nonce_valid = secp256k1_scalar_set_b32_seckey(&non, nonce32);
/* The nonce is still secret here, but it being invalid is is less likely than 1:2^255. */
/* The nonce is still secret here, but it being invalid is less likely than 1:2^255. */
secp256k1_declassify(ctx, &is_nonce_valid, sizeof(is_nonce_valid));
if (is_nonce_valid) {
ret = secp256k1_ecdsa_sig_sign(&ctx->ecmult_gen_ctx, r, s, &sec, &msg, &non, recid);


@@ -80,7 +80,7 @@ namespace detail {
// Feature.cpp. Because it's only used to reserve storage, and determine how
// large to make the FeatureBitset, it MAY be larger. It MUST NOT be less than
// the actual number of amendments. A LogicError on startup will verify this.
-static constexpr std::size_t numFeatures = 76;
+static constexpr std::size_t numFeatures = 78;
/** Amendments that this server supports and the default voting behavior.
Whether they are enabled depends on the Rules defined in the validated
@@ -369,6 +369,8 @@ extern uint256 const fixAMMv1_1;
extern uint256 const featureNFTokenMintOffer;
extern uint256 const fixReducedOffersV2;
extern uint256 const fixEnforceNFTokenTrustline;
extern uint256 const fixInnerObjTemplate2;
extern uint256 const featureInvariantsV1_1;
} // namespace ripple
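The comment above pins down an invariant: `numFeatures` may over-reserve storage, but must never undercount the registered amendments, and startup verifies this with a LogicError. A toy, standalone illustration of such a check (all names below are hypothetical stand-ins, not rippled's implementation):

```cpp
#include <cassert>
#include <cstddef>
#include <stdexcept>
#include <string>
#include <vector>

// Capacity reserved at compile time; it MAY exceed the real amendment
// count, but MUST NOT be less than it.
constexpr std::size_t numFeatures = 4;

// Hypothetical registry standing in for the amendment table.
std::vector<std::string> const registeredAmendments = {
    "fixInnerObjTemplate2", "featureInvariantsV1_1"};

// Startup verification: undercounting is a programming error, so it
// throws (rippled raises a LogicError in the analogous situation).
void
verifyFeatureCount()
{
    if (registeredAmendments.size() > numFeatures)
        throw std::logic_error(
            "numFeatures must not be less than the amendment count");
}
```

Because the check runs once at startup, an undercount fails fast instead of silently truncating the feature bitset.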


@@ -29,6 +29,7 @@
#include <xrpl/protocol/STXChainBridge.h>
#include <xrpl/protocol/Serializer.h>
#include <xrpl/protocol/UintTypes.h>
#include <xrpl/protocol/jss.h>
#include <cstdint>
namespace ripple {
@@ -306,6 +307,26 @@ getTicketIndex(AccountID const& account, std::uint32_t uSequence);
uint256
getTicketIndex(AccountID const& account, SeqProxy ticketSeq);
template <class... keyletParams>
struct keyletDesc
{
std::function<Keylet(keyletParams...)> function;
Json::StaticString expectedLEName;
bool includeInTests;
};
// This list should include all of the keylet functions that take a single
// AccountID parameter.
std::array<keyletDesc<AccountID const&>, 6> const directAccountKeylets{
{{&keylet::account, jss::AccountRoot, false},
{&keylet::ownerDir, jss::DirectoryNode, true},
{&keylet::signers, jss::SignerList, true},
// It's normally impossible to create an item at nftpage_min, but
// test it anyway, since the invariant checks for it.
{&keylet::nftpage_min, jss::NFTokenPage, true},
{&keylet::nftpage_max, jss::NFTokenPage, true},
{&keylet::did, jss::DID, true}}};
} // namespace ripple
#endif


@@ -44,7 +44,6 @@
namespace ripple {
class STArray;
-class Rules;
inline void
throwFieldNotFound(SField const& field)
@@ -105,7 +104,7 @@ public:
explicit STObject(SField const& name);
static STObject
-    makeInnerObject(SField const& name, Rules const& rules);
+    makeInnerObject(SField const& name);
iterator
begin() const;


@@ -21,6 +21,7 @@
#define RIPPLE_PROTOCOL_B58_UTILS_H_INCLUDED
#include <xrpl/basics/contract.h>
#include <xrpl/protocol/detail/token_errors.h>
#include <boost/outcome.hpp>
#include <boost/outcome/result.hpp>
@@ -71,12 +72,12 @@ carrying_add(std::uint64_t a, std::uint64_t b)
// (i.e a[0] is the 2^0 coefficient, a[n] is the 2^(64*n) coefficient)
// panics if overflows (this is a specialized adder for b58 decoding.
// it should never overflow).
-inline void
+[[nodiscard]] inline TokenCodecErrc
inplace_bigint_add(std::span<std::uint64_t> a, std::uint64_t b)
{
if (a.size() <= 1)
{
-        ripple::LogicError("Input span too small for inplace_bigint_add");
+        return TokenCodecErrc::inputTooSmall;
}
std::uint64_t carry;
@@ -86,28 +87,29 @@ inplace_bigint_add(std::span<std::uint64_t> a, std::uint64_t b)
{
if (!carry)
{
-            return;
+            return TokenCodecErrc::success;
}
std::tie(v, carry) = carrying_add(v, 1);
}
if (carry)
{
-        LogicError("Overflow in inplace_bigint_add");
+        return TokenCodecErrc::overflowAdd;
}
return TokenCodecErrc::success;
}
-inline void
+[[nodiscard]] inline TokenCodecErrc
inplace_bigint_mul(std::span<std::uint64_t> a, std::uint64_t b)
{
if (a.empty())
{
-        LogicError("Empty span passed to inplace_bigint_mul");
+        return TokenCodecErrc::inputTooSmall;
}
auto const last_index = a.size() - 1;
if (a[last_index] != 0)
{
-        LogicError("Non-zero element in inplace_bigint_mul last index");
+        return TokenCodecErrc::inputTooLarge;
}
std::uint64_t carry = 0;
@@ -116,7 +118,9 @@ inplace_bigint_mul(std::span<std::uint64_t> a, std::uint64_t b)
std::tie(coeff, carry) = carrying_mul(coeff, b, carry);
}
a[last_index] = carry;
return TokenCodecErrc::success;
}
// divide a "big uint" value inplace and return the mod
// numerator is stored so smallest coefficients come first
[[nodiscard]] inline std::uint64_t
@@ -166,11 +170,7 @@ inplace_bigint_div_rem(std::span<uint64_t> numerator, std::uint64_t divisor)
b58_10_to_b58_be(std::uint64_t input)
{
constexpr std::uint64_t B_58_10 = 430804206899405824; // 58^10;
-    if (input >= B_58_10)
-    {
-        LogicError("Input to b58_10_to_b58_be equals or exceeds 58^10.");
-    }
+    assert(input < B_58_10);
constexpr std::size_t resultSize = 10;
std::array<std::uint8_t, resultSize> result{};
int i = 0;


@@ -32,6 +32,7 @@ enum class TokenCodecErrc {
mismatchedTokenType,
mismatchedChecksum,
invalidEncodingChar,
overflowAdd,
unknown,
};
}


@@ -25,43 +25,38 @@
namespace ripple {
namespace Resource {
// clang-format off
/** Schedule of fees charged for imposing load on the server. */
/** @{ */
-extern Charge const
-    feeInvalidRequest;  // A request that we can immediately tell is invalid
+extern Charge const feeInvalidRequest;  // A request that we can immediately
+                                        // tell is invalid
extern Charge const feeRequestNoReply;  // A request that we cannot satisfy
-extern Charge const feeInvalidSignature;  // An object whose signature we had to
-                                          // check and it failed
+extern Charge const feeInvalidSignature;  // An object whose signature we had
+                                          // to check and it failed
extern Charge const feeUnwantedData;  // Data we have no use for
-extern Charge const feeBadData;  // Data we have to verify before rejecting
+extern Charge const feeBadData;  // Data we have to verify before
+                                 // rejecting
// RPC loads
-extern Charge const
-    feeInvalidRPC;  // An RPC request that we can immediately tell is invalid.
-extern Charge const feeReferenceRPC;  // A default "reference" unspecified load
-extern Charge const feeExceptionRPC;  // An RPC load that causes an exception
-extern Charge const feeLightRPC;  // A normal RPC command
-extern Charge const feeLowBurdenRPC;  // A slightly burdensome RPC load
-extern Charge const feeMediumBurdenRPC;  // A somewhat burdensome RPC load
-extern Charge const feeHighBurdenRPC;  // A very burdensome RPC load
-extern Charge const feePathFindUpdate;  // An update to an existing PF request
+extern Charge const feeInvalidRPC;  // An RPC request that we can
+                                    // immediately tell is invalid.
+extern Charge const feeReferenceRPC;  // A default "reference" unspecified
+                                      // load
+extern Charge const feeExceptionRPC;  // RPC load that causes an exception
+extern Charge const feeMediumBurdenRPC;  // A somewhat burdensome RPC load
+extern Charge const feeHighBurdenRPC;  // A very burdensome RPC load
// Peer loads
extern Charge const feeLightPeer;  // Requires no reply
extern Charge const feeLowBurdenPeer;  // Quick/cheap, slight reply
extern Charge const feeMediumBurdenPeer;  // Requires some work
extern Charge const feeHighBurdenPeer;  // Extensive work
-// Good things
-extern Charge const
-    feeNewTrustedNote;  // A new transaction/validation/proposal we trust
-extern Charge const feeNewValidTx;  // A new, valid transaction
-extern Charge const feeSatisfiedRequest;  // Data we requested
// Administrative
-extern Charge const feeWarning;  // The cost of receiving a warning
-extern Charge const feeDrop;  // The cost of being dropped for excess load
+extern Charge const feeWarning;  // The cost of receiving a warning
+extern Charge const feeDrop;  // The cost of being dropped for
+                              // excess load
/** @} */
// clang-format on
} // namespace Resource
} // namespace ripple


@@ -33,7 +33,7 @@ namespace BuildInfo {
// and follow the format described at http://semver.org/
//------------------------------------------------------------------------------
// clang-format off
-char const* const versionString = "2.3.0-b1"
+char const* const versionString = "2.3.0-b2"
// clang-format on
#if defined(DEBUG) || defined(SANITIZER)


@@ -496,6 +496,10 @@ REGISTER_FIX (fixAMMv1_1, Supported::yes, VoteBehavior::De
REGISTER_FEATURE(NFTokenMintOffer, Supported::yes, VoteBehavior::DefaultNo);
REGISTER_FIX (fixReducedOffersV2, Supported::yes, VoteBehavior::DefaultNo);
REGISTER_FIX (fixEnforceNFTokenTrustline, Supported::yes, VoteBehavior::DefaultNo);
REGISTER_FIX (fixInnerObjTemplate2, Supported::yes, VoteBehavior::DefaultNo);
// InvariantsV1_1 will be changed to Supported::yes when all the
// invariants expected to be included under it are complete.
REGISTER_FEATURE(InvariantsV1_1, Supported::no, VoteBehavior::DefaultNo);
// The following amendments are obsolete, but must remain supported
// because they could potentially get enabled.


@@ -61,10 +61,19 @@ STObject::STObject(SerialIter& sit, SField const& name, int depth) noexcept(
}
STObject
-STObject::makeInnerObject(SField const& name, Rules const& rules)
+STObject::makeInnerObject(SField const& name)
{
STObject obj{name};
-    if (rules.enabled(fixInnerObjTemplate))
// The if is complicated because inner object templates were added in
// two phases:
// 1. If there are no available Rules, then always apply the template.
// 2. fixInnerObjTemplate added templates to two AMM inner objects.
// 3. fixInnerObjTemplate2 added templates to all remaining inner objects.
std::optional<Rules> const& rules = getCurrentTransactionRules();
bool const isAMMObj = name == sfAuctionSlot || name == sfVoteEntry;
if (!rules || (rules->enabled(fixInnerObjTemplate) && isAMMObj) ||
(rules->enabled(fixInnerObjTemplate2) && !isAMMObj))
{
if (SOTemplate const* elements =
InnerObjectFormats::getInstance().findSOTemplateBySField(name))

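The rewritten `makeInnerObject` chooses whether to apply an inner-object template based on which amendments are enabled. The three-way condition can be isolated as a small predicate — a sketch with stand-in types, not the rippled `Rules` class:

```cpp
#include <cassert>
#include <optional>

// Stand-in for the amendment flags consulted above; the real code asks
// a Rules object whether each amendment is enabled.
struct Rules
{
    bool fixInnerObjTemplate = false;   // phase 2: two AMM inner objects
    bool fixInnerObjTemplate2 = false;  // phase 3: all remaining inner objects
};

// Returns true when the inner-object template should be applied to an
// object, given the current rules (if any) and whether the object is
// one of the two AMM inner objects (sfAuctionSlot / sfVoteEntry).
bool
applyTemplate(std::optional<Rules> const& rules, bool isAMMObj)
{
    // 1. No available Rules: always apply the template.
    if (!rules)
        return true;
    // 2. fixInnerObjTemplate added templates to the AMM inner objects.
    if (rules->fixInnerObjTemplate && isAMMObj)
        return true;
    // 3. fixInnerObjTemplate2 added templates to everything else.
    return rules->fixInnerObjTemplate2 && !isAMMObj;
}
```

Splitting the predicate out like this makes the phased rollout explicit: each amendment widens the set of objects that get a template without disturbing the previous phase.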

@@ -203,7 +203,8 @@ AttestationClaim::AttestationClaim(Json::Value const& v)
STObject
AttestationClaim::toSTObject() const
{
-    STObject o{sfXChainClaimAttestationCollectionElement};
+    STObject o =
+        STObject::makeInnerObject(sfXChainClaimAttestationCollectionElement);
addHelper(o);
o[sfXChainClaimID] = claimID;
if (dst)
@@ -345,7 +346,8 @@ AttestationCreateAccount::AttestationCreateAccount(
STObject
AttestationCreateAccount::toSTObject() const
{
-    STObject o{sfXChainCreateAccountAttestationCollectionElement};
+    STObject o = STObject::makeInnerObject(
+        sfXChainCreateAccountAttestationCollectionElement);
addHelper(o);
o[sfXChainAccountCreateCount] = createCount;
@@ -497,7 +499,7 @@ XChainClaimAttestation::XChainClaimAttestation(
STObject
XChainClaimAttestation::toSTObject() const
{
-    STObject o{sfXChainClaimProofSig};
+    STObject o = STObject::makeInnerObject(sfXChainClaimProofSig);
o[sfAttestationSignerAccount] =
STAccount{sfAttestationSignerAccount, keyAccount};
o[sfPublicKey] = publicKey;
@@ -609,7 +611,7 @@ XChainCreateAccountAttestation::XChainCreateAccountAttestation(
STObject
XChainCreateAccountAttestation::toSTObject() const
{
-    STObject o{sfXChainCreateAccountProofSig};
+    STObject o = STObject::makeInnerObject(sfXChainCreateAccountProofSig);
o[sfAttestationSignerAccount] =
STAccount{sfAttestationSignerAccount, keyAccount};


@@ -467,6 +467,11 @@ b256_to_b58_be(std::span<std::uint8_t const> input, std::span<std::uint8_t> out)
{
continue;
}
static constexpr std::uint64_t B_58_10 = 430804206899405824; // 58^10;
if (base_58_10_coeff[i] >= B_58_10)
{
return Unexpected(TokenCodecErrc::inputTooLarge);
}
std::array<std::uint8_t, 10> const b58_be =
ripple::b58_fast::detail::b58_10_to_b58_be(base_58_10_coeff[i]);
std::size_t to_skip = 0;
@@ -565,10 +570,23 @@ b58_to_b256_be(std::string_view input, std::span<std::uint8_t> out)
for (int i = 1; i < num_b_58_10_coeffs; ++i)
{
std::uint64_t const c = b_58_10_coeff[i];
-        ripple::b58_fast::detail::inplace_bigint_mul(
-            std::span(&result[0], cur_result_size + 1), B_58_10);
-        ripple::b58_fast::detail::inplace_bigint_add(
-            std::span(&result[0], cur_result_size + 1), c);
{
auto code = ripple::b58_fast::detail::inplace_bigint_mul(
std::span(&result[0], cur_result_size + 1), B_58_10);
if (code != TokenCodecErrc::success)
{
return Unexpected(code);
}
}
{
auto code = ripple::b58_fast::detail::inplace_bigint_add(
std::span(&result[0], cur_result_size + 1), c);
if (code != TokenCodecErrc::success)
{
return Unexpected(code);
}
}
if (result[cur_result_size] != 0)
{
cur_result_size += 1;


@@ -19,7 +19,7 @@
#include <test/jtx.h>
#include <xrpld/app/tx/applySteps.h>
-#include <xrpld/ledger/Directory.h>
+#include <xrpld/ledger/Dir.h>
#include <xrpl/protocol/Feature.h>
#include <xrpl/protocol/Indexes.h>
#include <xrpl/protocol/TxFlags.h>


@@ -18,7 +18,7 @@
//==============================================================================
#include <test/jtx.h>
-#include <xrpld/ledger/Directory.h>
+#include <xrpld/ledger/Dir.h>
#include <xrpld/rpc/detail/RPCHelpers.h>
#include <xrpl/basics/chrono.h>
#include <xrpl/protocol/Feature.h>
@@ -873,6 +873,25 @@ struct PayChan_test : public beast::unit_test::suite
auto const chan1Str = to_string(channel(alice, bob, env.seq(alice)));
env(create(alice, bob, channelFunds, settleDelay, pk));
env.close();
{
// test account non-string
auto testInvalidAccountParam = [&](auto const& param) {
Json::Value params;
params[jss::account] = param;
auto jrr = env.rpc(
"json", "account_channels", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'account'.");
};
testInvalidAccountParam(1);
testInvalidAccountParam(1.1);
testInvalidAccountParam(true);
testInvalidAccountParam(Json::Value(Json::nullValue));
testInvalidAccountParam(Json::Value(Json::objectValue));
testInvalidAccountParam(Json::Value(Json::arrayValue));
}
{
auto const r =
env.rpc("account_channels", alice.human(), bob.human());


@@ -177,6 +177,7 @@ class base58_test : public beast::unit_test::suite
constexpr std::size_t iters = 100000;
auto eng = randEngine();
std::uniform_int_distribution<std::uint64_t> dist;
std::uniform_int_distribution<std::uint64_t> dist1(1);
for (int i = 0; i < iters; ++i)
{
std::uint64_t const d = dist(eng);
@@ -209,12 +210,31 @@ class base58_test : public beast::unit_test::suite
auto const refAdd = boostBigInt + d;
-            b58_fast::detail::inplace_bigint_add(
+            auto const result = b58_fast::detail::inplace_bigint_add(
std::span<uint64_t>(bigInt.data(), bigInt.size()), d);
BEAST_EXPECT(result == TokenCodecErrc::success);
auto const foundAdd = multiprecision_utils::toBoostMP(bigInt);
BEAST_EXPECT(refAdd == foundAdd);
}
for (int i = 0; i < iters; ++i)
{
std::uint64_t const d = dist1(eng);
// Force overflow
std::vector<std::uint64_t> bigInt(
5, std::numeric_limits<std::uint64_t>::max());
auto const boostBigInt = multiprecision_utils::toBoostMP(
std::span<std::uint64_t>(bigInt.data(), bigInt.size()));
auto const refAdd = boostBigInt + d;
auto const result = b58_fast::detail::inplace_bigint_add(
std::span<uint64_t>(bigInt.data(), bigInt.size()), d);
BEAST_EXPECT(result == TokenCodecErrc::overflowAdd);
auto const foundAdd = multiprecision_utils::toBoostMP(bigInt);
BEAST_EXPECT(refAdd != foundAdd);
}
for (int i = 0; i < iters; ++i)
{
std::uint64_t const d = dist(eng);
auto bigInt = multiprecision_utils::randomBigInt(/* minSize */ 2);
@@ -226,11 +246,29 @@ class base58_test : public beast::unit_test::suite
auto const refMul = boostBigInt * d;
-            b58_fast::detail::inplace_bigint_mul(
+            auto const result = b58_fast::detail::inplace_bigint_mul(
std::span<uint64_t>(bigInt.data(), bigInt.size()), d);
BEAST_EXPECT(result == TokenCodecErrc::success);
auto const foundMul = multiprecision_utils::toBoostMP(bigInt);
BEAST_EXPECT(refMul == foundMul);
}
for (int i = 0; i < iters; ++i)
{
std::uint64_t const d = dist1(eng);
// Force overflow
std::vector<std::uint64_t> bigInt(
5, std::numeric_limits<std::uint64_t>::max());
auto const boostBigInt = multiprecision_utils::toBoostMP(
std::span<std::uint64_t>(bigInt.data(), bigInt.size()));
auto const refMul = boostBigInt * d;
auto const result = b58_fast::detail::inplace_bigint_mul(
std::span<uint64_t>(bigInt.data(), bigInt.size()), d);
BEAST_EXPECT(result == TokenCodecErrc::inputTooLarge);
auto const foundMul = multiprecision_utils::toBoostMP(bigInt);
BEAST_EXPECT(refMul != foundMul);
}
}
void


@@ -40,7 +40,7 @@ namespace csf {
level when adding to the simulation.
The example code below demonstrates the reason for storing the collector
-    as a reference. The collector's lifetime will generally be be longer than
+    as a reference. The collector's lifetime will generally be longer than
the simulation; perhaps several simulations are run for a single collector
instance. The collector potentially stores lots of data as well, so the
simulation needs to point to the single instance, rather than requiring


@@ -220,7 +220,7 @@ public:
@param fileName The output file (creates)
@param vertexName A invokable T vertexName(Vertex const &) that
returns the name target use for the vertex in the file
-        T must be be ostream-able
+        T must be ostream-able
*/
template <class VertexName>
void


@@ -17,7 +17,6 @@
#include <test/jtx.h>
#include <xrpld/ledger/BookDirs.h>
-#include <xrpld/ledger/Directory.h>
#include <xrpld/ledger/Sandbox.h>
#include <xrpl/basics/random.h>
#include <xrpl/protocol/Feature.h>


@@ -18,6 +18,7 @@
//==============================================================================
#include <test/jtx.h>
#include <test/jtx/AMM.h>
#include <test/jtx/Env.h>
#include <xrpld/app/tx/apply.h>
#include <xrpld/app/tx/detail/ApplyContext.h>
@@ -30,6 +31,15 @@ namespace ripple {
class Invariants_test : public beast::unit_test::suite
{
// The optional Preclose function is used to process additional transactions
// on the ledger after creating two accounts, but before closing it, and
// before the Precheck function. These should only be valid functions, and
// not direct manipulations. Preclose is not commonly used.
using Preclose = std::function<bool(
test::jtx::Account const& a,
test::jtx::Account const& b,
test::jtx::Env& env)>;
// this is common setup/method for running a failing invariant check. The
// precheck function is used to manipulate the ApplyContext with view
// changes that will cause the check to fail.
@@ -38,22 +48,42 @@ class Invariants_test : public beast::unit_test::suite
test::jtx::Account const& b,
ApplyContext& ac)>;
/** Run a specific test case to put the ledger into a state that will be
* detected by an invariant. Simulates the actions of a transaction that
* would violate an invariant.
*
* @param expect_logs One or more messages related to the failing invariant
* that should be in the log output
* @precheck See "Precheck" above
* @fee If provided, the fee amount paid by the simulated transaction.
* @tx A mock transaction that took the actions to trigger the invariant. In
* most cases, only the type matters.
* @ters The TER results expected on the two passes of the invariant
* checker.
* @preclose See "Preclose" above. Note that @preclose runs *before*
* @precheck, but is the last parameter for historical reasons
*
*/
void
doInvariantCheck(
std::vector<std::string> const& expect_logs,
Precheck const& precheck,
XRPAmount fee = XRPAmount{},
STTx tx = STTx{ttACCOUNT_SET, [](STObject&) {}},
-        std::initializer_list<TER> ters = {
-            tecINVARIANT_FAILED,
-            tefINVARIANT_FAILED})
+        std::initializer_list<TER> ters =
+            {tecINVARIANT_FAILED, tefINVARIANT_FAILED},
+        Preclose const& preclose = {})
{
using namespace test::jtx;
-        Env env{*this};
+        FeatureBitset amendments =
+            supported_amendments() | featureInvariantsV1_1;
+        Env env{*this, amendments};
-        Account A1{"A1"};
-        Account A2{"A2"};
+        Account const A1{"A1"};
+        Account const A2{"A2"};
env.fund(XRP(1000), A1, A2);
if (preclose)
BEAST_EXPECT(preclose(A1, A2, env));
env.close();
OpenView ov{*env.current()};
@@ -162,6 +192,165 @@ class Invariants_test : public beast::unit_test::suite
STTx{ttACCOUNT_DELETE, [](STObject& tx) {}});
}
void
testAccountRootsDeletedClean()
{
using namespace test::jtx;
testcase << "account root deletion left artifact";
for (auto const& keyletInfo : directAccountKeylets)
{
// TODO: Use structured binding once LLVM 16 is the minimum
// supported version. See also:
// https://github.com/llvm/llvm-project/issues/48582
// https://github.com/llvm/llvm-project/commit/127bf44385424891eb04cff8e52d3f157fc2cb7c
if (!keyletInfo.includeInTests)
continue;
auto const& keyletfunc = keyletInfo.function;
auto const& type = keyletInfo.expectedLEName;
using namespace std::string_literals;
doInvariantCheck(
{{"account deletion left behind a "s + type.c_str() +
" object"}},
[&](Account const& A1, Account const& A2, ApplyContext& ac) {
// Add an object to the ledger for account A1, then delete
// A1
auto const a1 = A1.id();
auto const sleA1 = ac.view().peek(keylet::account(a1));
if (!sleA1)
return false;
auto const key = std::invoke(keyletfunc, a1);
auto const newSLE = std::make_shared<SLE>(key);
ac.view().insert(newSLE);
ac.view().erase(sleA1);
return true;
},
XRPAmount{},
STTx{ttACCOUNT_DELETE, [](STObject& tx) {}});
};
// NFT special case
doInvariantCheck(
{{"account deletion left behind a NFTokenPage object"}},
[&](Account const& A1, Account const&, ApplyContext& ac) {
// remove an account from the view
auto const sle = ac.view().peek(keylet::account(A1.id()));
if (!sle)
return false;
ac.view().erase(sle);
return true;
},
XRPAmount{},
STTx{ttACCOUNT_DELETE, [](STObject& tx) {}},
{tecINVARIANT_FAILED, tefINVARIANT_FAILED},
[&](Account const& A1, Account const&, Env& env) {
// Preclose callback to mint the NFT which will be deleted in
// the Precheck callback above.
env(token::mint(A1));
return true;
});
// AMM special cases
AccountID ammAcctID;
uint256 ammKey;
Issue ammIssue;
doInvariantCheck(
{{"account deletion left behind a DirectoryNode object"}},
[&](Account const& A1, Account const& A2, ApplyContext& ac) {
// Delete the AMM account without cleaning up the directory or
// deleting the AMM object
auto const sle = ac.view().peek(keylet::account(ammAcctID));
if (!sle)
return false;
BEAST_EXPECT(sle->at(~sfAMMID));
BEAST_EXPECT(sle->at(~sfAMMID) == ammKey);
ac.view().erase(sle);
return true;
},
XRPAmount{},
STTx{ttAMM_WITHDRAW, [](STObject& tx) {}},
{tecINVARIANT_FAILED, tefINVARIANT_FAILED},
[&](Account const& A1, Account const& A2, Env& env) {
// Preclose callback to create the AMM which will be partially
// deleted in the Precheck callback above.
AMM const amm(env, A1, XRP(100), A1["USD"](50));
ammAcctID = amm.ammAccount();
ammKey = amm.ammID();
ammIssue = amm.lptIssue();
return true;
});
doInvariantCheck(
{{"account deletion left behind a AMM object"}},
[&](Account const& A1, Account const& A2, ApplyContext& ac) {
// Delete all the AMM's trust lines, remove the AMM from the AMM
// account's directory (this deletes the directory), and delete
// the AMM account. Do not delete the AMM object.
auto const sle = ac.view().peek(keylet::account(ammAcctID));
if (!sle)
return false;
BEAST_EXPECT(sle->at(~sfAMMID));
BEAST_EXPECT(sle->at(~sfAMMID) == ammKey);
for (auto const& trustKeylet :
{keylet::line(ammAcctID, A1["USD"]),
keylet::line(A1, ammIssue)})
{
if (auto const line = ac.view().peek(trustKeylet); !line)
{
return false;
}
else
{
STAmount const lowLimit = line->at(sfLowLimit);
STAmount const highLimit = line->at(sfHighLimit);
BEAST_EXPECT(
trustDelete(
ac.view(),
line,
lowLimit.getIssuer(),
highLimit.getIssuer(),
ac.journal) == tesSUCCESS);
}
}
auto const ammSle = ac.view().peek(keylet::amm(ammKey));
if (!BEAST_EXPECT(ammSle))
return false;
auto const ownerDirKeylet = keylet::ownerDir(ammAcctID);
BEAST_EXPECT(ac.view().dirRemove(
ownerDirKeylet, ammSle->at(sfOwnerNode), ammKey, false));
BEAST_EXPECT(
!ac.view().exists(ownerDirKeylet) ||
ac.view().emptyDirDelete(ownerDirKeylet));
ac.view().erase(sle);
return true;
},
XRPAmount{},
STTx{ttAMM_WITHDRAW, [](STObject& tx) {}},
{tecINVARIANT_FAILED, tefINVARIANT_FAILED},
[&](Account const& A1, Account const& A2, Env& env) {
// Preclose callback to create the AMM which will be partially
// deleted in the Precheck callback above.
AMM const amm(env, A1, XRP(100), A1["USD"](50));
ammAcctID = amm.ammAccount();
ammKey = amm.ammID();
ammIssue = amm.lptIssue();
return true;
});
}
void
testTypesMatch()
{
@@ -175,7 +364,7 @@ class Invariants_test : public beast::unit_test::suite
auto const sle = ac.view().peek(keylet::account(A1.id()));
if (!sle)
return false;
-                auto sleNew = std::make_shared<SLE>(ltTICKET, sle->key());
+                auto const sleNew = std::make_shared<SLE>(ltTICKET, sle->key());
ac.rawView().rawReplace(sleNew);
return true;
});
@@ -191,7 +380,7 @@ class Invariants_test : public beast::unit_test::suite
// make a dummy escrow ledger entry, then change the type to an
// unsupported value so that the valid type invariant check
// will fail.
-                auto sleNew = std::make_shared<SLE>(
+                auto const sleNew = std::make_shared<SLE>(
keylet::escrow(A1, (*sle)[sfSequence] + 2));
// We don't use ltNICKNAME directly since it's marked deprecated
@@ -231,7 +420,7 @@ class Invariants_test : public beast::unit_test::suite
auto const sle = ac.view().peek(keylet::account(A1.id()));
if (!sle)
return false;
-                STAmount nonNative(A2["USD"](51));
+                STAmount const nonNative(A2["USD"](51));
sle->setFieldAmount(sfBalance, nonNative);
ac.view().update(sle);
return true;
@@ -420,7 +609,7 @@ class Invariants_test : public beast::unit_test::suite
[](Account const&, Account const&, ApplyContext& ac) {
// Insert a new account root created by a non-payment into
// the view.
-                const Account A3{"A3"};
+                Account const A3{"A3"};
Keylet const acctKeylet = keylet::account(A3);
auto const sleNew = std::make_shared<SLE>(acctKeylet);
ac.view().insert(sleNew);
@@ -432,13 +621,13 @@ class Invariants_test : public beast::unit_test::suite
[](Account const&, Account const&, ApplyContext& ac) {
// Insert two new account roots into the view.
{
-                    const Account A3{"A3"};
+                    Account const A3{"A3"};
Keylet const acctKeylet = keylet::account(A3);
auto const sleA3 = std::make_shared<SLE>(acctKeylet);
ac.view().insert(sleA3);
}
{
-                    const Account A4{"A4"};
+                    Account const A4{"A4"};
Keylet const acctKeylet = keylet::account(A4);
auto const sleA4 = std::make_shared<SLE>(acctKeylet);
ac.view().insert(sleA4);
@@ -450,7 +639,7 @@ class Invariants_test : public beast::unit_test::suite
{{"account created with wrong starting sequence number"}},
[](Account const&, Account const&, ApplyContext& ac) {
// Insert a new account root with the wrong starting sequence.
-                const Account A3{"A3"};
+                Account const A3{"A3"};
Keylet const acctKeylet = keylet::account(A3);
auto const sleNew = std::make_shared<SLE>(acctKeylet);
sleNew->setFieldU32(sfSequence, ac.view().seq() + 1);
@@ -467,6 +656,7 @@ public:
{
testXRPNotCreated();
testAccountRootsNotRemoved();
testAccountRootsDeletedClean();
testTypesMatch();
testNoXRPTrustLine();
testXRPBalanceCheck();


@@ -39,6 +39,7 @@ class AccountCurrencies_test : public beast::unit_test::suite
{ // invalid ledger (hash)
Json::Value params;
params[jss::account] = Account{"bob"}.human();
params[jss::ledger_hash] = 1;
auto const result = env.rpc(
"json",
@@ -56,6 +57,50 @@ class AccountCurrencies_test : public beast::unit_test::suite
result[jss::error_message] == "Missing field 'account'.");
}
{
// test account non-string
auto testInvalidAccountParam = [&](auto const& param) {
Json::Value params;
params[jss::account] = param;
auto jrr = env.rpc(
"json",
"account_currencies",
to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'account'.");
};
testInvalidAccountParam(1);
testInvalidAccountParam(1.1);
testInvalidAccountParam(true);
testInvalidAccountParam(Json::Value(Json::nullValue));
testInvalidAccountParam(Json::Value(Json::objectValue));
testInvalidAccountParam(Json::Value(Json::arrayValue));
}
{
// test ident non-string
auto testInvalidIdentParam = [&](auto const& param) {
Json::Value params;
params[jss::ident] = param;
auto jrr = env.rpc(
"json",
"account_currencies",
to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'ident'.");
};
testInvalidIdentParam(1);
testInvalidIdentParam(1.1);
testInvalidIdentParam(true);
testInvalidIdentParam(Json::Value(Json::nullValue));
testInvalidIdentParam(Json::Value(Json::objectValue));
testInvalidIdentParam(Json::Value(Json::arrayValue));
}
{
Json::Value params;
params[jss::account] =
@@ -198,6 +243,6 @@ public:
}
};
-BEAST_DEFINE_TESTSUITE(AccountCurrencies, app, ripple);
+BEAST_DEFINE_TESTSUITE(AccountCurrencies, rpc, ripple);
} // namespace ripple


@@ -36,6 +36,7 @@ public:
void
testErrors()
{
testcase("Errors");
using namespace jtx;
Env env(*this);
{
@@ -78,12 +79,53 @@ public:
BEAST_EXPECT(
info[jss::result][jss::error_message] == "Account malformed.");
}
{
// Cannot pass a non-string into the `account` param
auto testInvalidAccountParam = [&](auto const& param) {
Json::Value params;
params[jss::account] = param;
auto jrr = env.rpc(
"json", "account_info", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'account'.");
};
testInvalidAccountParam(1);
testInvalidAccountParam(1.1);
testInvalidAccountParam(true);
testInvalidAccountParam(Json::Value(Json::nullValue));
testInvalidAccountParam(Json::Value(Json::objectValue));
testInvalidAccountParam(Json::Value(Json::arrayValue));
}
{
// Cannot pass a non-string into the `ident` param
auto testInvalidIdentParam = [&](auto const& param) {
Json::Value params;
params[jss::ident] = param;
auto jrr = env.rpc(
"json", "account_info", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'ident'.");
};
testInvalidIdentParam(1);
testInvalidIdentParam(1.1);
testInvalidIdentParam(true);
testInvalidIdentParam(Json::Value(Json::nullValue));
testInvalidIdentParam(Json::Value(Json::objectValue));
testInvalidIdentParam(Json::Value(Json::arrayValue));
}
}
// Test the "signer_lists" argument in account_info.
void
testSignerLists()
{
testcase("Signer lists");
using namespace jtx;
Env env(*this);
Account const alice{"alice"};
@@ -205,6 +247,7 @@ public:
void
testSignerListsApiVersion2()
{
testcase("Signer lists APIv2");
using namespace jtx;
Env env{*this};
Account const alice{"alice"};
@@ -326,6 +369,7 @@ public:
void
testSignerListsV2()
{
testcase("Signer lists v2");
using namespace jtx;
Env env(*this);
Account const alice{"alice"};
@@ -515,6 +559,7 @@ public:
void
testAccountFlags(FeatureBitset const& features)
{
testcase("Account flags");
using namespace jtx;
Env env(*this, features);
@@ -652,7 +697,7 @@ public:
}
};
-BEAST_DEFINE_TESTSUITE(AccountInfo, app, ripple);
+BEAST_DEFINE_TESTSUITE(AccountInfo, rpc, ripple);
} // namespace test
} // namespace ripple


@@ -27,7 +27,7 @@ namespace ripple {
namespace RPC {
-class AccountLinesRPC_test : public beast::unit_test::suite
+class AccountLines_test : public beast::unit_test::suite
{
public:
void
@@ -55,6 +55,25 @@ public:
lines[jss::result][jss::error_message] ==
RPC::make_error(rpcACT_MALFORMED)[jss::error_message]);
}
{
// test account non-string
auto testInvalidAccountParam = [&](auto const& param) {
Json::Value params;
params[jss::account] = param;
auto jrr = env.rpc(
"json", "account_lines", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'account'.");
};
testInvalidAccountParam(1);
testInvalidAccountParam(1.1);
testInvalidAccountParam(true);
testInvalidAccountParam(Json::Value(Json::nullValue));
testInvalidAccountParam(Json::Value(Json::objectValue));
testInvalidAccountParam(Json::Value(Json::arrayValue));
}
Account const alice{"alice"};
{
// account_lines on an unfunded account.
@@ -1474,7 +1493,7 @@ public:
}
};
-BEAST_DEFINE_TESTSUITE(AccountLinesRPC, app, ripple);
+BEAST_DEFINE_TESTSUITE(AccountLines, rpc, ripple);
} // namespace RPC
} // namespace ripple


@@ -20,10 +20,12 @@
#include <test/jtx.h>
#include <test/jtx/AMM.h>
#include <test/jtx/xchain_bridge.h>
#include <xrpld/app/tx/detail/NFTokenMint.h>
#include <xrpl/json/json_reader.h>
#include <xrpl/json/json_value.h>
#include <xrpl/json/to_string.h>
#include <xrpl/protocol/jss.h>
#include <xrpl/protocol/nft.h>
#include <boost/utility/string_ref.hpp>
@@ -123,8 +125,30 @@ public:
// test error on no account
{
auto resp = env.rpc("json", "account_objects");
BEAST_EXPECT(resp[jss::error_message] == "Syntax error.");
Json::Value params;
auto resp = env.rpc("json", "account_objects", to_string(params));
BEAST_EXPECT(
resp[jss::result][jss::error_message] ==
"Missing field 'account'.");
}
// test account non-string
{
auto testInvalidAccountParam = [&](auto const& param) {
Json::Value params;
params[jss::account] = param;
auto jrr = env.rpc(
"json", "account_objects", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'account'.");
};
testInvalidAccountParam(1);
testInvalidAccountParam(1.1);
testInvalidAccountParam(true);
testInvalidAccountParam(Json::Value(Json::nullValue));
testInvalidAccountParam(Json::Value(Json::objectValue));
testInvalidAccountParam(Json::Value(Json::arrayValue));
}
// test error on malformed account string.
{
@@ -556,11 +580,11 @@ public:
Env env(*this, features);
// Make a lambda we can use to get "account_objects" easily.
auto acct_objs = [&env](
AccountID const& acct,
std::optional<Json::StaticString> const& type,
std::optional<std::uint16_t> limit = std::nullopt,
std::optional<std::string> marker = std::nullopt) {
auto acctObjs = [&env](
AccountID const& acct,
std::optional<Json::StaticString> const& type,
std::optional<std::uint16_t> limit = std::nullopt,
std::optional<std::string> marker = std::nullopt) {
Json::Value params;
params[jss::account] = to_string(acct);
if (type)
@@ -574,32 +598,42 @@ public:
};
// Make a lambda that easily identifies the size of account objects.
auto acct_objs_is_size = [](Json::Value const& resp, unsigned size) {
auto acctObjsIsSize = [](Json::Value const& resp, unsigned size) {
return resp[jss::result][jss::account_objects].isArray() &&
(resp[jss::result][jss::account_objects].size() == size);
};
// Make a lambda that checks if the response has error for invalid type
auto acctObjsTypeIsInvalid = [](Json::Value const& resp) {
return resp[jss::result].isMember(jss::error) &&
resp[jss::result][jss::error_message] ==
"Invalid field \'type\'.";
};
env.fund(XRP(10000), gw, alice);
env.close();
// Since the account is empty now, all account objects should come
// back empty.
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::account), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::amendments), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::check), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::deposit_preauth), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::directory), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::escrow), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::fee), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::hashes), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::nft_page), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::offer), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::payment_channel), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::signer_list), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::state), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::ticket), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::amm), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::did), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::account), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::check), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::deposit_preauth), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::escrow), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::nft_page), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::offer), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::payment_channel), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::signer_list), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::state), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::ticket), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::amm), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::did), 0));
// expect an invalid field 'type' error for the following types
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::amendments)));
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::directory)));
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::fee)));
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::hashes)));
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::NegativeUNL)));
// gw mints an NFT so we can find it.
uint256 const nftID{token::getNextID(env, gw, 0u, tfTransferable)};
@@ -607,8 +641,8 @@ public:
env.close();
{
// Find the NFToken page and make sure it's the right one.
Json::Value const resp = acct_objs(gw, jss::nft_page);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::nft_page);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& nftPage = resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(nftPage[sfNFTokens.jsonName].size() == 1);
@@ -624,8 +658,8 @@ public:
env.close();
{
// Find the trustline and make sure it's the right one.
Json::Value const resp = acct_objs(gw, jss::state);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::state);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& state = resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(state[sfBalance.jsonName][jss::value].asInt() == -5);
@@ -637,8 +671,8 @@ public:
env.close();
{
// Find the check.
Json::Value const resp = acct_objs(gw, jss::check);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::check);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& check = resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(check[sfAccount.jsonName] == gw.human());
@@ -650,8 +684,8 @@ public:
env.close();
{
// Find the preauthorization.
Json::Value const resp = acct_objs(gw, jss::deposit_preauth);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::deposit_preauth);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& preauth = resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(preauth[sfAccount.jsonName] == gw.human());
@@ -672,8 +706,8 @@ public:
}
{
// Find the escrow.
Json::Value const resp = acct_objs(gw, jss::escrow);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::escrow);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& escrow = resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(escrow[sfAccount.jsonName] == gw.human());
@@ -686,7 +720,7 @@ public:
Env scEnv(*this, envconfig(port_increment, 3), features);
x.createScBridgeObjects(scEnv);
auto scenv_acct_objs = [&](Account const& acct, char const* type) {
auto scEnvAcctObjs = [&](Account const& acct, char const* type) {
Json::Value params;
params[jss::account] = acct.human();
params[jss::type] = type;
@@ -695,9 +729,9 @@ public:
};
Json::Value const resp =
scenv_acct_objs(Account::master, jss::bridge);
scEnvAcctObjs(Account::master, jss::bridge);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& acct_bridge =
resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(
@@ -733,7 +767,7 @@ public:
scEnv(xchain_create_claim_id(x.scBob, x.jvb, x.reward, x.mcBob));
scEnv.close();
auto scenv_acct_objs = [&](Account const& acct, char const* type) {
auto scEnvAcctObjs = [&](Account const& acct, char const* type) {
Json::Value params;
params[jss::account] = acct.human();
params[jss::type] = type;
@@ -744,8 +778,8 @@ public:
{
// Find the xchain sequence number for Andrea.
Json::Value const resp =
scenv_acct_objs(x.scAlice, jss::xchain_owned_claim_id);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
scEnvAcctObjs(x.scAlice, jss::xchain_owned_claim_id);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& xchain_seq =
resp[jss::result][jss::account_objects][0u];
@@ -757,8 +791,8 @@ public:
{
// and the one for Bob
Json::Value const resp =
scenv_acct_objs(x.scBob, jss::xchain_owned_claim_id);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
scEnvAcctObjs(x.scBob, jss::xchain_owned_claim_id);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& xchain_seq =
resp[jss::result][jss::account_objects][0u];
@@ -790,7 +824,7 @@ public:
x.signers[0]));
scEnv.close();
auto scenv_acct_objs = [&](Account const& acct, char const* type) {
auto scEnvAcctObjs = [&](Account const& acct, char const* type) {
Json::Value params;
params[jss::account] = acct.human();
params[jss::type] = type;
@@ -800,9 +834,9 @@ public:
{
// Find the xchain_create_account_claim_id
Json::Value const resp = scenv_acct_objs(
Json::Value const resp = scEnvAcctObjs(
Account::master, jss::xchain_owned_create_account_claim_id);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& xchain_create_account_claim_id =
resp[jss::result][jss::account_objects][0u];
@@ -821,8 +855,8 @@ public:
env.close();
{
// Find the offer.
Json::Value const resp = acct_objs(gw, jss::offer);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::offer);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& offer = resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(offer[sfAccount.jsonName] == gw.human());
@@ -846,8 +880,8 @@ public:
}
{
// Find the payment channel.
Json::Value const resp = acct_objs(gw, jss::payment_channel);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::payment_channel);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& payChan = resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(payChan[sfAccount.jsonName] == gw.human());
@@ -868,8 +902,8 @@ public:
}
{
// Find the DID.
Json::Value const resp = acct_objs(gw, jss::did);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::did);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& did = resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(did[sfAccount.jsonName] == gw.human());
@@ -880,8 +914,8 @@ public:
env.close();
{
// Find the signer list.
Json::Value const resp = acct_objs(gw, jss::signer_list);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::signer_list);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& signerList =
resp[jss::result][jss::account_objects][0u];
@@ -896,8 +930,8 @@ public:
env.close();
{
// Find the ticket.
Json::Value const resp = acct_objs(gw, jss::ticket);
BEAST_EXPECT(acct_objs_is_size(resp, 1));
Json::Value const resp = acctObjs(gw, jss::ticket);
BEAST_EXPECT(acctObjsIsSize(resp, 1));
auto const& ticket = resp[jss::result][jss::account_objects][0u];
BEAST_EXPECT(ticket[sfAccount.jsonName] == gw.human());
@@ -925,7 +959,7 @@ public:
std::uint32_t const expectedAccountObjects{
static_cast<std::uint32_t>(std::size(expectedLedgerTypes))};
if (BEAST_EXPECT(acct_objs_is_size(resp, expectedAccountObjects)))
if (BEAST_EXPECT(acctObjsIsSize(resp, expectedAccountObjects)))
{
auto const& aobjs = resp[jss::result][jss::account_objects];
std::vector<std::string> gotLedgerTypes;
@@ -948,7 +982,7 @@ public:
params[jss::type] = jss::escrow;
auto resp = env.rpc("json", "account_objects", to_string(params));
if (BEAST_EXPECT(acct_objs_is_size(resp, 1u)))
if (BEAST_EXPECT(acctObjsIsSize(resp, 1u)))
{
auto const& aobjs = resp[jss::result][jss::account_objects];
BEAST_EXPECT(aobjs[0u]["LedgerEntryType"] == jss::Escrow);
@@ -969,7 +1003,7 @@ public:
auto expectObjects =
[&](Json::Value const& resp,
std::vector<std::string> const& types) -> bool {
if (!acct_objs_is_size(resp, types.size()))
if (!acctObjsIsSize(resp, types.size()))
return false;
std::vector<std::string> typesOut;
getTypes(resp, typesOut);
@@ -983,13 +1017,13 @@ public:
BEAST_EXPECT(lines[jss::lines].size() == 3);
// request AMM only, doesn't depend on the limit
BEAST_EXPECT(
acct_objs_is_size(acct_objs(amm.ammAccount(), jss::amm), 1));
acctObjsIsSize(acctObjs(amm.ammAccount(), jss::amm), 1));
// request first two objects
auto resp = acct_objs(amm.ammAccount(), std::nullopt, 2);
auto resp = acctObjs(amm.ammAccount(), std::nullopt, 2);
std::vector<std::string> typesOut;
getTypes(resp, typesOut);
// request next two objects
resp = acct_objs(
resp = acctObjs(
amm.ammAccount(),
std::nullopt,
10,
@@ -1003,7 +1037,7 @@ public:
jss::RippleState.c_str(),
jss::RippleState.c_str()}));
// filter by state: there are three trustlines
resp = acct_objs(amm.ammAccount(), jss::state, 10);
resp = acctObjs(amm.ammAccount(), jss::state, 10);
BEAST_EXPECT(expectObjects(
resp,
{jss::RippleState.c_str(),
@@ -1011,11 +1045,18 @@ public:
jss::RippleState.c_str()}));
// AMM account doesn't own offers
BEAST_EXPECT(
acct_objs_is_size(acct_objs(amm.ammAccount(), jss::offer), 0));
acctObjsIsSize(acctObjs(amm.ammAccount(), jss::offer), 0));
// gw account doesn't own AMM object
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::amm), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::amm), 0));
}
// still expect an invalid field 'type' error for the following types
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::amendments)));
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::directory)));
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::fee)));
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::hashes)));
BEAST_EXPECT(acctObjsTypeIsInvalid(acctObjs(gw, jss::NegativeUNL)));
// Run up the number of directory entries so gw has two
// directory nodes.
for (int d = 1'000'032; d >= 1'000'000; --d)
@@ -1025,11 +1066,158 @@ public:
}
// Verify that the non-returning types still don't return anything.
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::account), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::amendments), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::directory), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::fee), 0));
BEAST_EXPECT(acct_objs_is_size(acct_objs(gw, jss::hashes), 0));
BEAST_EXPECT(acctObjsIsSize(acctObjs(gw, jss::account), 0));
}
void
testNFTsMarker()
{
// A bug was found in the account_nfts method: it did not return
// invalidParams when given an unassociated NFT marker. This test
// covers both cases: a valid NFT marker and an unassociated NFT
// marker.
testcase("NFTsMarker");
using namespace jtx;
Env env(*this);
Account const bob{"bob"};
env.fund(XRP(10000), bob);
static constexpr unsigned nftsSize = 10;
for (unsigned i = 0; i < nftsSize; i++)
{
env(token::mint(bob, 0));
}
env.close();
// save the NFTokenIDs to use later
std::vector<Json::Value> tokenIDs;
{
Json::Value params;
params[jss::account] = bob.human();
params[jss::ledger_index] = "validated";
Json::Value const resp =
env.rpc("json", "account_nfts", to_string(params));
Json::Value const& nfts = resp[jss::result][jss::account_nfts];
for (Json::Value const& nft : nfts)
tokenIDs.push_back(nft["NFTokenID"]);
}
// This lambda checks that the account_nfts method returns the
// correct token information. lastIndex is the index of the tokenID
// used as the marker for the query.
auto compareNFTs = [&tokenIDs, &env, &bob](
unsigned const limit, unsigned const lastIndex) {
Json::Value params;
params[jss::account] = bob.human();
params[jss::limit] = limit;
params[jss::marker] = tokenIDs[lastIndex];
params[jss::ledger_index] = "validated";
Json::Value const resp =
env.rpc("json", "account_nfts", to_string(params));
if (resp[jss::result].isMember(jss::error))
return false;
Json::Value const& nfts = resp[jss::result][jss::account_nfts];
unsigned const nftsCount = tokenIDs.size() - lastIndex - 1 < limit
? tokenIDs.size() - lastIndex - 1
: limit;
if (nfts.size() != nftsCount)
return false;
for (unsigned i = 0; i < nftsCount; i++)
{
if (nfts[i]["NFTokenID"] != tokenIDs[lastIndex + 1 + i])
return false;
}
return true;
};
// test a valid marker which is equal to the third tokenID
BEAST_EXPECT(compareNFTs(4, 2));
// test a valid marker which is equal to the 8th tokenID
BEAST_EXPECT(compareNFTs(4, 7));
// lambda that holds common code for invalid cases.
auto testInvalidMarker = [&env, &bob](
auto marker, char const* errorMessage) {
Json::Value params;
params[jss::account] = bob.human();
params[jss::limit] = 4;
params[jss::ledger_index] = jss::validated;
params[jss::marker] = marker;
Json::Value const resp =
env.rpc("json", "account_nfts", to_string(params));
return resp[jss::result][jss::error_message] == errorMessage;
};
// test an invalid marker that is not a string
BEAST_EXPECT(
testInvalidMarker(17, "Invalid field \'marker\', not string."));
// test an invalid marker that has a non-hex character
BEAST_EXPECT(testInvalidMarker(
"00000000F51DFC2A09D62CBBA1DFBDD4691DAC96AD98B900000000000000000G",
"Invalid field \'marker\'."));
// This lambda creates a fake marker from the given taxon and
// sequence, so we can test unassociated markers below.
auto createFakeNFTMarker = [](AccountID const& issuer,
std::uint32_t taxon,
std::uint32_t tokenSeq,
std::uint16_t flags = 0,
std::uint16_t fee = 0) {
// the marker has the exact same format as an NFTokenID
return to_string(NFTokenMint::createNFTokenID(
flags, fee, issuer, nft::toTaxon(taxon), tokenSeq));
};
// test an unassociated marker which does not exist in the NFTokenIDs
BEAST_EXPECT(testInvalidMarker(
createFakeNFTMarker(bob.id(), 0x00000000, 0x00000000),
"Invalid field \'marker\'."));
// test an unassociated marker which exceeds the maximum value of the
// existing NFTokenID
BEAST_EXPECT(testInvalidMarker(
createFakeNFTMarker(bob.id(), 0xFFFFFFFF, 0xFFFFFFFF),
"Invalid field \'marker\'."));
}
void
testAccountNFTs()
{
testcase("account_nfts");
using namespace jtx;
Env env(*this);
// test validation
{
auto testInvalidAccountParam = [&](auto const& param) {
Json::Value params;
params[jss::account] = param;
auto jrr = env.rpc(
"json", "account_nfts", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'account'.");
};
testInvalidAccountParam(1);
testInvalidAccountParam(1.1);
testInvalidAccountParam(true);
testInvalidAccountParam(Json::Value(Json::nullValue));
testInvalidAccountParam(Json::Value(Json::objectValue));
testInvalidAccountParam(Json::Value(Json::arrayValue));
}
}
void
@@ -1039,10 +1227,12 @@ public:
testUnsteppedThenStepped();
testUnsteppedThenSteppedWithNFTs();
testObjectTypes();
testNFTsMarker();
testAccountNFTs();
}
};
BEAST_DEFINE_TESTSUITE(AccountObjects, app, ripple);
BEAST_DEFINE_TESTSUITE(AccountObjects, rpc, ripple);
} // namespace test
} // namespace ripple


@@ -37,6 +37,8 @@ public:
void
testNonAdminMinLimit()
{
testcase("Non-Admin Min Limit");
using namespace jtx;
Env env{*this, envconfig(no_admin)};
Account const gw("G1");
@@ -81,6 +83,9 @@ public:
void
testSequential(bool asAdmin)
{
testcase(
std::string("Sequential - ") + (asAdmin ? "admin" : "non-admin"));
using namespace jtx;
Env env{*this, asAdmin ? envconfig() : envconfig(no_admin)};
Account const gw("G1");
@@ -215,6 +220,8 @@ public:
void
testBadInput()
{
testcase("Bad input");
using namespace jtx;
Env env(*this);
Account const gw("G1");
@@ -233,6 +240,26 @@ public:
BEAST_EXPECT(jrr[jss::error_message] == "Syntax error.");
}
{
// test account non-string
auto testInvalidAccountParam = [&](auto const& param) {
Json::Value params;
params[jss::account] = param;
auto jrr = env.rpc(
"json", "account_offers", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'account'.");
};
testInvalidAccountParam(1);
testInvalidAccountParam(1.1);
testInvalidAccountParam(true);
testInvalidAccountParam(Json::Value(Json::nullValue));
testInvalidAccountParam(Json::Value(Json::objectValue));
testInvalidAccountParam(Json::Value(Json::arrayValue));
}
{
// empty string account
Json::Value jvParams;
@@ -282,7 +309,9 @@ public:
jvParams.toStyledString())[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(jrr[jss::status] == "error");
BEAST_EXPECT(jrr[jss::error_message] == "Invalid parameters.");
BEAST_EXPECTS(
jrr[jss::error_message] == "Invalid field 'marker'.",
jrr.toStyledString());
}
{
@@ -326,7 +355,7 @@ public:
}
};
BEAST_DEFINE_TESTSUITE(AccountOffers, app, ripple);
BEAST_DEFINE_TESTSUITE(AccountOffers, rpc, ripple);
} // namespace test
} // namespace ripple


@@ -109,6 +109,7 @@ class AccountTx_test : public beast::unit_test::suite
void
testParameters(unsigned int apiVersion)
{
testcase("Parameters APIv" + std::to_string(apiVersion));
using namespace test::jtx;
Env env(*this);
@@ -353,6 +354,25 @@ class AccountTx_test : public beast::unit_test::suite
env.rpc("json", "account_tx", to_string(p)),
rpcLGR_IDX_MALFORMED));
}
// test account non-string
{
auto testInvalidAccountParam = [&](auto const& param) {
Json::Value params;
params[jss::account] = param;
auto jrr = env.rpc(
"json", "account_tx", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'account'.");
};
testInvalidAccountParam(1);
testInvalidAccountParam(1.1);
testInvalidAccountParam(true);
testInvalidAccountParam(Json::Value(Json::nullValue));
testInvalidAccountParam(Json::Value(Json::objectValue));
testInvalidAccountParam(Json::Value(Json::arrayValue));
}
// test binary and forward for bool/non bool values
{
Json::Value p{jParms};
@@ -388,6 +408,8 @@ class AccountTx_test : public beast::unit_test::suite
void
testContents()
{
testcase("Contents");
// Get results for all transaction types that can be associated
// with an account. Start by generating all transaction types.
using namespace test::jtx;
@@ -600,6 +622,8 @@ class AccountTx_test : public beast::unit_test::suite
void
testAccountDelete()
{
testcase("AccountDelete");
// Verify that if an account is resurrected then the account_tx RPC
// command still recovers all transactions on that account before
// and after resurrection.
@@ -740,7 +764,7 @@ public:
testAccountDelete();
}
};
BEAST_DEFINE_TESTSUITE(AccountTx, app, ripple);
BEAST_DEFINE_TESTSUITE(AccountTx, rpc, ripple);
} // namespace test
} // namespace ripple


@@ -229,9 +229,28 @@ class Feature_test : public beast::unit_test::suite
using namespace test::jtx;
Env env{*this};
auto jrr = env.rpc("feature", "AllTheThings")[jss::result];
BEAST_EXPECT(jrr[jss::error] == "badFeature");
BEAST_EXPECT(jrr[jss::error_message] == "Feature unknown or invalid.");
auto testInvalidParam = [&](auto const& param) {
Json::Value params;
params[jss::feature] = param;
auto jrr =
env.rpc("json", "feature", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(jrr[jss::error_message] == "Invalid parameters.");
};
testInvalidParam(1);
testInvalidParam(1.1);
testInvalidParam(true);
testInvalidParam(Json::Value(Json::nullValue));
testInvalidParam(Json::Value(Json::objectValue));
testInvalidParam(Json::Value(Json::arrayValue));
{
auto jrr = env.rpc("feature", "AllTheThings")[jss::result];
BEAST_EXPECT(jrr[jss::error] == "badFeature");
BEAST_EXPECT(
jrr[jss::error_message] == "Feature unknown or invalid.");
}
}
void


@@ -300,201 +300,214 @@ public:
{
// Put a bunch of different LedgerEntryTypes into a ledger
using namespace test::jtx;
using namespace std::chrono;
Env env{*this, envconfig(validator, "")};
Account const gw{"gateway"};
auto const USD = gw["USD"];
env.fund(XRP(100000), gw);
auto makeRequest = [&env](Json::StaticString const& type) {
Json::Value jvParams;
jvParams[jss::ledger_index] = "current";
jvParams[jss::type] = type;
return env.rpc(
"json",
"ledger_data",
boost::lexical_cast<std::string>(jvParams))[jss::result];
};
// Assert that state is an empty array.
for (auto const& type :
{jss::amendments,
jss::check,
jss::directory,
jss::offer,
jss::signer_list,
jss::state,
jss::ticket,
jss::escrow,
jss::payment_channel,
jss::deposit_preauth})
// Make sure fixInnerObjTemplate2 doesn't break amendments.
for (FeatureBitset const& features :
{supported_amendments() - fixInnerObjTemplate2,
supported_amendments() | fixInnerObjTemplate2})
{
auto const jrr = makeRequest(type);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 0));
}
using namespace std::chrono;
Env env{*this, envconfig(validator, ""), features};
int const num_accounts = 10;
Account const gw{"gateway"};
auto const USD = gw["USD"];
env.fund(XRP(100000), gw);
for (auto i = 0; i < num_accounts; i++)
{
Account const bob{std::string("bob") + std::to_string(i)};
env.fund(XRP(1000), bob);
}
env(offer(Account{"bob0"}, USD(100), XRP(100)));
env.trust(Account{"bob2"}["USD"](100), Account{"bob3"});
auto makeRequest = [&env](Json::StaticString const& type) {
Json::Value jvParams;
jvParams[jss::ledger_index] = "current";
jvParams[jss::type] = type;
return env.rpc(
"json",
"ledger_data",
boost::lexical_cast<std::string>(jvParams))[jss::result];
};
auto majorities = getMajorityAmendments(*env.closed());
for (int i = 0; i <= 256; ++i)
{
// Assert that state is an empty array.
for (auto const& type :
{jss::amendments,
jss::check,
jss::directory,
jss::offer,
jss::signer_list,
jss::state,
jss::ticket,
jss::escrow,
jss::payment_channel,
jss::deposit_preauth})
{
auto const jrr = makeRequest(type);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 0));
}
int const num_accounts = 10;
for (auto i = 0; i < num_accounts; i++)
{
Account const bob{std::string("bob") + std::to_string(i)};
env.fund(XRP(1000), bob);
}
env(offer(Account{"bob0"}, USD(100), XRP(100)));
env.trust(Account{"bob2"}["USD"](100), Account{"bob3"});
auto majorities = getMajorityAmendments(*env.closed());
for (int i = 0; i <= 256; ++i)
{
env.close();
majorities = getMajorityAmendments(*env.closed());
if (!majorities.empty())
break;
}
env(signers(
Account{"bob0"},
1,
{{Account{"bob1"}, 1}, {Account{"bob2"}, 1}}));
env(ticket::create(env.master, 1));
{
Json::Value jv;
jv[jss::TransactionType] = jss::EscrowCreate;
jv[jss::Flags] = tfUniversal;
jv[jss::Account] = Account{"bob5"}.human();
jv[jss::Destination] = Account{"bob6"}.human();
jv[jss::Amount] = XRP(50).value().getJson(JsonOptions::none);
jv[sfFinishAfter.fieldName] =
NetClock::time_point{env.now() + 10s}
.time_since_epoch()
.count();
env(jv);
}
{
Json::Value jv;
jv[jss::TransactionType] = jss::PaymentChannelCreate;
jv[jss::Flags] = tfUniversal;
jv[jss::Account] = Account{"bob6"}.human();
jv[jss::Destination] = Account{"bob7"}.human();
jv[jss::Amount] = XRP(100).value().getJson(JsonOptions::none);
jv[jss::SettleDelay] = NetClock::duration{10s}.count();
jv[sfPublicKey.fieldName] =
strHex(Account{"bob6"}.pk().slice());
jv[sfCancelAfter.fieldName] =
NetClock::time_point{env.now() + 300s}
.time_since_epoch()
.count();
env(jv);
}
env(check::create("bob6", "bob7", XRP(100)));
// bob9 DepositPreauths bob4 and bob8.
env(deposit::auth(Account{"bob9"}, Account{"bob4"}));
env(deposit::auth(Account{"bob9"}, Account{"bob8"}));
env.close();
majorities = getMajorityAmendments(*env.closed());
if (!majorities.empty())
break;
}
env(signers(
Account{"bob0"}, 1, {{Account{"bob1"}, 1}, {Account{"bob2"}, 1}}));
env(ticket::create(env.master, 1));
{
Json::Value jv;
jv[jss::TransactionType] = jss::EscrowCreate;
jv[jss::Flags] = tfUniversal;
jv[jss::Account] = Account{"bob5"}.human();
jv[jss::Destination] = Account{"bob6"}.human();
jv[jss::Amount] = XRP(50).value().getJson(JsonOptions::none);
jv[sfFinishAfter.fieldName] = NetClock::time_point{env.now() + 10s}
.time_since_epoch()
.count();
env(jv);
}
// Now fetch each type
{
Json::Value jv;
jv[jss::TransactionType] = jss::PaymentChannelCreate;
jv[jss::Flags] = tfUniversal;
jv[jss::Account] = Account{"bob6"}.human();
jv[jss::Destination] = Account{"bob7"}.human();
jv[jss::Amount] = XRP(100).value().getJson(JsonOptions::none);
jv[jss::SettleDelay] = NetClock::duration{10s}.count();
jv[sfPublicKey.fieldName] = strHex(Account{"bob6"}.pk().slice());
jv[sfCancelAfter.fieldName] = NetClock::time_point{env.now() + 300s}
.time_since_epoch()
.count();
env(jv);
}
{ // jvParams[jss::type] = "account";
auto const jrr = makeRequest(jss::account);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 12));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::AccountRoot);
}
env(check::create("bob6", "bob7", XRP(100)));
{ // jvParams[jss::type] = "amendments";
auto const jrr = makeRequest(jss::amendments);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::Amendments);
}
// bob9 DepositPreauths bob4 and bob8.
env(deposit::auth(Account{"bob9"}, Account{"bob4"}));
env(deposit::auth(Account{"bob9"}, Account{"bob8"}));
env.close();
{ // jvParams[jss::type] = "check";
auto const jrr = makeRequest(jss::check);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::Check);
}
// Now fetch each type
{ // jvParams[jss::type] = "directory";
auto const jrr = makeRequest(jss::directory);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 9));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::DirectoryNode);
}
{ // jvParams[jss::type] = "account";
auto const jrr = makeRequest(jss::account);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 12));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::AccountRoot);
}
{ // jvParams[jss::type] = "fee";
auto const jrr = makeRequest(jss::fee);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::FeeSettings);
}
{ // jvParams[jss::type] = "amendments";
auto const jrr = makeRequest(jss::amendments);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::Amendments);
}
{ // jvParams[jss::type] = "hashes";
auto const jrr = makeRequest(jss::hashes);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 2));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::LedgerHashes);
}
{ // jvParams[jss::type] = "check";
auto const jrr = makeRequest(jss::check);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::Check);
}
{ // jvParams[jss::type] = "offer";
auto const jrr = makeRequest(jss::offer);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::Offer);
}
{ // jvParams[jss::type] = "directory";
auto const jrr = makeRequest(jss::directory);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 9));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::DirectoryNode);
}
{ // jvParams[jss::type] = "signer_list";
auto const jrr = makeRequest(jss::signer_list);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::SignerList);
}
{ // jvParams[jss::type] = "fee";
auto const jrr = makeRequest(jss::fee);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::FeeSettings);
}
{ // jvParams[jss::type] = "state";
auto const jrr = makeRequest(jss::state);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::RippleState);
}
{ // jvParams[jss::type] = "hashes";
auto const jrr = makeRequest(jss::hashes);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 2));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::LedgerHashes);
}
{ // jvParams[jss::type] = "ticket";
auto const jrr = makeRequest(jss::ticket);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::Ticket);
}
{ // jvParams[jss::type] = "escrow";
auto const jrr = makeRequest(jss::escrow);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::Escrow);
}
{ // jvParams[jss::type] = "payment_channel";
auto const jrr = makeRequest(jss::payment_channel);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 1));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::PayChannel);
}
{ // jvParams[jss::type] = "deposit_preauth";
auto const jrr = makeRequest(jss::deposit_preauth);
BEAST_EXPECT(checkArraySize(jrr[jss::state], 2));
for (auto const& j : jrr[jss::state])
BEAST_EXPECT(j["LedgerEntryType"] == jss::DepositPreauth);
}
{ // jvParams[jss::type] = "misspelling";
Json::Value jvParams;
jvParams[jss::ledger_index] = "current";
jvParams[jss::type] = "misspelling";
auto const jrr = env.rpc(
"json",
"ledger_data",
boost::lexical_cast<std::string>(jvParams))[jss::result];
BEAST_EXPECT(jrr.isMember("error"));
BEAST_EXPECT(jrr["error"] == "invalidParams");
BEAST_EXPECT(jrr["error_message"] == "Invalid field 'type'.");
}
}

View File

@@ -64,6 +64,27 @@ class NoRippleCheck_test : public beast::unit_test::suite
BEAST_EXPECT(result[jss::error_message] == "Missing field 'role'.");
}
// test non-string account params
{
auto testInvalidAccountParam = [&](auto const& param) {
Json::Value params;
params[jss::account] = param;
params[jss::role] = "user";
auto jrr = env.rpc(
"json", "noripple_check", to_string(params))[jss::result];
BEAST_EXPECT(jrr[jss::error] == "invalidParams");
BEAST_EXPECT(
jrr[jss::error_message] == "Invalid field 'account'.");
};
testInvalidAccountParam(1);
testInvalidAccountParam(1.1);
testInvalidAccountParam(true);
testInvalidAccountParam(Json::Value(Json::nullValue));
testInvalidAccountParam(Json::Value(Json::objectValue));
testInvalidAccountParam(Json::Value(Json::arrayValue));
}
{ // invalid role field
Json::Value params;
params[jss::account] = alice.human();
@@ -369,12 +390,12 @@ public:
}
};
BEAST_DEFINE_TESTSUITE(NoRippleCheck, app, ripple);
BEAST_DEFINE_TESTSUITE(NoRippleCheck, rpc, ripple);
// These tests that deal with limit amounts are slow because of the
// offer/account setup, so making them manual -- the additional coverage
// provided by them is minimal
BEAST_DEFINE_TESTSUITE_MANUAL_PRIO(NoRippleCheckLimits, app, ripple, 1);
BEAST_DEFINE_TESTSUITE_MANUAL_PRIO(NoRippleCheckLimits, rpc, ripple, 1);
} // namespace ripple

View File

@@ -27,6 +27,7 @@
#include <xrpl/protocol/jss.h>
#include <xrpl/protocol/serialize.h>
#include <cctype>
#include <optional>
#include <tuple>
@@ -671,6 +672,47 @@ class Transaction_test : public beast::unit_test::suite
BEAST_EXPECT(jrr[jss::hash]);
}
// test querying with mixed case ctid
{
Env env{*this, makeNetworkConfig(11111)};
std::uint32_t const netID = env.app().config().NETWORK_ID;
Account const alice = Account("alice");
Account const bob = Account("bob");
std::uint32_t const startLegSeq = env.current()->info().seq;
env.fund(XRP(10000), alice, bob);
env(pay(alice, bob, XRP(10)));
env.close();
std::string const ctid = *RPC::encodeCTID(startLegSeq, 0, netID);
auto isUpper = [](char c) { return std::isupper(c) != 0; };
// Verify that there are at least two upper case letters in ctid and
// test a mixed case
if (BEAST_EXPECT(
std::count_if(ctid.begin(), ctid.end(), isUpper) > 1))
{
// Change the first upper case letter to lower case.
std::string mixedCase = ctid;
{
auto const iter = std::find_if(
mixedCase.begin(), mixedCase.end(), isUpper);
*iter = std::tolower(*iter);
}
BEAST_EXPECT(ctid != mixedCase);
Json::Value jsonTx;
jsonTx[jss::binary] = false;
jsonTx[jss::ctid] = mixedCase;
jsonTx[jss::id] = 1;
Json::Value const jrr =
env.rpc("json", "tx", to_string(jsonTx))[jss::result];
BEAST_EXPECT(jrr[jss::ctid] == ctid);
BEAST_EXPECT(jrr[jss::hash]);
}
}
// test that if the network ID is 65535 the ctid is not in the response
{
Env env{*this, makeNetworkConfig(65535)};

View File

@@ -793,7 +793,7 @@ Ledger::updateNegativeUNL()
if (hasToDisable)
{
newNUnl.emplace_back(sfDisabledValidator);
newNUnl.push_back(STObject::makeInnerObject(sfDisabledValidator));
newNUnl.back().setFieldVL(
sfPublicKey, sle->getFieldVL(sfValidatorToDisable));
newNUnl.back().setFieldU32(sfFirstLedgerSequence, seq());

View File

@@ -277,9 +277,9 @@ FeeVoteImpl::doVoting(
}
// choose our positions
// TODO: Use structured binding once LLVM issue
// https://github.com/llvm/llvm-project/issues/48582
// is fixed.
// TODO: Use structured binding once LLVM 16 is the minimum supported
// version. See also: https://github.com/llvm/llvm-project/issues/48582
// https://github.com/llvm/llvm-project/commit/127bf44385424891eb04cff8e52d3f157fc2cb7c
auto const baseFee = baseFeeVote.getVotes();
auto const baseReserve = baseReserveVote.getVotes();
auto const incReserve = incReserveVote.getVotes();

View File

@@ -312,7 +312,7 @@ initializeFeeAuctionVote(
auto const& rules = view.rules();
// AMM creator gets the voting slot.
STArray voteSlots;
STObject voteEntry = STObject::makeInnerObject(sfVoteEntry, rules);
STObject voteEntry = STObject::makeInnerObject(sfVoteEntry);
if (tfee != 0)
voteEntry.setFieldU16(sfTradingFee, tfee);
voteEntry.setFieldU32(sfVoteWeight, VOTE_WEIGHT_SCALE_FACTOR);
@@ -325,7 +325,7 @@ initializeFeeAuctionVote(
if (rules.enabled(fixInnerObjTemplate) &&
!ammSle->isFieldPresent(sfAuctionSlot))
{
STObject auctionSlot = STObject::makeInnerObject(sfAuctionSlot, rules);
STObject auctionSlot = STObject::makeInnerObject(sfAuctionSlot);
ammSle->set(std::move(auctionSlot));
}
STObject& auctionSlot = ammSle->peekFieldObject(sfAuctionSlot);

View File

@@ -23,7 +23,6 @@
#include <xrpld/app/paths/detail/FlatSets.h>
#include <xrpld/app/paths/detail/Steps.h>
#include <xrpld/app/tx/detail/OfferStream.h>
#include <xrpld/ledger/Directory.h>
#include <xrpld/ledger/PaymentSandbox.h>
#include <xrpl/basics/IOUAmount.h>
#include <xrpl/basics/Log.h>

View File

@@ -104,7 +104,7 @@ applyVote(
Number den{0};
// Account already has vote entry
bool foundAccount = false;
auto const& rules = ctx_.view().rules();
// Iterate over the current vote entries and update each entry
// per current total tokens balance and each LP tokens balance.
// Find the entry with the least tokens and whether the account
@@ -120,7 +120,7 @@ applyVote(
continue;
}
auto feeVal = entry[sfTradingFee];
STObject newEntry = STObject::makeInnerObject(sfVoteEntry, rules);
STObject newEntry = STObject::makeInnerObject(sfVoteEntry);
// The account already has the vote entry.
if (account == account_)
{
@@ -159,7 +159,7 @@ applyVote(
{
auto update = [&](std::optional<std::uint8_t> const& minPos =
std::nullopt) {
STObject newEntry = STObject::makeInnerObject(sfVoteEntry, rules);
STObject newEntry = STObject::makeInnerObject(sfVoteEntry);
if (feeNew != 0)
newEntry.setFieldU16(sfTradingFee, feeNew);
newEntry.setFieldU32(

View File

@@ -300,11 +300,11 @@ Change::applyAmendment()
if (gotMajority)
{
// This amendment now has a majority
newMajorities.push_back(STObject(sfMajority));
newMajorities.push_back(STObject::makeInnerObject(sfMajority));
auto& entry = newMajorities.back();
entry.emplace_back(STUInt256(sfAmendment, amendment));
entry.emplace_back(STUInt32(
sfCloseTime, view().parentCloseTime().time_since_epoch().count()));
entry[sfAmendment] = amendment;
entry[sfCloseTime] =
view().parentCloseTime().time_since_epoch().count();
if (!ctx_.app.getAmendmentTable().isSupported(amendment))
{

View File

@@ -358,6 +358,91 @@ AccountRootsNotDeleted::finalize(
//------------------------------------------------------------------------------
void
AccountRootsDeletedClean::visitEntry(
bool isDelete,
std::shared_ptr<SLE const> const& before,
std::shared_ptr<SLE const> const&)
{
if (isDelete && before && before->getType() == ltACCOUNT_ROOT)
accountsDeleted_.emplace_back(before);
}
bool
AccountRootsDeletedClean::finalize(
STTx const& tx,
TER const result,
XRPAmount const,
ReadView const& view,
beast::Journal const& j)
{
// Always check for objects in the ledger, but to prevent differing
// transaction processing results, however unlikely, only fail if the
// feature is enabled. Whether or not the feature is enabled, a
// fatal-level message will be logged.
bool const enforce = view.rules().enabled(featureInvariantsV1_1);
auto const objectExists = [&view, enforce, &j](auto const& keylet) {
if (auto const sle = view.read(keylet))
{
// Finding the object is bad
auto const typeName = [&sle]() {
auto item =
LedgerFormats::getInstance().findByType(sle->getType());
if (item != nullptr)
return item->getName();
return std::to_string(sle->getType());
}();
JLOG(j.fatal())
<< "Invariant failed: account deletion left behind a "
<< typeName << " object";
(void)enforce;
assert(enforce);
return true;
}
return false;
};
for (auto const& accountSLE : accountsDeleted_)
{
auto const accountID = accountSLE->getAccountID(sfAccount);
// Simple types
for (auto const& [keyletfunc, _, __] : directAccountKeylets)
{
if (objectExists(std::invoke(keyletfunc, accountID)) && enforce)
return false;
}
{
// NFT pages. nftpage_min and nftpage_max were already explicitly
// checked above as entries in directAccountKeylets. This uses
// view.succ() to check for any NFT pages in between the two
// endpoints.
Keylet const first = keylet::nftpage_min(accountID);
Keylet const last = keylet::nftpage_max(accountID);
std::optional<uint256> key = view.succ(first.key, last.key.next());
// current page
if (key && objectExists(Keylet{ltNFTOKEN_PAGE, *key}) && enforce)
return false;
}
// Keys directly stored in the AccountRoot object
if (auto const ammKey = accountSLE->at(~sfAMMID))
{
if (objectExists(keylet::amm(*ammKey)) && enforce)
return false;
}
}
return true;
}
//------------------------------------------------------------------------------
void
LedgerEntryTypesMatch::visitEntry(
bool,

View File

@@ -164,6 +164,36 @@ public:
beast::Journal const&);
};
/**
* @brief Invariant: a deleted account must not have any objects left
*
* We iterate all deleted account roots, and ensure that there are no
* objects left that are directly accessible with that account's ID.
*
* There should only be one deleted account, but that's checked by
* AccountRootsNotDeleted. This invariant will handle multiple deleted account
* roots without a problem.
*/
class AccountRootsDeletedClean
{
std::vector<std::shared_ptr<SLE const>> accountsDeleted_;
public:
void
visitEntry(
bool,
std::shared_ptr<SLE const> const&,
std::shared_ptr<SLE const> const&);
bool
finalize(
STTx const&,
TER const,
XRPAmount const,
ReadView const&,
beast::Journal const&);
};
/**
* @brief Invariant: An account XRP balance must be in XRP and take a value
* between 0 and INITIAL_XRP drops, inclusive.
@@ -423,6 +453,7 @@ public:
using InvariantChecks = std::tuple<
TransactionFeeCheck,
AccountRootsNotDeleted,
AccountRootsDeletedClean,
LedgerEntryTypesMatch,
XRPBalanceChecks,
XRPNotCreated,

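The `InvariantChecks` tuple above gains a new entry with this change. A simplified standalone sketch of how such a tuple of checkers can be driven generically (the checker classes here are trivial stand-ins, not rippled's real invariant classes):

```cpp
#include <tuple>

// Trivial stand-ins for invariant checkers: each exposes finalize(),
// and every checker must pass for the transaction to be accepted.
struct AlwaysHolds { bool finalize() const { return true; } };
struct NeverHolds  { bool finalize() const { return false; } };

template <class... Checks>
bool allInvariantsHold(std::tuple<Checks...> const& checks)
{
    // Visit every checker; a single failure vetoes the transaction.
    return std::apply(
        [](auto const&... c) { return (c.finalize() && ...); }, checks);
}
```

Adding a checker to the tuple automatically includes it in the fold, which is why registering `AccountRootsDeletedClean` is a one-line change.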
View File

@@ -19,7 +19,6 @@
#include <xrpld/app/tx/detail/NFTokenBurn.h>
#include <xrpld/app/tx/detail/NFTokenUtils.h>
#include <xrpld/ledger/Directory.h>
#include <xrpld/ledger/View.h>
#include <xrpl/protocol/Feature.h>
#include <xrpl/protocol/Protocol.h>

View File

@@ -22,6 +22,7 @@
#include <xrpld/app/tx/detail/NFTokenUtils.h>
#include <xrpld/app/tx/detail/Transactor.h>
#include <xrpl/protocol/nft.h>
namespace ripple {

View File

@@ -18,7 +18,7 @@
//==============================================================================
#include <xrpld/app/tx/detail/NFTokenUtils.h>
#include <xrpld/ledger/Directory.h>
#include <xrpld/ledger/Dir.h>
#include <xrpld/ledger/View.h>
#include <xrpl/basics/algorithm.h>
#include <xrpl/protocol/Feature.h>
@@ -191,6 +191,7 @@ getPageForToken(
: carr[0].getFieldH256(sfNFTokenID);
auto np = std::make_shared<SLE>(keylet::nftpage(base, tokenIDForNewPage));
assert(np->key() > base.key);
np->setFieldArray(sfNFTokens, narr);
np->setFieldH256(sfNextPageMin, cp->key());

View File

@@ -416,11 +416,11 @@ SetSignerList::writeSignersToSLE(
STArray toLedger(signers_.size());
for (auto const& entry : signers_)
{
toLedger.emplace_back(sfSignerEntry);
toLedger.push_back(STObject::makeInnerObject(sfSignerEntry));
STObject& obj = toLedger.back();
obj.reserve(2);
obj.setAccountID(sfAccount, entry.account);
obj.setFieldU16(sfSignerWeight, entry.weight);
obj[sfAccount] = entry.account;
obj[sfSignerWeight] = entry.weight;
// This is a defensive check to make absolutely sure we will never write
// a tag into the ledger while featureExpandedSignerList is not enabled

View File

@@ -30,7 +30,7 @@ Expected<std::vector<SignerEntries::SignerEntry>, NotTEC>
SignerEntries::deserialize(
STObject const& obj,
beast::Journal journal,
std::string const& annotation)
std::string_view annotation)
{
std::pair<std::vector<SignerEntry>, NotTEC> s;

View File

@@ -27,18 +27,27 @@
#include <xrpl/protocol/STTx.h> // STTx::maxMultiSigners
#include <xrpl/protocol/TER.h> // temMALFORMED
#include <xrpl/protocol/UintTypes.h> // AccountID
#include <optional>
#include <string_view>
namespace ripple {
// Forward declarations
class STObject;
// Support for SignerEntries that is needed by a few Transactors
// Support for SignerEntries that is needed by a few Transactors.
//
// SignerEntries is represented as a std::vector<SignerEntries::SignerEntry>.
// There is no direct constructor for SignerEntries.
//
// o A std::vector<SignerEntries::SignerEntry> is a SignerEntries.
// o More commonly, SignerEntries are extracted from an STObject by
// calling SignerEntries::deserialize().
class SignerEntries
{
public:
explicit SignerEntries() = default;
explicit SignerEntries() = delete;
struct SignerEntry
{
@@ -69,11 +78,15 @@ public:
};
// Deserialize a SignerEntries array from the network or from the ledger.
//
// obj Contains a SignerEntries field that is an STArray.
// journal For reporting error conditions.
// annotation Source of SignerEntries, like "ledger" or "transaction".
static Expected<std::vector<SignerEntry>, NotTEC>
deserialize(
STObject const& obj,
beast::Journal journal,
std::string const& annotation);
std::string_view annotation);
};
} // namespace ripple

View File

@@ -25,6 +25,18 @@
namespace ripple {
/** A class that simplifies iterating ledger directory pages
The Dir class provides a forward iterator for walking through
the uint256 values contained in ledger directories.
The Dir class also allows accelerated directory walking by
stepping directly from one page to the next using the next_page()
member function.
As of July 2024, the Dir class is only being used with NFTokenOffer
directories and for unit tests.
*/
class Dir
{
private:

View File

@@ -17,7 +17,7 @@
*/
//==============================================================================
#include <xrpld/ledger/Directory.h>
#include <xrpld/ledger/Dir.h>
namespace ripple {

View File

@@ -63,7 +63,7 @@ decodeCTID(const T ctid) noexcept
if (ctidString.length() != 16)
return {};
if (!boost::regex_match(ctidString, boost::regex("^[0-9A-F]+$")))
if (!boost::regex_match(ctidString, boost::regex("^[0-9A-Fa-f]+$")))
return {};
ctidValue = std::stoull(ctidString, nullptr, 16);
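The regex change above makes CTID decoding case-insensitive. A standalone sketch of the resulting behavior (the function name and signature here are illustrative, not rippled's actual API):

```cpp
#include <cstdint>
#include <optional>
#include <regex>
#include <string>

// A 16-character CTID is now accepted in upper, lower, or mixed case.
std::optional<std::uint64_t>
decodeCtidString(std::string const& s)
{
    if (s.size() != 16)
        return std::nullopt;

    // The pattern now permits a-f alongside A-F.
    static std::regex const hex("^[0-9A-Fa-f]+$");
    if (!std::regex_match(s, hex))
        return std::nullopt;

    // std::stoull parses hex digits case-insensitively, so no
    // normalization of the string is needed before conversion.
    return std::stoull(s, nullptr, 16);
}
```

This is what allows the mixed-case `tx` lookup test earlier in the diff to succeed.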

View File

@@ -986,6 +986,22 @@ chooseLedgerEntryType(Json::Value const& params)
return result;
}
bool
isAccountObjectsValidType(LedgerEntryType const& type)
{
switch (type)
{
case LedgerEntryType::ltAMENDMENTS:
case LedgerEntryType::ltDIR_NODE:
case LedgerEntryType::ltFEE_SETTINGS:
case LedgerEntryType::ltLEDGER_HASHES:
case LedgerEntryType::ltNEGATIVE_UNL:
return false;
default:
return true;
}
}
beast::SemanticVersion const firstVersion("1.0.0");
beast::SemanticVersion const goodVersion("1.0.0");
beast::SemanticVersion const lastVersion("1.0.0");
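The filter above can be illustrated with a minimal standalone sketch; the enum here is a hypothetical stand-in for rippled's `LedgerEntryType`:

```cpp
// Hypothetical stand-in for rippled's LedgerEntryType: the five
// ledger-global types below can never appear in an account's owner
// directory, so account_objects rejects them as a `type` filter.
enum class EntryType
{
    Amendments,
    DirectoryNode,
    FeeSettings,
    LedgerHashes,
    NegativeUNL,
    Offer,
    Ticket
};

bool isAccountOwnable(EntryType t)
{
    switch (t)
    {
        case EntryType::Amendments:
        case EntryType::DirectoryNode:
        case EntryType::FeeSettings:
        case EntryType::LedgerHashes:
        case EntryType::NegativeUNL:
            return false;
        default:
            return true;
    }
}
```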

View File

@@ -232,6 +232,15 @@ setVersion(Object& parent, unsigned int apiVersion, bool betaEnabled)
std::pair<RPC::Status, LedgerEntryType>
chooseLedgerEntryType(Json::Value const& params);
/**
* Check if the type is a valid filtering type for the account_objects method.
*
* Since Amendments, DirectoryNode, FeeSettings, LedgerHashes, and NegativeUNL
* objects cannot be owned by an account, this function returns false for
* those types.
*/
bool
isAccountObjectsValidType(LedgerEntryType const& type);
/**
* Retrieve the api version number from the json value
*

View File

@@ -1051,7 +1051,7 @@ transactionSignFor(
auto& sttx = preprocResult.second;
{
// Make the signer object that we'll inject.
STObject signer(sfSigner);
STObject signer = STObject::makeInnerObject(sfSigner);
signer[sfAccount] = *signerAccountID;
signer.setFieldVL(sfTxnSignature, signForParams.getSignature());
signer.setFieldVL(

View File

@@ -71,6 +71,9 @@ doAccountChannels(RPC::JsonContext& context)
if (!params.isMember(jss::account))
return RPC::missing_field_error(jss::account);
if (!params[jss::account].isString())
return RPC::invalid_field_error(jss::account);
std::shared_ptr<ReadView const> ledger;
auto result = RPC::lookupLedger(ledger, context);
if (!ledger)

View File

@@ -33,19 +33,29 @@ doAccountCurrencies(RPC::JsonContext& context)
{
auto& params = context.params;
if (!(params.isMember(jss::account) || params.isMember(jss::ident)))
return RPC::missing_field_error(jss::account);
std::string strIdent;
if (params.isMember(jss::account))
{
if (!params[jss::account].isString())
return RPC::invalid_field_error(jss::account);
strIdent = params[jss::account].asString();
}
else if (params.isMember(jss::ident))
{
if (!params[jss::ident].isString())
return RPC::invalid_field_error(jss::ident);
strIdent = params[jss::ident].asString();
}
// Get the current ledger
std::shared_ptr<ReadView const> ledger;
auto result = RPC::lookupLedger(ledger, context);
if (!ledger)
return result;
if (!(params.isMember(jss::account) || params.isMember(jss::ident)))
return RPC::missing_field_error(jss::account);
std::string const strIdent(
params.isMember(jss::account) ? params[jss::account].asString()
: params[jss::ident].asString());
// Get info on account.
auto id = parseBase58<AccountID>(strIdent);
if (!id)

View File

@@ -26,6 +26,7 @@
#include <xrpl/json/json_value.h>
#include <xrpl/protocol/ErrorCodes.h>
#include <xrpl/protocol/Indexes.h>
#include <xrpl/protocol/RPCErr.h>
#include <xrpl/protocol/UintTypes.h>
#include <xrpl/protocol/jss.h>
#include <grpc/status.h>
@@ -53,9 +54,17 @@ doAccountInfo(RPC::JsonContext& context)
std::string strIdent;
if (params.isMember(jss::account))
{
if (!params[jss::account].isString())
return RPC::invalid_field_error(jss::account);
strIdent = params[jss::account].asString();
}
else if (params.isMember(jss::ident))
{
if (!params[jss::ident].isString())
return RPC::invalid_field_error(jss::ident);
strIdent = params[jss::ident].asString();
}
else
return RPC::missing_field_error(jss::account);

View File

@@ -80,6 +80,9 @@ doAccountLines(RPC::JsonContext& context)
if (!params.isMember(jss::account))
return RPC::missing_field_error(jss::account);
if (!params[jss::account].isString())
return RPC::invalid_field_error(jss::account);
std::shared_ptr<ReadView const> ledger;
auto result = RPC::lookupLedger(ledger, context);
if (!ledger)

View File

@@ -54,17 +54,19 @@ doAccountNFTs(RPC::JsonContext& context)
if (!params.isMember(jss::account))
return RPC::missing_field_error(jss::account);
std::shared_ptr<ReadView const> ledger;
auto result = RPC::lookupLedger(ledger, context);
if (ledger == nullptr)
return result;
if (!params[jss::account].isString())
return RPC::invalid_field_error(jss::account);
auto id = parseBase58<AccountID>(params[jss::account].asString());
if (!id)
{
RPC::inject_error(rpcACT_MALFORMED, result);
return result;
return rpcError(rpcACT_MALFORMED);
}
std::shared_ptr<ReadView const> ledger;
auto result = RPC::lookupLedger(ledger, context);
if (ledger == nullptr)
return result;
auto const accountID{id.value()};
if (!ledger->exists(keylet::account(accountID)))
@@ -75,8 +77,9 @@ doAccountNFTs(RPC::JsonContext& context)
return *err;
uint256 marker;
bool const markerSet = params.isMember(jss::marker);
if (params.isMember(jss::marker))
if (markerSet)
{
auto const& m = params[jss::marker];
if (!m.isString())
@@ -98,6 +101,7 @@ doAccountNFTs(RPC::JsonContext& context)
// Continue iteration from the current page:
bool pastMarker = marker.isZero();
bool markerFound = false;
uint256 const maskedMarker = marker & nft::pageMask;
while (cp)
{
@@ -119,12 +123,23 @@ doAccountNFTs(RPC::JsonContext& context)
uint256 const nftokenID = o[sfNFTokenID];
uint256 const maskedNftokenID = nftokenID & nft::pageMask;
if (!pastMarker && maskedNftokenID < maskedMarker)
continue;
if (!pastMarker)
{
if (maskedNftokenID < maskedMarker)
continue;
if (!pastMarker && maskedNftokenID == maskedMarker &&
nftokenID <= marker)
continue;
if (maskedNftokenID == maskedMarker && nftokenID < marker)
continue;
if (nftokenID == marker)
{
markerFound = true;
continue;
}
}
if (markerSet && !markerFound)
return RPC::invalid_field_error(jss::marker);
pastMarker = true;
@@ -155,6 +170,9 @@ doAccountNFTs(RPC::JsonContext& context)
cp = nullptr;
}
if (markerSet && !markerFound)
return RPC::invalid_field_error(jss::marker);
result[jss::account] = toBase58(accountID);
context.loadType = Resource::feeMediumBurdenRPC;
return result;
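The tightened marker handling above (skip entries until the marker is seen, resume strictly after it, and reject the request afterwards if the marker never appeared) can be sketched with plain integers standing in for 256-bit token IDs; this is an illustration, not rippled's code:

```cpp
#include <cstdint>
#include <vector>

struct ScanResult
{
    std::vector<std::uint64_t> returned;
    bool markerFound = false;
};

// Resume a paged scan at `marker` (0 means "no marker"). The caller
// rejects the request if a marker was supplied but never found.
ScanResult
scanFrom(std::vector<std::uint64_t> const& ids, std::uint64_t marker)
{
    ScanResult r;
    bool pastMarker = (marker == 0);
    for (auto id : ids)
    {
        if (!pastMarker)
        {
            if (id < marker)
                continue;
            if (id == marker)
            {
                r.markerFound = true;
                continue;  // resume after the marker itself
            }
        }
        pastMarker = true;
        r.returned.push_back(id);
    }
    return r;
}
```

With a bogus marker that falls between real IDs, `markerFound` stays false, which is the condition the new `invalid_field_error(jss::marker)` checks catch.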
@@ -167,6 +185,9 @@ doAccountObjects(RPC::JsonContext& context)
if (!params.isMember(jss::account))
return RPC::missing_field_error(jss::account);
if (!params[jss::account].isString())
return RPC::invalid_field_error(jss::account);
std::shared_ptr<ReadView const> ledger;
auto result = RPC::lookupLedger(ledger, context);
if (ledger == nullptr)
@@ -220,6 +241,9 @@ doAccountObjects(RPC::JsonContext& context)
{
auto [rpcStatus, type] = RPC::chooseLedgerEntryType(params);
if (!RPC::isAccountObjectsValidType(type))
return RPC::invalid_field_error(jss::type);
if (rpcStatus)
{
result.clear();

View File

@@ -60,6 +60,9 @@ doAccountOffers(RPC::JsonContext& context)
if (!params.isMember(jss::account))
return RPC::missing_field_error(jss::account);
if (!params[jss::account].isString())
return RPC::invalid_field_error(jss::account);
std::shared_ptr<ReadView const> ledger;
auto result = RPC::lookupLedger(ledger, context);
if (!ledger)
@@ -84,7 +87,7 @@ doAccountOffers(RPC::JsonContext& context)
return *err;
if (limit == 0)
return rpcError(rpcINVALID_PARAMS);
return RPC::invalid_field_error(jss::limit);
Json::Value& jsonOffers(result[jss::offers] = Json::arrayValue);
std::vector<std::shared_ptr<SLE const>> offers;
@@ -101,13 +104,13 @@ doAccountOffers(RPC::JsonContext& context)
std::stringstream marker(params[jss::marker].asString());
std::string value;
if (!std::getline(marker, value, ','))
return rpcError(rpcINVALID_PARAMS);
return RPC::invalid_field_error(jss::marker);
if (!startAfter.parseHex(value))
return rpcError(rpcINVALID_PARAMS);
return RPC::invalid_field_error(jss::marker);
if (!std::getline(marker, value, ','))
return rpcError(rpcINVALID_PARAMS);
return RPC::invalid_field_error(jss::marker);
try
{
@@ -115,7 +118,7 @@ doAccountOffers(RPC::JsonContext& context)
}
catch (boost::bad_lexical_cast&)
{
return rpcError(rpcINVALID_PARAMS);
return RPC::invalid_field_error(jss::marker);
}
// We then must check if the object pointed to by the marker is actually

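The marker that `account_offers` accepts is a comma-separated `"<hex-key>,<seq>"` pair, split with `std::getline` as above. A hedged standalone sketch of that parse (names are illustrative, `std::stoul` stands in for `boost::lexical_cast`, and the hex-key validation is omitted):

```cpp
#include <optional>
#include <sstream>
#include <string>
#include <utility>

// Split "key,seq" on the comma; any failure maps to the same
// field-specific marker error in the handler above.
std::optional<std::pair<std::string, unsigned>>
parseOfferMarker(std::string const& s)
{
    std::istringstream in(s);
    std::string key, seqText;
    if (!std::getline(in, key, ',') || !std::getline(in, seqText, ','))
        return std::nullopt;
    try
    {
        return std::make_pair(
            key, static_cast<unsigned>(std::stoul(seqText)));
    }
    catch (...)
    {
        return std::nullopt;  // non-numeric sequence component
    }
}
```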
View File

@@ -426,12 +426,12 @@ doAccountTxJson(RPC::JsonContext& context)
if (context.apiVersion > 1u && params.isMember(jss::binary) &&
!params[jss::binary].isBool())
{
return rpcError(rpcINVALID_PARAMS);
return RPC::invalid_field_error(jss::binary);
}
if (context.apiVersion > 1u && params.isMember(jss::forward) &&
!params[jss::forward].isBool())
{
return rpcError(rpcINVALID_PARAMS);
return RPC::invalid_field_error(jss::forward);
}
args.limit = params.isMember(jss::limit) ? params[jss::limit].asUInt() : 0;
@@ -440,7 +440,10 @@ doAccountTxJson(RPC::JsonContext& context)
params.isMember(jss::forward) && params[jss::forward].asBool();
if (!params.isMember(jss::account))
return rpcError(rpcINVALID_PARAMS);
return RPC::missing_field_error(jss::account);
if (!params[jss::account].isString())
return RPC::invalid_field_error(jss::account);
auto const account =
parseBase58<AccountID>(params[jss::account].asString());

View File

@@ -38,6 +38,15 @@ doFeature(RPC::JsonContext& context)
if (context.app.config().reporting())
return rpcError(rpcREPORTING_UNSUPPORTED);
if (context.params.isMember(jss::feature))
{
// ensure that the `feature` param is a string
if (!context.params[jss::feature].isString())
{
return rpcError(rpcINVALID_PARAMS);
}
}
bool const isAdmin = context.role == Role::ADMIN;
// Get majority amendment status
majorityAmendments_t majorities;

View File

@@ -66,6 +66,10 @@ doNoRippleCheck(RPC::JsonContext& context)
if (!params.isMember("role"))
return RPC::missing_field_error("role");
if (!params[jss::account].isString())
return RPC::invalid_field_error(jss::account);
bool roleGateway = false;
{
std::string const role = params["role"].asString();
@@ -90,7 +94,7 @@ doNoRippleCheck(RPC::JsonContext& context)
if (context.apiVersion > 1u && params.isMember(jss::transactions) &&
!params[jss::transactions].isBool())
{
return rpcError(rpcINVALID_PARAMS);
return RPC::invalid_field_error(jss::transactions);
}
std::shared_ptr<ReadView const> ledger;

View File

@@ -37,7 +37,7 @@ namespace ripple {
low bits. When dereferencing the pointer, these low "tag" bits are set to
zero. When accessing the tag bits, the high "pointer" bits are set to zero.
The "pointer" part points to to the equivalent to an array of
The "pointer" part points to the equivalent to an array of
`SHAMapHash` followed immediately by an array of
`shared_ptr<SHAMapTreeNode>`. The sizes of these arrays are
determined by the tag. The tag is an index into an array (`boundaries`,

View File

@@ -258,7 +258,7 @@ TaggedPointer::getChildIndex(std::uint16_t isBranch, int i) const
// of a child in the array is the number of non-empty children
// before it. Since `isBranch_` is a bitset of the stored
// children, we simply need to mask out (and set to zero) all
// the bits in `isBranch_` equal to to higher than `i` and count
// the bits in `isBranch_` equal to higher than `i` and count
// the bits.
// mask sets all the bits >=i to zero and all the bits <i to