# Basics

Utility functions and classes.

`ripple/basics` should contain no dependencies on other modules.

## Choosing a rippled container
* `std::vector`
  - For ordered containers with most insertions or erases at the end.
* `std::deque`
  - For ordered containers with most insertions or erases at the start or end.
* `std::list`
  - For ordered containers with inserts and erases to the middle.
  - For containers with iterators stable over insert and erase.
  - Generally slower and bigger than `std::vector` or `std::deque` except for those cases.
* `std::set`
  - For sorted containers.
* `ripple::hash_set`
  - Where inserts and contains need to be O(1).
  - For "small" sets, `std::set` might be faster and smaller.
* `ripple::hardened_hash_set`
  - For data sets where the key could be manipulated by an attacker in an attempt to mount an algorithmic complexity attack: see http://en.wikipedia.org/wiki/Algorithmic_complexity_attack
The following container is deprecated:

* `std::unordered_set`
  - Use `ripple::hash_set` instead, which uses a better hashing algorithm.
  - Or use `ripple::hardened_hash_set` to prevent algorithmic complexity attacks.