Compare commits


18 Commits

Author SHA1 Message Date
Denis Angell
1d101b7130 Use the Conan package manager
Use the Conan package manager (#4367)

Introduces a conanfile.py (and a Conan recipe for RocksDB) to enable building the package with Conan, choosing more recent default versions of dependencies. It removes almost all of the CMake build files related to dependencies, as well as the configurations for Travis CI and GitLab CI. A new set of cross-platform build instructions is written in BUILD.md.

Includes an example GitHub Actions workflow for each of Linux, macOS, and Windows.
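At a glance, the flow those workflows share looks like this (a sketch condensed from the nix workflow in this diff; per-platform flags and generators vary):

```
# assumes Conan 1.x and a build directory named .build
conan profile new default --detect
conan profile update settings.compiler.cppstd=20 default
conan export external/rocksdb                     # the custom RocksDB recipe
mkdir -p .build && cd .build
conan install .. --build missing --settings build_type=Release
cmake -G Ninja \
  -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
  -DCMAKE_BUILD_TYPE=Release ..
cmake --build . --target rippled --parallel "$(nproc)"
```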

* Test on macos-12

We use the <concepts> header, which was not added to Apple Clang until
version 13.1.6. The default Clang on macos-11 (the sometimes-current
version of macos-latest) is 13.0.0, and the default Clang on macos-12 is
14.0.0.

Closes #4223.
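(As a hypothetical spot check, a toolchain's support for the header can be probed with a one-liner:)

```
# fails to compile where <concepts> is missing, e.g. Apple Clang < 13.1.6
echo '#include <concepts>' | clang++ -std=c++20 -x c++ -fsyntax-only - && echo ok
```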

dump

remove to string

fix assert

[do not merge] ignore build error

[do not merge] ignore build error

Revert "[do not merge] ignore build error"

This reverts commit 576e05ac1628e0e934920350f471963aa2ce49c3.

Revert "[do not merge] ignore build error"

This reverts commit 29975ec19a045488e9a1903e0ed77639563118d5.
2024-07-08 12:28:30 +02:00
Denis Angell
84eee588d3 Merge branch 'dev' into strict-builds 2024-06-05 12:17:36 +02:00
Denis Angell
03569dbb11 add correct imports 2024-05-14 12:03:07 +02:00
Denis Angell
cb77121e20 Merge branch 'dev' into strict-builds 2024-05-06 12:14:05 +02:00
Denis Angell
c55a97c51a Update STValidation.h 2024-03-25 09:27:17 +01:00
Denis Angell
fc0be9c416 Merge branch 'dev' into strict-builds 2024-03-25 09:13:55 +01:00
Denis Angell
aaccf9b5b2 revert remove json log 2024-03-25 09:12:55 +01:00
Denis Angell
17af075665 revert remove json log 2024-03-25 09:12:33 +01:00
Denis Angell
e6b362c832 Merge branch 'dev' into strict-builds 2024-03-22 13:38:51 +01:00
Denis Angell
5b2b915955 clang-format 2024-03-18 14:57:35 +01:00
Denis Angell
e801ead39d fix more headers 2024-03-18 14:52:00 +01:00
Denis Angell
78a96dd633 add missing headers 2024-03-18 13:17:10 +01:00
Denis Angell
b3984c166d misc 2024-03-18 13:13:09 +01:00
Denis Angell
945f737706 Update ServerInfo.cpp 2024-03-18 13:09:01 +01:00
Denis Angell
8360ff8bc2 more linting 2024-03-18 13:03:05 +01:00
Denis Angell
c1274d2a12 Update ServerDefinitions_test.cpp 2024-03-18 13:02:03 +01:00
Denis Angell
4aa79b6100 Update ClaimReward_test.cpp 2024-03-18 12:59:05 +01:00
Denis Angell
6a4c563ced fix
invalid operands to binary expression ('basic_ostream<char, std::char_traits<char>>' and 'Json::Value')
2024-03-18 12:38:25 +01:00
352 changed files with 10829 additions and 30780 deletions


@@ -1,885 +0,0 @@
#!/usr/bin/env python3
"""
Persistent Gitea for Conan on Self-Hosted GA Runner
- Localhost only (127.0.0.1) for security
- Persistent volumes survive between workflows
- Idempotent - safe to run multiple times
- Reuses existing container if already running
- Uses pre-baked app.ini to bypass web setup wizard
What This Script Uses Conan For
--------------------------------
This script uses Conan only for testing and verification:
- Optionally configures host's conan client (if available)
- Runs container-based tests to verify the repository works
- Tests upload/download of a sample package (zlib) in a container
- Verifies authentication and package management work correctly
- Does NOT build or manage your actual project dependencies
The test command runs in a Docker container on the same network as Gitea,
exactly mimicking how your GitHub Actions workflows will use it.
Docker Networking
-----------------
Gitea is configured with ROOT_URL using the container name for consistency.
A Docker network (default: conan-net) is used for container-to-container communication.
Access methods:
1. From the host machine:
- The host uses http://localhost:3000 (port mapping)
- Host's Conan configuration uses localhost
2. From Docker containers (tests and CI/CD):
- Containers use http://gitea-conan-persistent:3000
- Containers must be on the same network (default: conan-net)
- The test command automatically handles network setup
The script automatically:
- Creates the Docker network if needed
- Connects Gitea to the network
- Runs tests in containers on the same network
Example in GitHub Actions workflow:
docker network create conan-net
docker network connect conan-net gitea-conan-persistent
docker run --network conan-net <your-build-container> bash -c "
conan remote add gitea-local http://gitea-conan-persistent:3000/api/packages/conan/conan
conan user -p conan-pass-2024 -r gitea-local conan
conan config set general.revisions_enabled=1 # Required for Conan v1
"
"""
import argparse
import logging
import os
import queue
import shutil
import socket
import subprocess
import sys
import threading
import time
from typing import Optional
class DockerLogStreamer(threading.Thread):
"""Background thread to stream docker logs -f and pass lines into a queue"""
def __init__(self, container_name: str, log_queue: queue.Queue):
super().__init__(name=f"DockerLogStreamer-{container_name}")
self.container = container_name
self.log_queue = log_queue
self._stop_event = threading.Event()
self.proc: Optional[subprocess.Popen] = None
self.daemon = True # so it won't block interpreter exit if something goes wrong
def run(self):
try:
# Follow logs, capture both stdout and stderr
self.proc = subprocess.Popen(
["docker", "logs", "-f", self.container],
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
text=True,
bufsize=1,
universal_newlines=True,
)
if not self.proc.stdout:
return
for line in self.proc.stdout:
if line is None:
break
# Ensure exact line fidelity
self.log_queue.put(line.rstrip("\n"))
if self._stop_event.is_set():
break
except Exception as e:
# Put an error marker so consumer can see
self.log_queue.put(f"[STREAMER_ERROR] {e}")
finally:
try:
if self.proc and self.proc.poll() is None:
# Do not kill abruptly unless asked to stop
pass
except Exception:
pass
def stop(self, timeout: float = 5.0):
self._stop_event.set()
try:
if self.proc and self.proc.poll() is None:
# Politely terminate docker logs
self.proc.terminate()
try:
self.proc.wait(timeout=timeout)
except Exception:
self.proc.kill()
except Exception:
pass
class PersistentGiteaConan:
def __init__(self, debug: bool = False, verbose: bool = False):
# Configurable via environment variables for CI flexibility
self.container = os.getenv("GITEA_CONTAINER_NAME", "gitea-conan-persistent")
self.port = int(os.getenv("GITEA_PORT", "3000"))
self.user = os.getenv("GITEA_USER", "conan")
self.passwd = os.getenv("GITEA_PASSWORD", "conan-pass-2024") # do not print this in logs
self.email = os.getenv("GITEA_EMAIL", "conan@localhost")
# Persistent data location on the runner
self.data_dir = os.getenv("GITEA_DATA_DIR", "/opt/gitea")
# Docker network for container communication
self.network = os.getenv("GITEA_NETWORK", "conan-net")
# Behavior flags
self.print_credentials = os.getenv("GITEA_PRINT_CREDENTIALS", "0") == "1"
self.startup_timeout = int(os.getenv("GITEA_STARTUP_TIMEOUT", "120"))
# Logging and docker log streaming infrastructure
self._setup_logging(debug=debug, verbose=verbose)
self.log_queue: queue.Queue[str] = queue.Queue()
self.log_streamer: Optional[DockerLogStreamer] = None
# Conan execution context cache
self._conan_prefix: Optional[str] = None # '' for direct, full sudo+shell for delegated; None if unavailable
# Track sensitive values that should be masked in logs
self._sensitive_values: set = {self.passwd} # Start with password
def _setup_logging(self, debug: bool, verbose: bool):
# Determine level: debug > verbose > default WARNING
if debug:
level = logging.DEBUG
elif verbose:
level = logging.INFO
else:
level = logging.WARNING
logging.basicConfig(level=level, format='%(asctime)s - %(levelname)s - %(filename)s:%(lineno)d - %(message)s')
self.logger = logging.getLogger(__name__)
# Be slightly quieter for noisy libs
logging.getLogger('urllib3').setLevel(logging.WARNING)
def _mask_sensitive(self, text: str) -> str:
"""Mask any sensitive values in text for safe logging"""
if not text:
return text
masked = text
for sensitive in self._sensitive_values:
if sensitive and sensitive in masked:
masked = masked.replace(sensitive, "***REDACTED***")
return masked
def run(self, cmd, check=True, env=None, sensitive=False):
"""Run command with minimal output
Args:
cmd: Command to run
check: Raise exception on non-zero exit
env: Environment variables
sensitive: If True, command and output are completely hidden
"""
run_env = os.environ.copy()
if env:
run_env.update(env)
# Log command (masked or hidden based on sensitivity)
if sensitive:
self.logger.debug("EXEC: [sensitive command hidden]")
else:
self.logger.debug(f"EXEC: {self._mask_sensitive(cmd)}")
result = subprocess.run(cmd, shell=True, capture_output=True, text=True, env=run_env)
# Log output (masked or hidden based on sensitivity)
if not sensitive:
if result.stdout:
self.logger.debug(f"STDOUT: {self._mask_sensitive(result.stdout.strip())}"[:1000])
if result.stderr:
self.logger.debug(f"STDERR: {self._mask_sensitive(result.stderr.strip())}"[:1000])
if result.returncode != 0 and check:
if sensitive:
self.logger.error(f"Command failed ({result.returncode})")
raise RuntimeError("Command failed (details hidden for security)")
else:
self.logger.error(f"Command failed ({result.returncode}) for: {self._mask_sensitive(cmd)}")
raise RuntimeError(f"Command failed: {self._mask_sensitive(result.stderr)}")
return result
def is_running(self):
"""Check if container is already running"""
result = self.run(f"docker ps -q -f name={self.container}", check=False)
return bool(result.stdout.strip())
def container_exists(self):
"""Check if container exists (running or stopped)"""
result = self.run(f"docker ps -aq -f name={self.container}", check=False)
return bool(result.stdout.strip())
# ---------- Helpers & Preflight Checks ----------
def _check_docker(self):
if not shutil.which("docker"):
raise RuntimeError(
"Docker is not installed or not in PATH. Please install Docker and ensure the daemon is running.")
# Check daemon access
info = subprocess.run("docker info", shell=True, capture_output=True, text=True)
if info.returncode != 0:
raise RuntimeError(
"Docker daemon not accessible. Ensure the Docker service is running and the current user has permission to use Docker.")
def _is_port_in_use(self, host, port):
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
s.settimeout(0.5)
return s.connect_ex((host, port)) == 0
def _setup_directories(self):
"""Create directory structure with proper ownership"""
gitea_data = os.path.join(self.data_dir, "gitea")
gitea_conf = os.path.join(gitea_data, "gitea", "conf")
# Create all directories
os.makedirs(gitea_conf, exist_ok=True)
self.logger.info(f"📁 Created directory structure: {self.data_dir}")
# Set ownership recursively to git user (UID 1000)
for root, dirs, files in os.walk(self.data_dir):
os.chown(root, 1000, 1000)
for d in dirs:
os.chown(os.path.join(root, d), 1000, 1000)
for f in files:
os.chown(os.path.join(root, f), 1000, 1000)
def _preflight(self):
self.logger.info("🔍 Running preflight checks...")
self._check_docker()
# Port check only if our container is not already running
if not self.is_running():
if self._is_port_in_use("127.0.0.1", self.port):
raise RuntimeError(f"Port {self.port} on 127.0.0.1 is already in use. Cannot bind Gitea.")
self.logger.info("✓ Preflight checks passed")
def _generate_secret(self, secret_type):
"""Generate a secret using Gitea's built-in generator"""
cmd = f"docker run --rm gitea/gitea:latest gitea generate secret {secret_type}"
result = self.run(cmd, sensitive=True) # Don't log the output
secret = result.stdout.strip()
if secret:
self._sensitive_values.add(secret) # Track this secret for masking
self.logger.debug(f"Generated {secret_type} successfully")
return secret
def _create_app_ini(self, gitea_conf_dir):
"""Create a pre-configured app.ini file"""
app_ini_path = os.path.join(gitea_conf_dir, "app.ini")
# Check if app.ini already exists (from previous run)
if os.path.exists(app_ini_path):
self.logger.info("✓ Using existing app.ini configuration")
# Minimal migration: ensure HTTP_ADDR allows inbound connections from host via Docker mapping
try:
with open(app_ini_path, 'r+', encoding='utf-8') as f:
content = f.read()
updated = False
if "HTTP_ADDR = 127.0.0.1" in content:
content = content.replace("HTTP_ADDR = 127.0.0.1", "HTTP_ADDR = 0.0.0.0")
updated = True
if updated:
f.seek(0)
f.write(content)
f.truncate()
self.logger.info("🔁 Updated existing app.ini to bind on 0.0.0.0 for container reachability")
except Exception as e:
self.logger.warning(f"⚠️ Could not update existing app.ini automatically: {e}")
return
self.logger.info("🔑 Generating security secrets...")
secret_key = self._generate_secret("SECRET_KEY")
internal_token = self._generate_secret("INTERNAL_TOKEN")
self.logger.info("📝 Creating app.ini configuration...")
app_ini_content = f"""APP_NAME = Conan Package Registry
RUN_USER = git
RUN_MODE = prod
[server]
ROOT_URL = http://gitea-conan-persistent:{self.port}/
HTTP_ADDR = 0.0.0.0
HTTP_PORT = 3000
DISABLE_SSH = true
START_SSH_SERVER = false
OFFLINE_MODE = true
DOMAIN = gitea-conan-persistent
LFS_START_SERVER = false
[database]
DB_TYPE = sqlite3
PATH = /data/gitea.db
LOG_SQL = false
[repository]
ROOT = /data/gitea-repositories
DISABLED_REPO_UNITS = repo.issues, repo.pulls, repo.wiki, repo.projects, repo.actions
[security]
INSTALL_LOCK = true
SECRET_KEY = {secret_key}
INTERNAL_TOKEN = {internal_token}
PASSWORD_HASH_ALGO = pbkdf2
MIN_PASSWORD_LENGTH = 8
[service]
DISABLE_REGISTRATION = true
ENABLE_NOTIFY_MAIL = false
REGISTER_EMAIL_CONFIRM = false
ENABLE_CAPTCHA = false
REQUIRE_SIGNIN_VIEW = false
DEFAULT_KEEP_EMAIL_PRIVATE = true
DEFAULT_ALLOW_CREATE_ORGANIZATION = false
DEFAULT_ENABLE_TIMETRACKING = false
[mailer]
ENABLED = false
[session]
PROVIDER = file
[log]
MODE = console
LEVEL = Info
[api]
ENABLE_SWAGGER = false
[packages]
ENABLED = true
[other]
SHOW_FOOTER_VERSION = false
"""
# Write app.ini with restrictive permissions
with open(app_ini_path, 'w') as f:
f.write(app_ini_content)
# Set ownership to UID 1000:1000 (git user in container)
os.chown(app_ini_path, 1000, 1000)
os.chmod(app_ini_path, 0o640)
self.logger.info("✓ Created app.ini with pre-generated secrets")
def setup(self):
"""Setup or verify Gitea is running"""
self.logger.info("🔧 Setting up persistent Gitea for Conan...")
# Preflight
self._preflight()
# Create persistent data directory structure with proper ownership
self._setup_directories()
gitea_data = os.path.join(self.data_dir, "gitea")
gitea_conf = os.path.join(gitea_data, "gitea", "conf")
# Create app.ini BEFORE starting container (for headless setup)
self._create_app_ini(gitea_conf)
# Check if already running
if self.is_running():
self.logger.info("✅ Gitea container already running")
self._verify_health()
self._configure_conan()
return
# Check if container exists but stopped
if self.container_exists():
self.logger.info("🔄 Starting existing container...")
self.run(f"docker start {self.container}")
# Start log streaming for visibility
self._start_log_streaming()
try:
time.sleep(2)
self._verify_health()
self._configure_conan()
finally:
self._stop_log_streaming()
return
# Create new container (first time setup)
self.logger.info("🚀 Creating new Gitea container...")
gitea_data = os.path.join(self.data_dir, "gitea")
# IMPORTANT: Bind to 127.0.0.1 only for security
# With pre-configured app.ini, Gitea starts directly without wizard
docker_cmd = f"""docker run -d \
--name {self.container} \
-p 127.0.0.1:{self.port}:3000 \
-v {gitea_data}:/data \
-v /etc/timezone:/etc/timezone:ro \
-v /etc/localtime:/etc/localtime:ro \
-e USER_UID=1000 \
-e USER_GID=1000 \
--restart unless-stopped \
gitea/gitea:latest"""
self.run(docker_cmd)
# Debug: Check actual port mapping
port_check = self.run(f"docker port {self.container}", check=False)
self.logger.info(f"🔍 Container port mapping: {port_check.stdout.strip()}")
# Start log streaming and wait for Gitea to be ready
self._start_log_streaming()
try:
self._wait_for_startup(self.startup_timeout)
# Create user (idempotent)
self._create_user()
# Configure Conan
self._configure_conan()
finally:
self._stop_log_streaming()
self.logger.info("✅ Persistent Gitea ready for Conan packages!")
self.logger.info(f" URL: http://localhost:{self.port}")
if self.print_credentials:
self.logger.info(f" User: {self.user} / {self.passwd}")
else:
self.logger.info(" Credentials: hidden (set GITEA_PRINT_CREDENTIALS=1 to display)")
self.logger.info(f" Data persisted in: {self.data_dir}")
def _start_log_streaming(self):
# Start background docker log streamer if not running
if self.log_streamer is not None:
self._stop_log_streaming()
self.logger.debug("Starting Docker log streamer...")
self.log_streamer = DockerLogStreamer(self.container, self.log_queue)
self.log_streamer.start()
def _stop_log_streaming(self):
if self.log_streamer is not None:
self.logger.debug("Stopping Docker log streamer...")
try:
self.log_streamer.stop()
self.log_streamer.join(timeout=5)
except Exception:
pass
finally:
self.log_streamer = None
def _wait_for_startup(self, timeout=60):
"""Wait for container to become healthy by consuming the docker log stream"""
self.logger.info(f"⏳ Waiting for Gitea to start (timeout: {timeout}s)...")
start_time = time.time()
server_detected = False
while time.time() - start_time < timeout:
# Drain all available log lines without blocking
drained_any = False
while True:
try:
line = self.log_queue.get_nowait()
drained_any = True
except queue.Empty:
break
if not line.strip():
continue
# Always log raw docker lines at DEBUG level
self.logger.debug(f"DOCKER: {line}")
# Promote important events
l = line
if ("[E]" in l) or ("ERROR" in l) or ("FATAL" in l) or ("panic" in l):
self.logger.error(l)
elif ("WARN" in l) or ("[W]" in l):
self.logger.warning(l)
# Detect startup listening lines
if ("Web server is now listening" in l) or ("Listen:" in l) or ("Starting new Web server" in l):
if not server_detected:
server_detected = True
self.logger.info("✓ Detected web server startup!")
self.logger.info("⏳ Waiting for Gitea to fully initialize...")
# Quick readiness loop
for i in range(10):
time.sleep(1)
if self._is_healthy():
self.logger.info(f"✓ Gitea is ready and responding! (after {i + 1} seconds)")
return
self.logger.warning(
"Server started but health check failed after 10 attempts, continuing to wait...")
# Check if container is still running periodically
container_status = self.run(
f"docker inspect {self.container} --format='{{{{.State.Status}}}}'",
check=False
)
status = (container_status.stdout or "").strip()
if status and status != "running":
# Container stopped or in error state
error_logs = self.run(
f"docker logs --tail 30 {self.container} 2>&1",
check=False
)
self.logger.error(f"Container is in '{status}' state. Last logs:")
for l in (error_logs.stdout or "").split('\n')[-10:]:
if l.strip():
self.logger.error(l)
raise RuntimeError(f"Container failed to start (status: {status})")
# Re-check health periodically; the first readiness window above may be missed
if server_detected and self._is_healthy():
self.logger.info("✓ Gitea is ready and responding!")
return
# If nothing drained, brief sleep to avoid busy loop
if not drained_any:
time.sleep(0.5)
raise TimeoutError(f"Gitea failed to become ready within {timeout} seconds")
def _is_healthy(self):
"""Check if Gitea is responding"""
# Try a simple HTTP GET first (less verbose)
result = self.run(
f"curl -s -o /dev/null -w '%{{http_code}}' http://localhost:{self.port}/",
check=False
)
code = result.stdout.strip()
# Treat any 2xx/3xx as healthy (e.g., 200 OK, 302/303 redirects)
if code and code[0] in ("2", "3"):
return True
# If it failed, show debug info
if code == "000":
# Only show debug on first failure
if not hasattr(self, '_health_check_debug_shown'):
self._health_check_debug_shown = True
self.logger.info("🔍 Connection issue detected, showing diagnostics:")
# Check what's actually listening
netstat_result = self.run(f"netstat -tln | grep {self.port}", check=False)
self.logger.info(f" Port {self.port} listeners: {netstat_result.stdout.strip() or 'none found'}")
# Check docker port mapping
port_result = self.run(f"docker port {self.container} 3000", check=False)
self.logger.info(f" Docker mapping: {port_result.stdout.strip() or 'not mapped'}")
return False
def _verify_health(self):
"""Verify Gitea is healthy"""
if not self._is_healthy():
raise RuntimeError("Gitea is not responding properly")
self.logger.info("✅ Gitea is healthy")
def _ensure_network(self):
"""Ensure Docker network exists and Gitea is connected to it"""
# Create network if it doesn't exist (idempotent)
self.run(f"docker network create {self.network} 2>/dev/null || true", check=False)
# Connect Gitea to the network if not already connected (idempotent)
self.run(f"docker network connect {self.network} {self.container} 2>/dev/null || true", check=False)
self.logger.debug(f"Ensured {self.container} is connected to {self.network} network")
# ---------- Conan helpers ----------
def _resolve_conan_prefix(self) -> Optional[str]:
"""Determine how to run the 'conan' CLI and cache the decision.
Returns:
'' for direct invocation (conan in PATH),
full sudo+login-shell prefix string for delegated execution, or
None if Conan is not available.
"""
if self._conan_prefix is not None:
return self._conan_prefix
# If running with sudo, try actual user's login shell
if os.geteuid() == 0 and 'SUDO_USER' in os.environ:
actual_user = os.environ['SUDO_USER']
# Discover the user's shell
shell_result = self.run(f"getent passwd {actual_user} | cut -d: -f7", check=False)
user_shell = shell_result.stdout.strip() if shell_result.returncode == 0 and shell_result.stdout.strip() else "/bin/bash"
self.logger.info(f"→ Using {actual_user}'s shell for Conan: {user_shell}")
which_result = self.run(f"sudo -u {actual_user} {user_shell} -l -c 'which conan'", check=False)
if which_result.returncode == 0 and which_result.stdout.strip():
self._conan_prefix = f"sudo -u {actual_user} {user_shell} -l -c"
self.logger.info(f"✓ Found Conan at: {which_result.stdout.strip()}")
return self._conan_prefix
else:
self.logger.warning(f"⚠️ Conan not found in {actual_user}'s PATH.")
self._conan_prefix = None
return self._conan_prefix
else:
# Non-sudo case; check PATH directly
if shutil.which("conan"):
self._conan_prefix = ''
return self._conan_prefix
else:
self.logger.warning("⚠️ Conan CLI not found in PATH.")
self._conan_prefix = None
return self._conan_prefix
def _build_conan_cmd(self, inner_args: str) -> Optional[str]:
"""Build a shell command to run Conan with given inner arguments.
Example: inner_args='remote list' => 'conan remote list' or "sudo -u user shell -l -c 'conan remote list'".
Returns None if Conan is unavailable.
"""
prefix = self._resolve_conan_prefix()
if prefix is None:
return None
if prefix == '':
return f"conan {inner_args}"
# Delegate via sudo+login shell; quote the inner command
return f"{prefix} 'conan {inner_args}'"
def _run_conan(self, inner_args: str, check: bool = False):
"""Run a Conan subcommand using the resolved execution context.
Returns the subprocess.CompletedProcess-like result, or a dummy object with returncode=127 if unavailable.
"""
full_cmd = self._build_conan_cmd(inner_args)
if full_cmd is None:
# Construct a minimal dummy result
class Dummy:
returncode = 127
stdout = ''
stderr = 'conan: not found'
self.logger.error("❌ Conan CLI is not available. Skipping command: conan " + inner_args)
return Dummy()
return self.run(full_cmd, check=check)
def _run_conan_sensitive(self, inner_args: str, check: bool = False):
"""Run a sensitive Conan subcommand (e.g., with passwords) using the resolved execution context."""
full_cmd = self._build_conan_cmd(inner_args)
if full_cmd is None:
class Dummy:
returncode = 127
stdout = ''
stderr = 'conan: not found'
self.logger.error("❌ Conan CLI is not available. Skipping sensitive command")
return Dummy()
return self.run(full_cmd, check=check, sensitive=True)
def _create_user(self):
"""Create Conan user (idempotent)"""
self.logger.info("👤 Setting up admin user...")
# Retry a few times in case DB initialization lags behind
attempts = 5
for i in range(1, attempts + 1):
# First check if user exists
check_cmd = f"docker exec -u 1000:1000 {self.container} gitea admin user list"
result = self.run(check_cmd, check=False)
if result.returncode == 0 and self.user in result.stdout:
self.logger.info(f"✅ User already exists: {self.user}")
return
# Try to create admin user with --admin flag
create_cmd = f"""docker exec -u 1000:1000 {self.container} \
gitea admin user create \
--username {self.user} \
--password {self.passwd} \
--email {self.email} \
--admin \
--must-change-password=false"""
create_res = self.run(create_cmd, check=False, sensitive=True)
if create_res.returncode == 0:
self.logger.info(f"✅ Created admin user: {self.user}")
return
if "already exists" in (create_res.stderr or "").lower() or "already exists" in (
create_res.stdout or "").lower():
self.logger.info(f"✅ User already exists: {self.user}")
return
if i < attempts:
delay = min(2 ** i, 10)
time.sleep(delay)
self.logger.warning(f"⚠️ Could not create user after {attempts} attempts. You may need to create it manually.")
def _configure_conan(self):
"""Configure Conan client (idempotent)"""
self.logger.info("🔧 Configuring Conan client on host...")
# Ensure Conan is available and determine execution context
if self._resolve_conan_prefix() is None:
self.logger.warning("⚠️ Conan CLI not available on host. Skipping client configuration.")
self.logger.info(" Note: Tests will still work using container-based Conan.")
return
# For host-based Conan, we still use localhost since the host can't resolve container names
# Container-based tests will use gitea-conan-persistent directly
conan_url = f"http://localhost:{self.port}/api/packages/{self.user}/conan"
# Remove old remote if exists (ignore errors)
self._run_conan("remote remove gitea-local 2>/dev/null", check=False)
# Add Gitea as remote
self._run_conan(f"remote add gitea-local {conan_url}")
# Authenticate (mark as sensitive even though Conan masks password in process list)
self._run_conan_sensitive(f"user -p {self.passwd} -r gitea-local {self.user}")
# Enable revisions if not already
self._run_conan("config set general.revisions_enabled=1", check=False)
self.logger.info(f"✅ Host Conan configured with remote: gitea-local (via localhost)")
self.logger.info(f" Container tests will use: http://gitea-conan-persistent:{self.port}")
def verify(self):
"""Verify everything is working"""
self.logger.info("🔍 Verifying setup...")
# Check container
if not self.is_running():
self.logger.error("❌ Container not running")
return False
# Check Gitea health
if not self._is_healthy():
self.logger.error("❌ Gitea not responding")
return False
# Check Conan remote
result = self._run_conan("remote list", check=False)
if getattr(result, 'returncode', 1) != 0 or "gitea-local" not in (getattr(result, 'stdout', '') or ''):
self.logger.error("❌ Conan remote not configured")
return False
self.logger.info("✅ All systems operational")
return True
def info(self):
"""Print current status"""
self.logger.info("📊 Gitea Status:")
self.logger.info(f" Container: {self.container}")
self.logger.info(f" Running: {self.is_running()}")
self.logger.info(f" Data dir: {self.data_dir}")
self.logger.info(f" URL: http://localhost:{self.port}")
self.logger.info(f" Conan URL: http://localhost:{self.port}/api/packages/{self.user}/conan")
# Show disk usage
if os.path.exists(self.data_dir):
result = self.run(f"du -sh {self.data_dir}", check=False)
if result.returncode == 0:
size = result.stdout.strip().split('\t')[0]
self.logger.info(f" Disk usage: {size}")
def test(self):
"""Test Conan package upload/download in a container"""
self.logger.info("🧪 Testing Conan with Gitea (container-based test)...")
# Ensure everything is set up
if not self.is_running():
self.logger.error("❌ Gitea not running. Run 'setup' first.")
return False
# Ensure network exists and Gitea is connected
self._ensure_network()
# Test package name
test_package = "zlib/1.3.1"
package_name = test_package.split('/')[0] # Extract just the package name
self.logger.info(f" → Testing with package: {test_package}")
self.logger.info(f" → Running test in container on {self.network} network")
# Run test in a container (same environment as production)
test_cmd = f"""docker run --rm --network {self.network} conanio/gcc11 bash -ec "
# Configure Conan to use Gitea
conan remote add gitea-local http://gitea-conan-persistent:{self.port}/api/packages/{self.user}/conan
conan user -p {self.passwd} -r gitea-local {self.user}
conan config set general.revisions_enabled=1
# Test package upload/download
echo '→ Building {test_package} from source...'
conan install {test_package}@ --build={test_package}
echo '→ Uploading to Gitea...'
conan upload '{package_name}/*' --all -r gitea-local --confirm
echo '→ Removing local copy...'
conan remove '{package_name}/*' -f
echo '→ Downloading from Gitea...'
conan install {test_package}@ -r gitea-local
echo '✅ Container-based test successful!'
" """
result = self.run(test_cmd, check=False, sensitive=False) # Temporarily show output for debugging
if result.returncode == 0:
self.logger.info("✅ Test successful! Package uploaded and downloaded from Gitea.")
return True
else:
self.logger.error("❌ Test failed. Check the output above for details.")
return False
def teardown(self):
"""Stop and remove Gitea container and data"""
self.logger.info("🛑 Tearing down Gitea...")
# Stop and remove container
if self.container_exists():
self.logger.info(f" → Stopping container: {self.container}")
self.run(f"docker stop {self.container}", check=False)
self.logger.info(f" → Removing container: {self.container}")
self.run(f"docker rm {self.container}", check=False)
else:
self.logger.info(" → No container to remove")
# Remove data directory
if os.path.exists(self.data_dir):
self.logger.info(f" → Removing data directory: {self.data_dir}")
shutil.rmtree(self.data_dir)
self.logger.info(" ✓ Data directory removed")
else:
self.logger.info(" → No data directory to remove")
# Remove Docker network if it exists
network_check = self.run(f"docker network ls --format '{{{{.Name}}}}' | grep '^{self.network}$'", check=False)
if network_check.stdout.strip():
self.logger.info(f" → Removing Docker network: {self.network}")
self.run(f"docker network rm {self.network}", check=False)
self.logger.info(" ✓ Network removed")
else:
self.logger.info(f" → No network '{self.network}' to remove")
self.logger.info(" ✅ Teardown complete!")
# For use in GitHub Actions workflows
def main():
parser = argparse.ArgumentParser(description='Persistent Gitea for Conan packages')
parser.add_argument('command', choices=['setup', 'teardown', 'verify', 'info', 'test'], nargs='?', default='setup')
parser.add_argument('--debug', action='store_true', help='Enable debug logging')
parser.add_argument('--verbose', action='store_true', help='Enable verbose logging (info level)')
args = parser.parse_args()
# Temporary logging before instance creation (level will be reconfigured inside class)
temp_level = logging.DEBUG if args.debug else (logging.INFO if args.verbose else logging.WARNING)
logging.basicConfig(level=temp_level, format='%(asctime)s - %(levelname)s - %(filename)s:%(lineno)d - %(message)s')
logger = logging.getLogger(__name__)
# Auto-escalate to sudo for operations that need it
needs_sudo = args.command in ['setup', 'teardown']
if needs_sudo and os.geteuid() != 0:
logger.info("📋 This operation requires sudo privileges. Re-running with sudo...")
os.execvp('sudo', ['sudo'] + sys.argv)
gitea = PersistentGiteaConan(debug=args.debug, verbose=args.verbose)
try:
if args.command == "setup":
gitea.setup()
elif args.command == "verify":
sys.exit(0 if gitea.verify() else 1)
elif args.command == "info":
gitea.info()
elif args.command == "test":
sys.exit(0 if gitea.test() else 1)
elif args.command == "teardown":
gitea.teardown()
except KeyboardInterrupt:
logger.warning("Interrupted by user. Cleaning up...")
try:
gitea._stop_log_streaming()
except Exception:
pass
sys.exit(130)
if __name__ == "__main__":
main()
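The CLI surface defined in main() maps cleanly onto workflow steps. A minimal sketch, using the .ci/gitea.py path and flags that appear in the Docker workflow later in this diff (the GITEA_* overrides are optional; defaults are shown in __init__):

```
# setup and teardown re-exec themselves under sudo when needed
export GITEA_PORT=3000 GITEA_DATA_DIR=/opt/gitea   # optional overrides (defaults shown)
python3 .ci/gitea.py setup --debug      # create or reuse the persistent container
python3 .ci/gitea.py verify             # exit 0 when container, HTTP, and Conan remote are OK
python3 .ci/gitea.py test --debug       # container-based zlib upload/download round trip
python3 .ci/gitea.py teardown --debug   # remove container, data directory, and network
```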


@@ -1,12 +0,0 @@
#!/bin/bash
# Pre-commit hook that runs the suspicious patterns check on staged files
# Get the repository's root directory
repo_root=$(git rev-parse --show-toplevel)
# Run the suspicious patterns script in pre-commit mode
"$repo_root/suspicious_patterns.sh" --pre-commit
# Exit with the same code as the script
exit $?


@@ -1,4 +0,0 @@
#!/bin/bash
echo "Configuring git to use .githooks directory..."
git config core.hooksPath .githooks
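Taken together with the hook above, enabling the checks locally is a one-time step (a sketch; this diff does not show the hook's file name, so .githooks/pre-commit is an assumption implied by core.hooksPath):

```
git config core.hooksPath .githooks   # what the helper script runs
chmod +x .githooks/pre-commit         # assumed hook path; not shown in this diff
git commit -m "example"               # now runs suspicious_patterns.sh --pre-commit first
```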


@@ -1,31 +0,0 @@
name: 'Configure ccache'
description: 'Sets up ccache with consistent configuration'
inputs:
max_size:
description: 'Maximum cache size'
required: false
default: '2G'
hash_dir:
description: 'Whether to include directory paths in hash'
required: false
default: 'true'
compiler_check:
description: 'How to check compiler for changes'
required: false
default: 'content'
runs:
using: 'composite'
steps:
- name: Configure ccache
shell: bash
run: |
mkdir -p ~/.ccache
export CONF_PATH="${CCACHE_CONFIGPATH:-${CCACHE_DIR:-$HOME/.ccache}/ccache.conf}"
mkdir -p $(dirname "$CONF_PATH")
echo "max_size = ${{ inputs.max_size }}" > "$CONF_PATH"
echo "hash_dir = ${{ inputs.hash_dir }}" >> "$CONF_PATH"
echo "compiler_check = ${{ inputs.compiler_check }}" >> "$CONF_PATH"
ccache -p # Print config for verification
ccache -z # Zero statistics before the build
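For local experiments, the same configuration can be reproduced by hand; this is a sketch of what the step above effectively does with its default inputs:

```
mkdir -p ~/.ccache
cat > ~/.ccache/ccache.conf <<'EOF'
max_size = 2G
hash_dir = true
compiler_check = content
EOF
ccache -p   # print the effective configuration
ccache -z   # zero statistics before a build
```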


@@ -1,108 +0,0 @@
name: build
description: 'Builds the project with ccache integration'
inputs:
generator:
description: 'CMake generator to use'
required: true
configuration:
description: 'Build configuration (Debug, Release, etc.)'
required: true
build_dir:
description: 'Directory to build in'
required: false
default: '.build'
cc:
description: 'C compiler to use'
required: false
default: ''
cxx:
description: 'C++ compiler to use'
required: false
default: ''
compiler-id:
description: 'Unique identifier for compiler/version combination used for cache keys'
required: false
default: ''
cache_version:
description: 'Cache version for invalidation'
required: false
default: '1'
ccache_enabled:
description: 'Whether to use ccache'
required: false
default: 'true'
main_branch:
description: 'Main branch name for restore keys'
required: false
default: 'dev'
runs:
using: 'composite'
steps:
- name: Generate safe branch name
if: inputs.ccache_enabled == 'true'
id: safe-branch
shell: bash
run: |
SAFE_BRANCH=$(echo "${{ github.ref_name }}" | tr -c 'a-zA-Z0-9_.-' '-')
echo "name=${SAFE_BRANCH}" >> $GITHUB_OUTPUT
- name: Restore ccache directory
if: inputs.ccache_enabled == 'true'
id: ccache-restore
uses: actions/cache/restore@v4
with:
path: ~/.ccache
key: ${{ runner.os }}-ccache-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ inputs.configuration }}-${{ steps.safe-branch.outputs.name }}
restore-keys: |
${{ runner.os }}-ccache-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ inputs.configuration }}-${{ inputs.main_branch }}
${{ runner.os }}-ccache-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ inputs.configuration }}-
${{ runner.os }}-ccache-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-
${{ runner.os }}-ccache-v${{ inputs.cache_version }}-
- name: Configure project
shell: bash
run: |
mkdir -p ${{ inputs.build_dir }}
cd ${{ inputs.build_dir }}
# Set compiler environment variables if provided
if [ -n "${{ inputs.cc }}" ]; then
export CC="${{ inputs.cc }}"
fi
if [ -n "${{ inputs.cxx }}" ]; then
export CXX="${{ inputs.cxx }}"
fi
# Configure ccache launcher args
CCACHE_ARGS=""
if [ "${{ inputs.ccache_enabled }}" = "true" ]; then
CCACHE_ARGS="-DCMAKE_C_COMPILER_LAUNCHER=ccache -DCMAKE_CXX_COMPILER_LAUNCHER=ccache"
fi
# Run CMake configure
cmake .. \
-G "${{ inputs.generator }}" \
$CCACHE_ARGS \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${{ inputs.configuration }}
- name: Build project
shell: bash
run: |
cd ${{ inputs.build_dir }}
cmake --build . --config ${{ inputs.configuration }} --parallel $(nproc)
- name: Show ccache statistics
if: inputs.ccache_enabled == 'true'
shell: bash
run: ccache -s
- name: Save ccache directory
if: inputs.ccache_enabled == 'true'
uses: actions/cache/save@v4
with:
path: ~/.ccache
key: ${{ steps.ccache-restore.outputs.cache-primary-key }}
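Note how the branch component of these keys is produced by the sanitization step above: tr -c replaces every character outside the allowed set, including the trailing newline from echo, so the result carries a harmless trailing dash. For example:

```
SAFE_BRANCH=$(echo "feature/conan 2.x" | tr -c 'a-zA-Z0-9_.-' '-')
echo "$SAFE_BRANCH"   # prints: feature-conan-2.x-
```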


@@ -1,86 +0,0 @@
name: dependencies
description: 'Installs build dependencies with caching'
inputs:
configuration:
description: 'Build configuration (Debug, Release, etc.)'
required: true
build_dir:
description: 'Directory to build dependencies in'
required: false
default: '.build'
compiler-id:
description: 'Unique identifier for compiler/version combination used for cache keys'
required: false
default: ''
cache_version:
description: 'Cache version for invalidation'
required: false
default: '1'
cache_enabled:
description: 'Whether to use caching'
required: false
default: 'true'
main_branch:
description: 'Main branch name for restore keys'
required: false
default: 'dev'
outputs:
cache-hit:
description: 'Whether there was a cache hit'
value: ${{ steps.cache-restore-conan.outputs.cache-hit }}
runs:
using: 'composite'
steps:
- name: Generate safe branch name
if: inputs.cache_enabled == 'true'
id: safe-branch
shell: bash
run: |
SAFE_BRANCH=$(echo "${{ github.ref_name }}" | tr -c 'a-zA-Z0-9_.-' '-')
echo "name=${SAFE_BRANCH}" >> $GITHUB_OUTPUT
- name: Restore Conan cache
if: inputs.cache_enabled == 'true'
id: cache-restore-conan
uses: actions/cache/restore@v4
with:
path: |
~/.conan
~/.conan2
key: ${{ runner.os }}-conan-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ hashFiles('**/conanfile.txt', '**/conanfile.py') }}-${{ inputs.configuration }}
restore-keys: |
${{ runner.os }}-conan-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-${{ hashFiles('**/conanfile.txt', '**/conanfile.py') }}-
${{ runner.os }}-conan-v${{ inputs.cache_version }}-${{ inputs.compiler-id }}-
${{ runner.os }}-conan-v${{ inputs.cache_version }}-
- name: Export custom recipes
shell: bash
run: |
conan export external/snappy snappy/1.1.9@
conan export external/soci soci/4.0.3@
- name: Install dependencies
shell: bash
run: |
# Create build directory
mkdir -p ${{ inputs.build_dir }}
cd ${{ inputs.build_dir }}
# Install dependencies using conan
conan install \
--output-folder . \
--build missing \
--settings build_type=${{ inputs.configuration }} \
..
- name: Save Conan cache
if: inputs.cache_enabled == 'true' && steps.cache-restore-conan.outputs.cache-hit != 'true'
uses: actions/cache/save@v4
with:
path: |
~/.conan
~/.conan2
key: ${{ steps.cache-restore-conan.outputs.cache-primary-key }}
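Outside of CI, the steps above reduce to roughly the following (a sketch, with Debug standing in for the configuration input):

```
# export the pinned recipes carried in this repository, then resolve the rest
conan export external/snappy snappy/1.1.9@
conan export external/soci soci/4.0.3@
mkdir -p .build && cd .build
conan install .. --output-folder . --build missing --settings build_type=Debug
```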


@@ -2,159 +2,37 @@ name: Build using Docker
on:
push:
branches: ["dev", "candidate", "release", "ci-experiments"]
branches: [ "dev", "candidate", "release", "jshooks" ]
pull_request:
branches: ["dev", "candidate", "release", "ci-experiments"]
branches: [ "dev", "candidate", "release", "jshooks" ]
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
DEBUG_BUILD_CONTAINERS_AFTER_CLEANUP: 1
group: ${{ github.workflow }}
cancel-in-progress: false
jobs:
checkout:
runs-on: [self-hosted, vanity]
outputs:
checkout_path: ${{ steps.vars.outputs.checkout_path }}
steps:
- name: Prepare checkout path
id: vars
run: |
SAFE_BRANCH=$(echo "${{ github.ref_name }}" | sed -e 's/[^a-zA-Z0-9._-]/-/g')
CHECKOUT_PATH="${SAFE_BRANCH}-${{ github.sha }}"
echo "checkout_path=${CHECKOUT_PATH}" >> "$GITHUB_OUTPUT"
- uses: actions/checkout@v4
with:
path: ${{ steps.vars.outputs.checkout_path }}
clean: true
fetch-depth: 2 # Only get the last 2 commits, to avoid fetching all history
- uses: actions/checkout@v3
with:
clean: false
checkpatterns:
runs-on: [self-hosted, vanity]
needs: checkout
defaults:
run:
working-directory: ${{ needs.checkout.outputs.checkout_path }}
steps:
- name: Check for suspicious patterns
run: /bin/bash suspicious_patterns.sh
- name: Check for suspicious patterns
run: /bin/bash suspicious_patterns.sh
build:
runs-on: [self-hosted, vanity]
needs: [checkpatterns, checkout]
defaults:
run:
working-directory: ${{ needs.checkout.outputs.checkout_path }}
needs: checkpatterns
steps:
- name: Install Python & pipx
run: |
sudo apt update && sudo apt install -y python3 python3-pip pipx python-is-python3
- name: Install Conan
run: |
pipx install "conan<2.0"
/root/.local/bin/conan --version # PATH doesn't seem to be set correctly
- name: Setup network and Gitea
run: |
# Create network for container communication (idempotent)
docker network create conan-net 2>/dev/null || true
# Setup Gitea
PATH="/root/.local/bin:$PATH" python .ci/gitea.py setup --debug
# Connect Gitea to the network (idempotent)
docker network connect conan-net gitea-conan-persistent 2>/dev/null || true
# Verify it's connected
docker inspect gitea-conan-persistent -f '{{range $net,$v := .NetworkSettings.Networks}}{{$net}} {{end}}'
# - name: Test Gitea from build container
# run: |
# # Show conan-net details
# echo "=== Docker network 'conan-net' details ==="
# docker network inspect conan-net
#
# # Show what networks Gitea is connected to
# echo "=== Gitea container networks ==="
# docker inspect gitea-conan-persistent -f '{{json .NetworkSettings.Networks}}' | python -m json.tool
#
# # Check if DNS resolution works without adding to conan-net
# docker run --rm alpine nslookup gitea-conan-persistent || echo "⚠️ DNS resolution failed without conan-net"
#
# docker run --rm --network conan-net alpine sh -c "
# # First verify connectivity works
# apk add --no-cache curl >/dev/null 2>&1
# echo 'Testing DNS resolution...'
# nslookup gitea-conan-persistent
# echo 'Testing HTTP connection...'
# curl -s http://gitea-conan-persistent:3000 | head -n1
# "
# docker run --rm --network conan-net conanio/gcc11 bash -xec "
# # Configure Conan using the resolved IP
# conan remote add gitea-local http://gitea-conan-persistent:3000/api/packages/conan/conan
# conan user -p conan-pass-2024 -r gitea-local conan
#
# # Enable revisions to match the server expectation
# conan config set general.revisions_enabled=1
#
# # Test package upload/download
# conan install zlib/1.3.1@ --build=zlib
# conan upload 'zlib/*' --all -r gitea-local --confirm
# conan remove 'zlib/*' -f
# conan install zlib/1.3.1@ -r gitea-local
# echo '✅ Container-to-container test successful!'
# "
# - name: Build using Docker
# run: /bin/bash release-builder.sh
#
# - name: Stop Container (Cleanup)
# if: always()
# run: |
# echo "Running cleanup script: $JOB_CLEANUP_SCRIPT"
# /bin/bash -e -x "$JOB_CLEANUP_SCRIPT"
# CLEANUP_EXIT_CODE=$?
#
# if [[ "$CLEANUP_EXIT_CODE" -eq 0 ]]; then
# echo "Cleanup script succeeded."
# rm -f "$JOB_CLEANUP_SCRIPT"
# echo "Cleanup script removed."
# else
# echo "⚠️ Cleanup script failed! Keeping for debugging: $JOB_CLEANUP_SCRIPT"
# fi
#
# if [[ "${DEBUG_BUILD_CONTAINERS_AFTER_CLEANUP}" == "1" ]]; then
# echo "🔍 Checking for leftover containers..."
# BUILD_CONTAINERS=$(docker ps --format '{{.Names}}' | grep '^xahaud_cached_builder' || echo "")
#
# if [[ -n "$BUILD_CONTAINERS" ]]; then
# echo "⚠️ WARNING: Some build containers are still running"
# echo "$BUILD_CONTAINERS"
# else
# echo "✅ No build containers found"
# fi
# fi
- name: Build using Docker
run: /bin/bash release-builder.sh
tests:
runs-on: [self-hosted, vanity]
needs: [build, checkout]
defaults:
run:
working-directory: ${{ needs.checkout.outputs.checkout_path }}
needs: build
steps:
- name: Unit tests
run: PATH="/root/.local/bin:$PATH" python .ci/gitea.py test --debug
- name: Unit tests
run: /bin/bash docker-unit-tests.sh
cleanup:
runs-on: [self-hosted, vanity]
needs: [tests, checkout]
if: always()
steps:
- name: Cleanup workspace
run: |
CHECKOUT_PATH="${{ needs.checkout.outputs.checkout_path }}"
PATH="/root/.local/bin:$PATH" python "${CHECKOUT_PATH}/.ci/gitea.py" teardown --debug
echo "Cleaning workspace for ${CHECKOUT_PATH}"
rm -rf "${{ github.workspace }}/${CHECKOUT_PATH}"


@@ -4,32 +4,21 @@ on: [push, pull_request]
jobs:
check:
runs-on: ubuntu-22.04
runs-on: ubuntu-20.04
env:
CLANG_VERSION: 10
steps:
- uses: actions/checkout@v3
# - name: Install clang-format
# run: |
# codename=$( lsb_release --codename --short )
# sudo tee /etc/apt/sources.list.d/llvm.list >/dev/null <<EOF
# deb http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
# deb-src http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
# EOF
# wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add
# sudo apt-get update -y
# sudo apt-get install -y clang-format-${CLANG_VERSION}
# Temporary fix until this commit is merged
# https://github.com/XRPLF/rippled/commit/552377c76f55b403a1c876df873a23d780fcc81c
- name: Download and install clang-format
- name: Install clang-format
run: |
sudo apt-get update -y
sudo apt-get install -y libtinfo5
curl -LO https://github.com/llvm/llvm-project/releases/download/llvmorg-10.0.1/clang+llvm-10.0.1-x86_64-linux-gnu-ubuntu-16.04.tar.xz
tar -xf clang+llvm-10.0.1-x86_64-linux-gnu-ubuntu-16.04.tar.xz
sudo mv clang+llvm-10.0.1-x86_64-linux-gnu-ubuntu-16.04 /opt/clang-10
sudo ln -s /opt/clang-10/bin/clang-format /usr/local/bin/clang-format-10
codename=$( lsb_release --codename --short )
sudo tee /etc/apt/sources.list.d/llvm.list >/dev/null <<EOF
deb http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
deb-src http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
EOF
wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add
sudo apt-get update
sudo apt-get install clang-format-${CLANG_VERSION}
- name: Format src/ripple
run: find src/ripple -type f \( -name '*.cpp' -o -name '*.h' -o -name '*.ipp' \) -print0 | xargs -0 clang-format-${CLANG_VERSION} -i
- name: Format src/test
@@ -41,7 +30,7 @@ jobs:
git diff --exit-code | tee "clang-format.patch"
- name: Upload patch
if: failure() && steps.assert.outcome == 'failure'
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
continue-on-error: true
with:
name: clang-format.patch

.github/workflows/doxygen.yml

@@ -0,0 +1,25 @@
name: Build and publish Doxygen documentation
on:
push:
branches:
- dev
jobs:
job:
runs-on: ubuntu-latest
container:
image: docker://rippleci/rippled-ci-builder:2944b78d22db
steps:
- name: checkout
uses: actions/checkout@v2
- name: build
run: |
mkdir build
cd build
cmake -DBoost_NO_BOOST_CMAKE=ON ..
cmake --build . --target docs --parallel $(nproc)
- name: publish
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: build/docs/html
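The same docs target can be built outside the container (a sketch; it assumes Doxygen and Boost are available locally, as they are in the rippleci image):

```
mkdir -p build && cd build
cmake -DBoost_NO_BOOST_CMAKE=ON ..
cmake --build . --target docs --parallel "$(nproc)"
# the workflow publishes the resulting build/docs/html directory
```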


@@ -18,7 +18,7 @@ jobs:
git diff --exit-code | tee "levelization.patch"
- name: Upload patch
if: failure() && steps.assert.outcome == 'failure'
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
continue-on-error: true
with:
name: levelization.patch

.github/workflows/nix.yml

@@ -0,0 +1,95 @@
name: nix
on: [push, pull_request]
jobs:
test:
strategy:
matrix:
platform:
- ubuntu-latest
- macos-12
generator:
- Ninja
configuration:
- Release
runs-on: ${{ matrix.platform }}
env:
build_dir: .build
steps:
- name: checkout
uses: actions/checkout@v3
- name: install Ninja on Linux
if: matrix.generator == 'Ninja' && runner.os == 'Linux'
run: sudo apt install ninja-build
- name: install Ninja on OSX
if: matrix.generator == 'Ninja' && runner.os == 'macOS'
run: brew install ninja
- name: install nproc on OSX
if: runner.os == 'macOS'
run: brew install coreutils
- name: choose Python
uses: actions/setup-python@v3
with:
python-version: 3.9
- name: learn Python cache directory
id: pip-cache
run: |
sudo pip install --upgrade pip
echo "::set-output name=dir::$(pip cache dir)"
- name: restore Python cache directory
uses: actions/cache@v2
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-${{ hashFiles('.github/workflows/nix.yml') }}
- name: install Conan
run: pip install wheel 'conan>=1.52.0'
- name: check environment
run: |
echo ${PATH} | tr ':' '\n'
python --version
conan --version
cmake --version
env
- name: configure Conan
run: |
conan profile new default --detect
conan profile update settings.compiler.cppstd=20 default
- name: configure Conan on Linux
if: runner.os == 'Linux'
run: |
conan profile update settings.compiler.libcxx=libstdc++11 default
- name: learn Conan cache directory
id: conan-cache
run: |
echo "::set-output name=dir::$(conan config get storage.path)"
- name: restore Conan cache directory
uses: actions/cache@v2
with:
path: ${{ steps.conan-cache.outputs.dir }}
key: ${{ hashFiles('~/.conan/profiles/default', 'conanfile.py', 'external/rocksdb/*', '.github/workflows/nix.yml') }}
- name: export RocksDB
run: conan export external/rocksdb
- name: install dependencies
run: |
mkdir ${build_dir}
cd ${build_dir}
conan install .. --build missing --settings build_type=${{ matrix.configuration }} --profile:build default --profile:host default
- name: configure
run: |
cd ${build_dir}
cmake \
-G ${{ matrix.generator }} \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${{ matrix.configuration }} \
-Dassert=ON \
-Dcoverage=OFF \
-Dreporting=OFF \
-Dunity=OFF \
..
- name: build
run: |
cmake --build ${build_dir} --target rippled --parallel $(nproc)
- name: test
run: |
${build_dir}/rippled --unittest --unittest-jobs $(nproc)

.github/workflows/windows.yml

@@ -0,0 +1,89 @@
name: windows
# We have disabled this workflow because it fails in our CI Windows
# environment, but we cannot replicate the failure in our personal Windows
# test environments, nor have we gone through the trouble of setting up an
# interactive CI Windows environment.
# We welcome contributions to diagnose or debug the problems on Windows. Until
# then, we leave this tombstone as a reminder that we have tried (but failed)
# to write a reliable test for Windows.
# on: [push, pull_request]
jobs:
test:
strategy:
matrix:
generator:
- Visual Studio 16 2019
configuration:
- Release
runs-on: windows-2019
env:
build_dir: .build
steps:
- name: checkout
uses: actions/checkout@v3
- name: choose Python
uses: actions/setup-python@v3
with:
python-version: 3.9
- name: learn Python cache directory
id: pip-cache
run: |
pip install --upgrade pip
echo "::set-output name=dir::$(pip cache dir)"
- name: restore Python cache directory
uses: actions/cache@v2
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-${{ hashFiles('.github/workflows/windows.yml') }}
- name: install Conan
run: pip install wheel 'conan>=1.52.0'
- name: check environment
run: |
$env:PATH -split ';'
python --version
conan --version
cmake --version
dir env:
- name: configure Conan
run: |
conan profile new default --detect
conan profile update settings.compiler.cppstd=20 default
conan profile update settings.compiler.runtime=MT default
conan profile update settings.compiler.toolset=v141 default
- name: learn Conan cache directory
id: conan-cache
run: |
echo "::set-output name=dir::$(conan config get storage.path)"
- name: restore Conan cache directory
uses: actions/cache@v2
with:
path: ${{ steps.conan-cache.outputs.dir }}
key: ${{ hashFiles('~/.conan/profiles/default', 'conanfile.py', 'external/rocksdb/*', '.github/workflows/windows.yml') }}
- name: export RocksDB
run: conan export external/rocksdb
- name: install dependencies
run: |
mkdir $env:build_dir
cd $env:build_dir
conan install .. --build missing --settings build_type=${{ matrix.configuration }}
- name: configure
run: |
$env:build_dir
cd $env:build_dir
pwd
ls
cmake `
-G "${{ matrix.generator }}" `
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake `
-Dassert=ON `
-Dreporting=OFF `
-Dunity=OFF `
..
- name: build
run: |
cmake --build $env:build_dir --target rippled --config ${{ matrix.configuration }} --parallel $env:NUMBER_OF_PROCESSORS
- name: test
run: |
& "$env:build_dir\${{ matrix.configuration }}\rippled.exe" --unittest


@@ -1,116 +0,0 @@
name: MacOS - GA Runner
on:
push:
branches: ["dev", "candidate", "release"]
pull_request:
branches: ["dev", "candidate", "release"]
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
test:
strategy:
matrix:
generator:
- Ninja
configuration:
- Debug
runs-on: macos-15
env:
build_dir: .build
# Bump this number to invalidate all caches globally.
CACHE_VERSION: 1
MAIN_BRANCH_NAME: dev
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install Conan
run: |
brew install conan@1
# Add Conan 1 to the PATH for this job
echo "$(brew --prefix conan@1)/bin" >> $GITHUB_PATH
- name: Install Coreutils
run: |
brew install coreutils
echo "Num proc: $(nproc)"
- name: Install Ninja
if: matrix.generator == 'Ninja'
run: brew install ninja
- name: Install Python
run: |
if which python3 > /dev/null 2>&1; then
echo "Python 3 executable exists"
python3 --version
else
brew install python@3.12
fi
# Create 'python' symlink if it doesn't exist (for tools expecting 'python')
if ! which python > /dev/null 2>&1; then
sudo ln -sf $(which python3) /usr/local/bin/python
fi
- name: Install CMake
run: |
if which cmake > /dev/null 2>&1; then
echo "cmake executable exists"
cmake --version
else
brew install cmake
fi
- name: Install ccache
run: brew install ccache
- name: Configure ccache
uses: ./.github/actions/xahau-configure-ccache
with:
max_size: 2G
hash_dir: true
compiler_check: content
- name: Check environment
run: |
echo "PATH:"
echo "${PATH}" | tr ':' '\n'
which python && python --version || echo "Python not found"
which conan && conan --version || echo "Conan not found"
which cmake && cmake --version || echo "CMake not found"
clang --version
ccache --version
echo "---- Full Environment ----"
env
- name: Configure Conan
run: |
conan profile new default --detect || true # Ignore error if profile exists
conan profile update settings.compiler.cppstd=20 default
- name: Install dependencies
uses: ./.github/actions/xahau-ga-dependencies
with:
configuration: ${{ matrix.configuration }}
build_dir: ${{ env.build_dir }}
compiler-id: clang
cache_version: ${{ env.CACHE_VERSION }}
main_branch: ${{ env.MAIN_BRANCH_NAME }}
- name: Build
uses: ./.github/actions/xahau-ga-build
with:
generator: ${{ matrix.generator }}
configuration: ${{ matrix.configuration }}
build_dir: ${{ env.build_dir }}
compiler-id: clang
cache_version: ${{ env.CACHE_VERSION }}
main_branch: ${{ env.MAIN_BRANCH_NAME }}
- name: Test
run: |
${{ env.build_dir }}/rippled --unittest --unittest-jobs $(nproc)


@@ -1,123 +0,0 @@
name: Nix - GA Runner
on:
push:
branches: ["dev", "candidate", "release"]
pull_request:
branches: ["dev", "candidate", "release"]
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
build-job:
runs-on: ubuntu-latest
outputs:
artifact_name: ${{ steps.set-artifact-name.outputs.artifact_name }}
strategy:
fail-fast: false
matrix:
compiler: [gcc]
configuration: [Debug]
include:
- compiler: gcc
cc: gcc-11
cxx: g++-11
compiler_id: gcc-11
env:
build_dir: .build
# Bump this number to invalidate all caches globally.
CACHE_VERSION: 1
MAIN_BRANCH_NAME: dev
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install build dependencies
run: |
sudo apt-get update
sudo apt-get install -y ninja-build ${{ matrix.cc }} ${{ matrix.cxx }} ccache
# Install specific Conan version needed
pip install --upgrade "conan<2"
- name: Configure ccache
uses: ./.github/actions/xahau-configure-ccache
with:
max_size: 2G
hash_dir: true
compiler_check: content
- name: Configure Conan
run: |
conan profile new default --detect || true # Ignore error if profile exists
conan profile update settings.compiler.cppstd=20 default
conan profile update settings.compiler=${{ matrix.compiler }} default
conan profile update settings.compiler.libcxx=libstdc++11 default
conan profile update env.CC=/usr/bin/${{ matrix.cc }} default
conan profile update env.CXX=/usr/bin/${{ matrix.cxx }} default
conan profile update conf.tools.build:compiler_executables='{"c": "/usr/bin/${{ matrix.cc }}", "cpp": "/usr/bin/${{ matrix.cxx }}"}' default
# Set correct compiler version based on matrix.compiler
if [ "${{ matrix.compiler }}" = "gcc" ]; then
conan profile update settings.compiler.version=11 default
elif [ "${{ matrix.compiler }}" = "clang" ]; then
conan profile update settings.compiler.version=14 default
fi
# Display profile for verification
conan profile show default
- name: Check environment
run: |
echo "PATH:"
echo "${PATH}" | tr ':' '\n'
which conan && conan --version || echo "Conan not found"
which cmake && cmake --version || echo "CMake not found"
which ${{ matrix.cc }} && ${{ matrix.cc }} --version || echo "${{ matrix.cc }} not found"
which ${{ matrix.cxx }} && ${{ matrix.cxx }} --version || echo "${{ matrix.cxx }} not found"
which ccache && ccache --version || echo "ccache not found"
echo "---- Full Environment ----"
env
- name: Install dependencies
uses: ./.github/actions/xahau-ga-dependencies
with:
configuration: ${{ matrix.configuration }}
build_dir: ${{ env.build_dir }}
compiler-id: ${{ matrix.compiler_id }}
cache_version: ${{ env.CACHE_VERSION }}
main_branch: ${{ env.MAIN_BRANCH_NAME }}
- name: Build
uses: ./.github/actions/xahau-ga-build
with:
generator: Ninja
configuration: ${{ matrix.configuration }}
build_dir: ${{ env.build_dir }}
cc: ${{ matrix.cc }}
cxx: ${{ matrix.cxx }}
compiler-id: ${{ matrix.compiler_id }}
cache_version: ${{ env.CACHE_VERSION }}
main_branch: ${{ env.MAIN_BRANCH_NAME }}
- name: Set artifact name
id: set-artifact-name
run: |
ARTIFACT_NAME="build-output-nix-${{ github.run_id }}-${{ matrix.compiler }}-${{ matrix.configuration }}"
echo "artifact_name=${ARTIFACT_NAME}" >> "$GITHUB_OUTPUT"
echo "Using artifact name: ${ARTIFACT_NAME}"
- name: Debug build directory
run: |
echo "Checking build directory contents: ${{ env.build_dir }}"
ls -la ${{ env.build_dir }} || echo "Build directory not found or empty"
- name: Run tests
run: |
# Ensure the binary exists before trying to run
if [ -f "${{ env.build_dir }}/rippled" ]; then
${{ env.build_dir }}/rippled --unittest --unittest-jobs $(nproc)
else
echo "Error: rippled executable not found in ${{ env.build_dir }}"
exit 1
fi

.gitignore

@@ -114,5 +114,3 @@ pkg_out
pkg
CMakeUserPresets.json
bld.rippled/
generated


@@ -3,7 +3,7 @@
"C_Cpp.clang_format_path": ".clang-format",
"C_Cpp.clang_format_fallbackStyle": "{ ColumnLimit: 0 }",
"[cpp]":{
"editor.wordBasedSuggestions": "off",
"editor.wordBasedSuggestions": false,
"editor.suggest.insertMode": "replace",
"editor.semanticHighlighting.enabled": true,
"editor.tabSize": 4,

BUILD.md

@@ -1,12 +1,3 @@
> These instructions assume you have a C++ development environment ready
> with Git, Python, Conan, CMake, and a C++ compiler. For help setting one up
> on Linux, macOS, or Windows, see [our guide](./docs/build/environment.md).
>
> These instructions also assume a basic familiarity with Conan and CMake.
> If you are unfamiliar with Conan,
> you can read our [crash course](./docs/build/conan.md)
> or the official [Getting Started][3] walkthrough.
## Branches
For a stable release, choose the `master` branch or one of the [tagged
@@ -22,297 +13,216 @@ For the latest release candidate, choose the `release` branch.
git checkout release
```
For the latest set of untested features, or to contribute, choose the `develop`
branch.
If you are contributing or want the latest set of untested features,
then use the `develop` branch.
```
git checkout develop
```
## Minimum Requirements
## Platforms
- [Python 3.7](https://www.python.org/downloads/)
- [Conan 1.55](https://conan.io/downloads.html)
- [CMake 3.16](https://cmake.org/download/)
We do not recommend Windows for rippled production use at this time. Currently,
the Ubuntu platform has received the highest level of quality assurance,
testing, and support. Additionally, 32-bit Windows development is not supported.
`rippled` is written in the C++20 dialect and includes the `<concepts>` header.
The [minimum compiler versions][2] required are:
Visual Studio 2022 is not yet supported.
This is because rippled is not compatible with [Boost][] versions 1.78 or 1.79,
but Conan cannot build Boost versions released earlier than them with VS 2022.
We expect that rippled will be compatible with Boost 1.80, which should be
released in August 2022.
Until then, we advise Windows developers to use Visual Studio 2019.
| Compiler | Version |
|-------------|---------|
| GCC | 10 |
| Clang | 13 |
| Apple Clang | 13.1.6 |
| MSVC | 19.23 |
We don't recommend Windows for `rippled` production at this time. As of
January 2023, Ubuntu has the highest level of quality assurance, testing,
and support.
Windows developers should use Visual Studio 2019. `rippled` isn't
compatible with [Boost](https://www.boost.org/) 1.78 or 1.79, and Conan
can't build earlier Boost versions.
**Note:** 32-bit Windows development isn't supported.
[Boost]: https://www.boost.org/
## Steps
## Prerequisites
To build this package, you will need Python (>= 3.7),
[Conan][] (>= 1.52), and [CMake][] (>= 3.16).
If you are unfamiliar with Conan,
there is a crash course at the end of this document.
[Conan]: https://conan.io/downloads.html
[CMake]: https://cmake.org/download/
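A quick way to sanity-check your toolchain against these floors (the version numbers are the ones stated above; on some platforms the interpreter is `python3`):
```
python --version   # expect >= 3.7
conan --version    # expect >= 1.52
cmake --version    # expect >= 3.16
```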
You'll need to compile in the C++20 dialect:
```
conan profile update settings.compiler.cppstd=20 default
```
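You can verify the change took effect by printing the profile, the same verification step the CI workflow above uses:
```
conan profile show default
```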
Linux developers will commonly have a default Conan [profile][] that compiles
with GCC and links with libstdc++.
If you are linking with libstdc++ (see profile setting `compiler.libcxx`),
then you will need to choose the `libstdc++11` ABI:
```
conan profile update settings.compiler.libcxx=libstdc++11 default
```
We find it necessary to use the x64 native build tools on Windows.
An easy way to do that is to run the shortcut "x64 Native Tools Command
Prompt" for the version of Visual Studio that you have installed.
Windows developers must build rippled and its dependencies for the x64
architecture:
```
conan profile update settings.arch=x86_64 default
```
### Set Up Conan
## How to build and test
1. (Optional) If you've never used Conan, use autodetect to set up a default profile.
Let's start with a couple of examples of common workflows.
The first is for a single-configuration generator (e.g. Unix Makefiles) on
Linux or macOS:
```
conan profile new default --detect
```
```
conan export external/rocksdb
mkdir .build
cd .build
conan install .. --output-folder . --build missing --settings build_type=Release
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
cmake --build .
./rippled --unittest
```
2. Update the compiler settings.
The second is for a multi-configuration generator (e.g. Visual Studio) on
Windows:
```
conan profile update settings.compiler.cppstd=20 default
```
```
conan export external/rocksdb
mkdir .build
cd .build
conan install .. --output-folder . --build missing --settings build_type=Release --settings compiler.runtime=MT
conan install .. --output-folder . --build missing --settings build_type=Debug --settings compiler.runtime=MTd
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
cmake --build . --config Release
cmake --build . --config Debug
./Release/rippled --unittest
./Debug/rippled --unittest
```
Linux developers will commonly have a default Conan [profile][] that compiles
with GCC and links with libstdc++.
If you are linking with libstdc++ (see profile setting `compiler.libcxx`),
then you will need to choose the `libstdc++11` ABI.
Here we explain the individual steps:
```
conan profile update settings.compiler.libcxx=libstdc++11 default
```
1. Export our [Conan recipe for RocksDB](./external/rocksdb).
On Windows, you should use the x64 native build tools.
An easy way to do that is to run the shortcut "x64 Native Tools Command
Prompt" for the version of Visual Studio that you have installed.
It builds version 6.27.3, which, as of July 8, 2022,
is not available in [Conan Center](https://conan.io/center/rocksdb).
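The export command itself is the one shown in the workflow examples above:
```
conan export external/rocksdb
```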
Windows developers must also build `rippled` and its dependencies for the x64
architecture.
1. Create a build directory (and move into it).
```
conan profile update settings.arch=x86_64 default
```
You can choose any name you want.
3. (Optional) If you have multiple compilers installed on your platform,
make sure that Conan and CMake select the one you want to use.
This setting will set the correct variables (`CMAKE_<LANG>_COMPILER`)
in the generated CMake toolchain file.
Conan will generate some files in what it calls the "install folder".
These files are implementation details that you don't need to worry about.
By default, the install folder is your current working directory.
If you don't move into your build directory before calling Conan,
then you may be annoyed to see it polluting your project root directory
with these files.
To make Conan put them in your build directory,
you'll have to add the option
`--install-folder` or `-if` to every `conan install` command.
```
conan profile update 'conf.tools.build:compiler_executables={"c": "<path>", "cpp": "<path>"}' default
```
It should choose the compiler for dependencies as well,
but not all of them have a Conan recipe that respects this setting (yet).
For the rest, you can set these environment variables:
```
conan profile update env.CC=<path> default
conan profile update env.CXX=<path> default
```
4. Export our [Conan recipe for Snappy](./external/snappy).
It doesn't explicitly link the C++ standard library,
which allows you to statically link it with GCC, if you want.
```
conan export external/snappy snappy/1.1.9@
```
5. Export our [Conan recipe for SOCI](./external/soci).
It patches their CMake to correctly import its dependencies.
```
conan export external/soci soci/4.0.3@
```
### Build and Test
1. Create a build directory and move into it.
```
mkdir .build
cd .build
```
You can use any directory name. Conan treats your working directory as an
install folder and generates files with implementation details.
You don't need to worry about these files, but make sure to change
your working directory to your build directory before calling Conan.
**Note:** You can specify a directory for the installation files by adding
the `install-folder` or `-if` option to every `conan install` command
in the next step.
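As a sketch of that variant (assuming Conan 1.x, run from the project root, with `.build` as the install folder):
```
conan install . --install-folder .build --build missing --settings build_type=Release
```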
2. Generate CMake files for every configuration you want to build.
```
conan install .. --output-folder . --build missing --settings build_type=Release
conan install .. --output-folder . --build missing --settings build_type=Debug
```
1. Generate CMake files for every configuration you want to build.
For a single-configuration generator, e.g. `Unix Makefiles` or `Ninja`,
you only need to run this command once.
For a multi-configuration generator, e.g. `Visual Studio`, you may want to
run it more than once.
Each of these commands should also have a different `build_type` setting.
A second command with the same `build_type` setting will overwrite the files
generated by the first. You can pass the build type on the command line with
`--settings build_type=$BUILD_TYPE` or in the profile itself,
under the section `[settings]` with the key `build_type`.
If you are using a Microsoft Visual C++ compiler,
then you will need to ensure consistency between the `build_type` setting
and the `compiler.runtime` setting.
Each of these commands should have a different `build_type` setting.
A second command with the same `build_type` setting will just overwrite
the files generated by the first.
You can pass the build type on the command line with `--settings
build_type=$BUILD_TYPE` or in the profile itself, under the section
`[settings]`, with the key `build_type`.
If you are using a Microsoft Visual C++ compiler, then you will need to
ensure consistency between the `build_type` setting and the
`compiler.runtime` setting.
When `build_type` is `Release`, `compiler.runtime` should be `MT`.
When `build_type` is `Debug`, `compiler.runtime` should be `MTd`.
```
conan install .. --output-folder . --build missing --settings build_type=Release --settings compiler.runtime=MT
conan install .. --output-folder . --build missing --settings build_type=Debug --settings compiler.runtime=MTd
```
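A sketch of the profile-based alternative mentioned above, which makes `Release` the default build type for the `default` profile:
```
conan profile update settings.build_type=Release default
```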
1. Configure CMake once.
3. Configure CMake and pass the toolchain file generated by Conan, located at
`$OUTPUT_FOLDER/build/generators/conan_toolchain.cmake`.
For all choices of generator, pass the toolchain file generated by Conan.
It will be located at
`$OUTPUT_FOLDER/build/generators/conan_toolchain.cmake`.
If you are using a single-configuration generator, then pass the CMake
variable [`CMAKE_BUILD_TYPE`][build_type] and make sure it matches the
`build_type` setting you chose in the previous step.
Single-config generators:
```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
```
This step is where you may pass build options for rippled.
Pass the CMake variable [`CMAKE_BUILD_TYPE`][build_type]
and make sure it matches the `build_type` setting you chose in the previous
step.
1. Build rippled.
Multi-config generators:
```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
```
**Note:** You can pass build options for `rippled` in this step.
4. Build `rippled`.
For a single-configuration generator, it will build whatever configuration
you passed for `CMAKE_BUILD_TYPE`. For a multi-configuration generator,
you must pass the option `--config` to select the build configuration.
Single-config generators:
```
cmake --build .
```
Multi-config generators:
```
cmake --build . --config Release
cmake --build . --config Debug
```
For a multi-configuration generator, you must pass the option `--config`
to select the build configuration.
For a single-configuration generator, it will build whatever configuration
you passed for `CMAKE_BUILD_TYPE`.
5. Test rippled.
Single-config generators:
```
./rippled --unittest
```
Multi-config generators:
```
./Release/rippled --unittest
./Debug/rippled --unittest
```
The location of `rippled` in your build directory depends on your CMake
generator. Pass `--help` to see the rest of the command line options.
The exact location of rippled in your build directory
depends on your choice of CMake generator.
You can run unit tests by passing `--unittest`.
Pass `--help` to see the rest of the command line options.
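For example, the CI workflow above runs the unit tests in parallel across all cores:
```
./rippled --unittest --unittest-jobs $(nproc)
```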
## Options
### Options
| Option | Default Value | Description |
| --- | ---| ---|
| `assert` | OFF | Enable assertions. |
| `reporting` | OFF | Build the reporting mode feature. |
| `tests` | ON | Build tests. |
| `unity` | ON | Configure a unity build. |
| `san` | N/A | Enable a sanitizer with Clang. Choices are `thread` and `address`. |
The `unity` option allows you to select between [unity][5] and non-unity
builds.
Unity builds may be faster for the first build (at the cost of much
more memory) since they concatenate sources into fewer translation
units.
Non-unity builds may be faster for incremental builds, and can be helpful for
detecting `#include` omissions.
[Unity builds][5] may be faster for the first build
(at the cost of much more memory) since they concatenate sources into fewer
translation units. Non-unity builds may be faster for incremental builds,
and can be helpful for detecting `#include` omissions.
Below are the most commonly used options,
with their default values in parentheses.
- `assert` (OFF): Enable assertions.
- `reporting` (OFF): Build the reporting mode feature.
- `tests` (ON): Build tests.
- `unity` (ON): Configure a [unity build][5].
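As a sketch, these options are passed as CMake variables at configure time, alongside the toolchain file from the earlier step:
```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Debug -Dassert=ON -Dunity=OFF ..
```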
## Troubleshooting
### Troubleshooting
### Conan
If you have trouble building dependencies after changing Conan settings,
try removing the Conan cache.
```
rm -rf ~/.conan/data
```
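An equivalent that goes through Conan's own CLI (Conan 1.x) instead of deleting the directory by hand:
```
conan remove "*" --force
```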
### no std::result_of
If your compiler version is recent enough to have removed `std::result_of` as
part of C++20, e.g. Apple Clang 15.0, then you might need to add a preprocessor
definition to your build.
```
conan profile update 'options.boost:extra_b2_flags="define=BOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'env.CFLAGS="-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'conf.tools.build:cflags+=["-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"]' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"]' default
```
### recompile with -fPIC
If you get a linker error suggesting that you recompile Boost with
position-independent code, such as:
```
/usr/bin/ld.gold: error: /home/username/.conan/data/boost/1.77.0/_/_/package/.../lib/libboost_container.a(alloc_lib.o):
requires unsupported dynamic reloc 11; recompile with -fPIC
```
Conan most likely downloaded a bad binary distribution of the dependency.
This seems to be a [bug][1] in Conan just for Boost 1.77.0 compiled with GCC
for Linux. The solution is to build the dependency locally by passing
`--build boost` when calling `conan install`.
If you get a linker error like the one below suggesting that you recompile
Boost with position-independent code, the reason is most likely that Conan
downloaded a bad binary distribution of the dependency.
For now, this seems to be a [bug][1] in Conan just for Boost 1.77.0 compiled
with GCC for Linux.
The solution is to build the dependency locally by passing `--build boost`
when calling `conan install`.
```
/usr/bin/ld.gold: error: /home/username/.conan/data/boost/1.77.0/_/_/package/dc8aedd23a0f0a773a5fcdcfe1ae3e89c4205978/lib/libboost_container.a(alloc_lib.o): requires unsupported dynamic reloc 11; recompile with -fPIC
```
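A sketch of that fix, layered onto the install command from the build steps above (`--build` flags compose in Conan 1.x):
```
conan install .. --output-folder . --build missing --build boost --settings build_type=Release
```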
## Add a Dependency
## How to add a dependency
If you want to experiment with a new package, follow these steps:
If you want to experiment with a new package, here are the steps to get it
working:
1. Search for the package on [Conan Center](https://conan.io/center/).
2. Modify [`conanfile.py`](./conanfile.py):
- Add a version of the package to the `requires` property.
- Change any default options for the package by adding them to the
`default_options` property (with syntax `'$package:$option': $value`).
3. Modify [`CMakeLists.txt`](./CMakeLists.txt):
- Add a call to `find_package($package REQUIRED)`.
- Link a library from the package to the target `ripple_libs`
(search for the existing call to `target_link_libraries(ripple_libs INTERFACE ...)`).
4. Start coding! Don't forget to include whatever headers you need from the package.
1. In [`conanfile.py`](./conanfile.py):
1. Add a version of the package to the `requires` property.
1. Change any default options for the package by adding them to the
`default_options` property (with syntax `'$package:$option': $value`)
1. In [`CMakeLists.txt`](./CMakeLists.txt):
1. Add a call to `find_package($package REQUIRED)`.
1. Link a library from the package to the target `ripple_libs` (search for
the existing call to `target_link_libraries(ripple_libs INTERFACE ...)`).
1. Start coding! Don't forget to include whatever headers you need from the
package.
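If you prefer the command line over the website for the first step, Conan 1.x can query the remote directly. This assumes the default `conancenter` remote and uses `zlib` purely as an illustration:
```
conan search "zlib*" --remote conancenter
```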
## A crash course in CMake and Conan
@@ -439,4 +349,3 @@ but it is more convenient to put them in a [profile][profile].
[search]: https://cmake.org/cmake/help/latest/command/find_package.html#search-procedure
[prefix_path]: https://cmake.org/cmake/help/latest/variable/CMAKE_PREFIX_PATH.html
[profile]: https://docs.conan.io/en/latest/reference/profiles.html


@@ -1,62 +0,0 @@
set (RocksDB_DIR "" CACHE PATH "Root directory of RocksDB distribution")
find_path (RocksDB_INCLUDE_DIR
rocksdb/db.h
PATHS ${RocksDB_DIR})
set (RocksDB_VERSION "")
find_file (RocksDB_VERSION_FILE
rocksdb/version.h
PATHS ${RocksDB_DIR})
if (RocksDB_VERSION_FILE)
file (READ ${RocksDB_VERSION_FILE} _verfile)
if ("${_verfile}" MATCHES "#define[ \\t]+ROCKSDB_MAJOR[ \\t]+([0-9]+)")
string (APPEND RocksDB_VERSION "${CMAKE_MATCH_1}")
else ()
string (APPEND RocksDB_VERSION "0")
endif()
if ("${_verfile}" MATCHES "#define[ \\t]+ROCKSDB_MINOR[ \\t]+([0-9]+)")
string (APPEND RocksDB_VERSION ".${CMAKE_MATCH_1}")
else ()
string (APPEND RocksDB_VERSION ".0")
endif()
if ("${_verfile}" MATCHES "#define[ \\t]+ROCKSDB_PATCH[ \\t]+([0-9]+)")
string (APPEND RocksDB_VERSION ".${CMAKE_MATCH_1}")
else ()
string (APPEND RocksDB_VERSION ".0")
endif()
endif ()
if (RocksDB_USE_STATIC)
list (APPEND RocksDB_NAMES
"${CMAKE_STATIC_LIBRARY_PREFIX}rocksdb${CMAKE_STATIC_LIBRARY_SUFFIX}"
"${CMAKE_STATIC_LIBRARY_PREFIX}rocksdblib${CMAKE_STATIC_LIBRARY_SUFFIX}")
endif ()
list (APPEND RocksDB_NAMES rocksdb)
find_library (RocksDB_LIBRARY NAMES ${RocksDB_NAMES}
PATHS
${RocksDB_DIR}
${RocksDB_DIR}/bin/Release
${RocksDB_DIR}/bin64_vs2013/Release
PATH_SUFFIXES lib lib64)
foreach (_n RocksDB_NAMES)
list (APPEND RocksDB_NAMES_DBG "${_n}_d" "${_n}d")
endforeach ()
find_library (RocksDB_LIBRARY_DEBUG NAMES ${RocksDB_NAMES_DBG}
PATHS
${RocksDB_DIR}
${RocksDB_DIR}/bin/Debug
${RocksDB_DIR}/bin64_vs2013/Debug
PATH_SUFFIXES lib lib64)
include (FindPackageHandleStandardArgs)
find_package_handle_standard_args (RocksDB
REQUIRED_VARS RocksDB_LIBRARY RocksDB_INCLUDE_DIR
VERSION_VAR RocksDB_VERSION)
mark_as_advanced (RocksDB_INCLUDE_DIR RocksDB_LIBRARY)
set (RocksDB_INCLUDE_DIRS ${RocksDB_INCLUDE_DIR})
set (RocksDB_LIBRARIES ${RocksDB_LIBRARY})


@@ -1,18 +0,0 @@
These are modules and sources that support our CMake build.
== FindBoost.cmake ==
In order to facilitate updating to latest releases of boost, we've made a local
copy of the FindBoost cmake module in our repo. The latest official version can
generally be obtained
[here](https://github.com/Kitware/CMake/blob/master/Modules/FindBoost.cmake).
The latest version provided by Kitware can be tailored for use with the
version of CMake that it ships with (typically the next upcoming CMake
release). As such, the latest version from the repository might not work
perfectly with older versions of CMake - for instance, the latest version
might use features or properties only available in the version of CMake that
it ships with. Given this, it's best to test any updates to this module with a few
different versions of cmake.


@@ -23,11 +23,6 @@ else()
message(STATUS "ACL not found, continuing without ACL support")
endif()
add_library(libxrpl INTERFACE)
target_link_libraries(libxrpl INTERFACE xrpl_core)
add_library(xrpl::libxrpl ALIAS libxrpl)
#[===============================[
beast/legacy FILES:
TODO: review these sources for removal or replacement
@@ -397,7 +392,6 @@ target_sources (rippled PRIVATE
src/ripple/app/misc/NegativeUNLVote.cpp
src/ripple/app/misc/NetworkOPs.cpp
src/ripple/app/misc/SHAMapStoreImp.cpp
src/ripple/app/misc/StateAccounting.cpp
src/ripple/app/misc/detail/impl/WorkSSL.cpp
src/ripple/app/misc/impl/AccountTxPaging.cpp
src/ripple/app/misc/impl/AmendmentTable.cpp
@@ -440,7 +434,6 @@ target_sources (rippled PRIVATE
src/ripple/app/tx/impl/CashCheck.cpp
src/ripple/app/tx/impl/Change.cpp
src/ripple/app/tx/impl/ClaimReward.cpp
src/ripple/app/tx/impl/Clawback.cpp
src/ripple/app/tx/impl/CreateCheck.cpp
src/ripple/app/tx/impl/CreateOffer.cpp
src/ripple/app/tx/impl/CreateTicket.cpp
@@ -462,7 +455,6 @@ target_sources (rippled PRIVATE
src/ripple/app/tx/impl/Remit.cpp
src/ripple/app/tx/impl/SetAccount.cpp
src/ripple/app/tx/impl/SetHook.cpp
src/ripple/app/tx/impl/SetRemarks.cpp
src/ripple/app/tx/impl/SetRegularKey.cpp
src/ripple/app/tx/impl/SetSignerList.cpp
src/ripple/app/tx/impl/SetTrust.cpp
@@ -546,9 +538,7 @@ target_sources (rippled PRIVATE
subdir: nodestore
#]===============================]
src/ripple/nodestore/backend/CassandraFactory.cpp
src/ripple/nodestore/backend/RWDBFactory.cpp
src/ripple/nodestore/backend/MemoryFactory.cpp
src/ripple/nodestore/backend/FlatmapFactory.cpp
src/ripple/nodestore/backend/NuDBFactory.cpp
src/ripple/nodestore/backend/NullFactory.cpp
src/ripple/nodestore/backend/RocksDBFactory.cpp
@@ -613,7 +603,6 @@ target_sources (rippled PRIVATE
src/ripple/rpc/handlers/BlackList.cpp
src/ripple/rpc/handlers/BookOffers.cpp
src/ripple/rpc/handlers/CanDelete.cpp
src/ripple/rpc/handlers/Catalogue.cpp
src/ripple/rpc/handlers/Connect.cpp
src/ripple/rpc/handlers/ConsensusInfo.cpp
src/ripple/rpc/handlers/CrawlShards.cpp
@@ -669,7 +658,6 @@ target_sources (rippled PRIVATE
src/ripple/rpc/handlers/ValidatorListSites.cpp
src/ripple/rpc/handlers/Validators.cpp
src/ripple/rpc/handlers/WalletPropose.cpp
src/ripple/rpc/handlers/Catalogue.cpp
src/ripple/rpc/impl/DeliveredAmount.cpp
src/ripple/rpc/impl/Handler.cpp
src/ripple/rpc/impl/LegacyPathFind.cpp
@@ -681,9 +669,6 @@ target_sources (rippled PRIVATE
src/ripple/rpc/impl/ShardVerificationScheduler.cpp
src/ripple/rpc/impl/Status.cpp
src/ripple/rpc/impl/TransactionSign.cpp
src/ripple/rpc/impl/NFTokenID.cpp
src/ripple/rpc/impl/NFTokenOfferID.cpp
src/ripple/rpc/impl/NFTSyntheticSerializer.cpp
#[===============================[
main sources:
subdir: perflog
@@ -722,7 +707,6 @@ if (tests)
src/test/app/BaseFee_test.cpp
src/test/app/Check_test.cpp
src/test/app/ClaimReward_test.cpp
src/test/app/Clawback_test.cpp
src/test/app/CrossingLimits_test.cpp
src/test/app/DeliverMin_test.cpp
src/test/app/DepositAuth_test.cpp
@@ -753,7 +737,6 @@ if (tests)
src/test/app/Path_test.cpp
src/test/app/PayChan_test.cpp
src/test/app/PayStrand_test.cpp
src/test/app/PreviousTxn_test.cpp
src/test/app/PseudoTx_test.cpp
src/test/app/RCLCensorshipDetector_test.cpp
src/test/app/RCLValidations_test.cpp
@@ -761,15 +744,11 @@ if (tests)
src/test/app/Remit_test.cpp
src/test/app/SHAMapStore_test.cpp
src/test/app/SetAuth_test.cpp
src/test/app/SetHook_test.cpp
src/test/app/SetHookTSH_test.cpp
src/test/app/SetRegularKey_test.cpp
src/test/app/SetRemarks_test.cpp
src/test/app/SetTrust_test.cpp
src/test/app/Taker_test.cpp
src/test/app/TheoreticalQuality_test.cpp
src/test/app/Ticket_test.cpp
src/test/app/Touch_test.cpp
src/test/app/Transaction_ordering_test.cpp
src/test/app/TrustAndBalance_test.cpp
src/test/app/TxQ_test.cpp
@@ -777,6 +756,8 @@ if (tests)
src/test/app/ValidatorKeys_test.cpp
src/test/app/ValidatorList_test.cpp
src/test/app/ValidatorSite_test.cpp
src/test/app/SetHook_test.cpp
src/test/app/SetHookTSH_test.cpp
src/test/app/Wildcard_test.cpp
src/test/app/XahauGenesis_test.cpp
src/test/app/tx/apply_test.cpp
@@ -910,13 +891,11 @@ if (tests)
src/test/jtx/impl/rate.cpp
src/test/jtx/impl/regkey.cpp
src/test/jtx/impl/reward.cpp
src/test/jtx/impl/remarks.cpp
src/test/jtx/impl/remit.cpp
src/test/jtx/impl/sendmax.cpp
src/test/jtx/impl/seq.cpp
src/test/jtx/impl/sig.cpp
src/test/jtx/impl/tag.cpp
src/test/jtx/impl/TestHelpers.cpp
src/test/jtx/impl/ticket.cpp
src/test/jtx/impl/token.cpp
src/test/jtx/impl/trust.cpp
@@ -1009,7 +988,6 @@ if (tests)
src/test/rpc/AccountTx_test.cpp
src/test/rpc/AmendmentBlocked_test.cpp
src/test/rpc/Book_test.cpp
src/test/rpc/Catalogue_test.cpp
src/test/rpc/DepositAuthorized_test.cpp
src/test/rpc/DeliveredAmount_test.cpp
src/test/rpc/Feature_test.cpp


@@ -35,17 +35,10 @@ target_link_libraries (opts
$<$<BOOL:${profile}>:-pg>
$<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${profile}>>:-p>)
if (jemalloc)
if (static)
set(JEMALLOC_USE_STATIC ON CACHE BOOL "" FORCE)
endif ()
find_package (jemalloc REQUIRED)
target_compile_definitions (opts INTERFACE PROFILE_JEMALLOC)
target_include_directories (opts SYSTEM INTERFACE ${JEMALLOC_INCLUDE_DIRS})
target_link_libraries (opts INTERFACE ${JEMALLOC_LIBRARIES})
get_filename_component (JEMALLOC_LIB_PATH ${JEMALLOC_LIBRARIES} DIRECTORY)
## TODO see if we can use the BUILD_RPATH target property (is it transitive?)
set (CMAKE_BUILD_RPATH ${CMAKE_BUILD_RPATH} ${JEMALLOC_LIB_PATH})
if(jemalloc)
find_package(jemalloc REQUIRED)
target_compile_definitions(opts INTERFACE PROFILE_JEMALLOC)
target_link_libraries(opts INTERFACE jemalloc::jemalloc)
endif ()
if (san)


@@ -1,33 +0,0 @@
#[===================================================================[
NIH prefix path..this is where we will download
and build any ExternalProjects, and they will hopefully
survive across build directory deletion (manual cleans)
#]===================================================================]
string (REGEX REPLACE "[ \\/%]+" "_" gen_for_path ${CMAKE_GENERATOR})
string (TOLOWER ${gen_for_path} gen_for_path)
# HACK: trying to shorten paths for windows CI (which hits 260 MAXPATH easily)
# @see: https://issues.jenkins-ci.org/browse/JENKINS-38706?focusedCommentId=339847
string (REPLACE "visual_studio" "vs" gen_for_path ${gen_for_path})
if (NOT DEFINED NIH_CACHE_ROOT)
if (DEFINED ENV{NIH_CACHE_ROOT})
set (NIH_CACHE_ROOT $ENV{NIH_CACHE_ROOT})
else ()
set (NIH_CACHE_ROOT "${CMAKE_CURRENT_SOURCE_DIR}/.nih_c")
endif ()
endif ()
set (nih_cache_path
"${NIH_CACHE_ROOT}/${gen_for_path}/${CMAKE_CXX_COMPILER_ID}_${CMAKE_CXX_COMPILER_VERSION}")
if (NOT is_multiconfig)
set (nih_cache_path "${nih_cache_path}/${CMAKE_BUILD_TYPE}")
endif ()
file(TO_CMAKE_PATH "${nih_cache_path}" nih_cache_path)
message (STATUS "NIH-EP cache path: ${nih_cache_path}")
## two convenience variables:
set (ep_lib_prefix ${CMAKE_STATIC_LIBRARY_PREFIX})
set (ep_lib_suffix ${CMAKE_STATIC_LIBRARY_SUFFIX})
# this is a setting for FetchContent and needs to be
# a cache variable
# https://cmake.org/cmake/help/latest/module/FetchContent.html#populating-the-content
set (FETCHCONTENT_BASE_DIR ${nih_cache_path} CACHE STRING "" FORCE)


@@ -10,12 +10,7 @@ if (NOT ep_procs)
message (STATUS "Using ${ep_procs} cores for ExternalProject builds.")
endif ()
endif ()
get_property (is_multiconfig GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
if (is_multiconfig STREQUAL "NOTFOUND")
if (${CMAKE_GENERATOR} STREQUAL "Xcode" OR ${CMAKE_GENERATOR} MATCHES "^Visual Studio")
set (is_multiconfig TRUE)
endif ()
endif ()
get_property(is_multiconfig GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
set (CMAKE_CONFIGURATION_TYPES "Debug;Release" CACHE STRING "" FORCE)
if (NOT is_multiconfig)
@@ -49,9 +44,6 @@ elseif ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "GNU")
message (FATAL_ERROR "This project requires GCC 8 or later")
endif ()
endif ()
if (CMAKE_GENERATOR STREQUAL "Xcode")
set (is_xcode TRUE)
endif ()
if (CMAKE_SYSTEM_NAME STREQUAL "Linux")
set (is_linux TRUE)


@@ -1,52 +0,0 @@
find_package(Boost 1.83 REQUIRED
COMPONENTS
chrono
container
context
coroutine
date_time
filesystem
program_options
regex
system
thread
)
add_library(ripple_boost INTERFACE)
add_library(Ripple::boost ALIAS ripple_boost)
if(XCODE)
target_include_directories(ripple_boost BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
target_compile_options(ripple_boost INTERFACE --system-header-prefix="boost/")
else()
target_include_directories(ripple_boost SYSTEM BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
endif()
target_link_libraries(ripple_boost
INTERFACE
Boost::boost
Boost::chrono
Boost::container
Boost::coroutine
Boost::date_time
Boost::filesystem
Boost::program_options
Boost::regex
Boost::system
Boost::iostreams
Boost::thread)
if(Boost_COMPILER)
target_link_libraries(ripple_boost INTERFACE Boost::disable_autolinking)
endif()
if(san AND is_clang)
# TODO: gcc does not support -fsanitize-blacklist...can we do something else
# for gcc ?
if(NOT Boost_INCLUDE_DIRS AND TARGET Boost::headers)
get_target_property(Boost_INCLUDE_DIRS Boost::headers INTERFACE_INCLUDE_DIRECTORIES)
endif()
message(STATUS "Adding [${Boost_INCLUDE_DIRS}] to sanitizer blacklist")
file(WRITE ${CMAKE_CURRENT_BINARY_DIR}/san_bl.txt "src:${Boost_INCLUDE_DIRS}/*")
target_compile_options(opts
INTERFACE
# ignore boost headers for sanitizing
-fsanitize-blacklist=${CMAKE_CURRENT_BINARY_DIR}/san_bl.txt)
endif()


@@ -1,22 +0,0 @@
find_package(Protobuf 3.8)
file(MAKE_DIRECTORY ${CMAKE_BINARY_DIR}/proto_gen)
set(ccbd ${CMAKE_CURRENT_BINARY_DIR})
set(CMAKE_CURRENT_BINARY_DIR ${CMAKE_BINARY_DIR}/proto_gen)
protobuf_generate_cpp(PROTO_SRCS PROTO_HDRS src/ripple/proto/ripple.proto)
set(CMAKE_CURRENT_BINARY_DIR ${ccbd})
add_library(pbufs STATIC ${PROTO_SRCS} ${PROTO_HDRS})
target_include_directories(pbufs SYSTEM PUBLIC
${CMAKE_BINARY_DIR}/proto_gen
${CMAKE_BINARY_DIR}/proto_gen/src/ripple/proto
)
target_link_libraries(pbufs protobuf::libprotobuf)
target_compile_options(pbufs
PUBLIC
$<$<BOOL:${XCODE}>:
--system-header-prefix="google/protobuf"
-Wno-deprecated-dynamic-exception-spec
>
)
add_library(Ripple::pbufs ALIAS pbufs)


@@ -1,62 +0,0 @@
find_package(gRPC 1.23)
#[=================================[
generate protobuf sources for
grpc defs and bundle into a
static lib
#]=================================]
set(GRPC_GEN_DIR "${CMAKE_BINARY_DIR}/proto_gen_grpc")
file(MAKE_DIRECTORY ${GRPC_GEN_DIR})
set(GRPC_PROTO_SRCS)
set(GRPC_PROTO_HDRS)
set(GRPC_PROTO_ROOT "${CMAKE_CURRENT_SOURCE_DIR}/src/ripple/proto/org")
file(GLOB_RECURSE GRPC_DEFINITION_FILES LIST_DIRECTORIES false "${GRPC_PROTO_ROOT}/*.proto")
foreach(file ${GRPC_DEFINITION_FILES})
get_filename_component(_abs_file ${file} ABSOLUTE)
get_filename_component(_abs_dir ${_abs_file} DIRECTORY)
get_filename_component(_basename ${file} NAME_WE)
get_filename_component(_proto_inc ${GRPC_PROTO_ROOT} DIRECTORY) # updir one level
file(RELATIVE_PATH _rel_root_file ${_proto_inc} ${_abs_file})
get_filename_component(_rel_root_dir ${_rel_root_file} DIRECTORY)
file(RELATIVE_PATH _rel_dir ${CMAKE_CURRENT_SOURCE_DIR} ${_abs_dir})
set(src_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.cc")
set(src_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.cc")
set(hdr_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.h")
set(hdr_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.h")
add_custom_command(
OUTPUT ${src_1} ${src_2} ${hdr_1} ${hdr_2}
COMMAND protobuf::protoc
ARGS --grpc_out=${GRPC_GEN_DIR}
--cpp_out=${GRPC_GEN_DIR}
--plugin=protoc-gen-grpc=$<TARGET_FILE:gRPC::grpc_cpp_plugin>
-I ${_proto_inc} -I ${_rel_dir}
${_abs_file}
DEPENDS ${_abs_file} protobuf::protoc gRPC::grpc_cpp_plugin
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
COMMENT "Running gRPC C++ protocol buffer compiler on ${file}"
VERBATIM)
set_source_files_properties(${src_1} ${src_2} ${hdr_1} ${hdr_2} PROPERTIES GENERATED TRUE)
list(APPEND GRPC_PROTO_SRCS ${src_1} ${src_2})
list(APPEND GRPC_PROTO_HDRS ${hdr_1} ${hdr_2})
endforeach()
add_library(grpc_pbufs STATIC ${GRPC_PROTO_SRCS} ${GRPC_PROTO_HDRS})
#target_include_directories(grpc_pbufs PRIVATE src)
target_include_directories(grpc_pbufs SYSTEM PUBLIC ${GRPC_GEN_DIR})
target_link_libraries(grpc_pbufs
"gRPC::grpc++"
# libgrpc is missing references.
absl::random_random
)
target_compile_options(grpc_pbufs
PRIVATE
$<$<BOOL:${MSVC}>:-wd4065>
$<$<NOT:$<BOOL:${MSVC}>>:-Wno-deprecated-declarations>
PUBLIC
$<$<BOOL:${MSVC}>:-wd4996>
$<$<BOOL:${XCODE}>:
--system-header-prefix="google/protobuf"
-Wno-deprecated-dynamic-exception-spec
>)
add_library(Ripple::grpc_pbufs ALIAS grpc_pbufs)


@@ -1,52 +1,4 @@
#[===================================================================[
NIH dep: boost
#]===================================================================]
if((NOT DEFINED BOOST_ROOT) AND(DEFINED ENV{BOOST_ROOT}))
set(BOOST_ROOT $ENV{BOOST_ROOT})
endif()
if((NOT DEFINED BOOST_LIBRARYDIR) AND(DEFINED ENV{BOOST_LIBRARYDIR}))
set(BOOST_LIBRARYDIR $ENV{BOOST_LIBRARYDIR})
endif()
file(TO_CMAKE_PATH "${BOOST_ROOT}" BOOST_ROOT)
if(WIN32 OR CYGWIN)
# Workaround for MSVC having two boost versions - x86 and x64 on same PC in stage folders
if((NOT DEFINED BOOST_LIBRARYDIR) AND (DEFINED BOOST_ROOT))
if(IS_DIRECTORY ${BOOST_ROOT}/stage64/lib)
set(BOOST_LIBRARYDIR ${BOOST_ROOT}/stage64/lib)
elseif(IS_DIRECTORY ${BOOST_ROOT}/stage/lib)
set(BOOST_LIBRARYDIR ${BOOST_ROOT}/stage/lib)
elseif(IS_DIRECTORY ${BOOST_ROOT}/lib)
set(BOOST_LIBRARYDIR ${BOOST_ROOT}/lib)
else()
message(WARNING "Did not find expected boost library dir. "
"Defaulting to ${BOOST_ROOT}")
set(BOOST_LIBRARYDIR ${BOOST_ROOT})
endif()
endif()
endif()
message(STATUS "BOOST_ROOT: ${BOOST_ROOT}")
message(STATUS "BOOST_LIBRARYDIR: ${BOOST_LIBRARYDIR}")
# uncomment the following as needed to debug FindBoost issues:
#set(Boost_DEBUG ON)
#[=========================================================[
boost dynamic libraries don't trivially support @rpath
linking right now (cmake's default), so just force
static linking for macos, or if requested on linux by flag
#]=========================================================]
if(static)
set(Boost_USE_STATIC_LIBS ON)
endif()
set(Boost_USE_MULTITHREADED ON)
if(static AND NOT APPLE)
set(Boost_USE_STATIC_RUNTIME ON)
else()
set(Boost_USE_STATIC_RUNTIME OFF)
endif()
# TBD:
# Boost_USE_DEBUG_RUNTIME: When ON, uses Boost libraries linked against the
find_package(Boost 1.86 REQUIRED
find_package(Boost 1.70 REQUIRED
COMPONENTS
chrono
container
@@ -57,12 +9,12 @@ find_package(Boost 1.86 REQUIRED
program_options
regex
system
iostreams
thread)
thread
)
add_library(ripple_boost INTERFACE)
add_library(Ripple::boost ALIAS ripple_boost)
if(is_xcode)
if(XCODE)
target_include_directories(ripple_boost BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
target_compile_options(ripple_boost INTERFACE --system-header-prefix="boost/")
else()
@@ -77,7 +29,6 @@ target_link_libraries(ripple_boost
Boost::coroutine
Boost::date_time
Boost::filesystem
Boost::iostreams
Boost::program_options
Boost::regex
Boost::system


@@ -1,28 +0,0 @@
#[===================================================================[
NIH dep: ed25519-donna
#]===================================================================]
add_library (ed25519-donna STATIC
src/ed25519-donna/ed25519.c)
target_include_directories (ed25519-donna
PUBLIC
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
$<INSTALL_INTERFACE:include>
PRIVATE
${CMAKE_CURRENT_SOURCE_DIR}/src/ed25519-donna)
#[=========================================================[
NOTE for macos:
https://github.com/floodyberry/ed25519-donna/issues/29
our source for ed25519-donna-portable.h has been
patched to workaround this.
#]=========================================================]
target_link_libraries (ed25519-donna PUBLIC OpenSSL::SSL)
add_library (NIH::ed25519-donna ALIAS ed25519-donna)
target_link_libraries (ripple_libs INTERFACE NIH::ed25519-donna)
#[===========================[
headers installation
#]===========================]
install (
FILES
src/ed25519-donna/ed25519.h
DESTINATION include/ed25519-donna)

File diff suppressed because it is too large.


@@ -1,47 +0,0 @@
# - Try to find jemalloc
# Once done this will define
# JEMALLOC_FOUND - System has jemalloc
# JEMALLOC_INCLUDE_DIRS - The jemalloc include directories
# JEMALLOC_LIBRARIES - The libraries needed to use jemalloc
if(NOT USE_BUNDLED_JEMALLOC)
find_package(PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_check_modules(PC_JEMALLOC QUIET jemalloc)
endif()
else()
set(PC_JEMALLOC_INCLUDEDIR)
set(PC_JEMALLOC_INCLUDE_DIRS)
set(PC_JEMALLOC_LIBDIR)
set(PC_JEMALLOC_LIBRARY_DIRS)
set(LIMIT_SEARCH NO_DEFAULT_PATH)
endif()
set(JEMALLOC_DEFINITIONS ${PC_JEMALLOC_CFLAGS_OTHER})
find_path(JEMALLOC_INCLUDE_DIR jemalloc/jemalloc.h
PATHS ${PC_JEMALLOC_INCLUDEDIR} ${PC_JEMALLOC_INCLUDE_DIRS}
${LIMIT_SEARCH})
# If we're asked to use static linkage, add libjemalloc.a as a preferred library name.
if(JEMALLOC_USE_STATIC)
list(APPEND JEMALLOC_NAMES
"${CMAKE_STATIC_LIBRARY_PREFIX}jemalloc${CMAKE_STATIC_LIBRARY_SUFFIX}")
endif()
list(APPEND JEMALLOC_NAMES jemalloc)
find_library(JEMALLOC_LIBRARY NAMES ${JEMALLOC_NAMES}
HINTS ${PC_JEMALLOC_LIBDIR} ${PC_JEMALLOC_LIBRARY_DIRS}
${LIMIT_SEARCH})
set(JEMALLOC_LIBRARIES ${JEMALLOC_LIBRARY})
set(JEMALLOC_INCLUDE_DIRS ${JEMALLOC_INCLUDE_DIR})
include(FindPackageHandleStandardArgs)
# handle the QUIETLY and REQUIRED arguments and set JEMALLOC_FOUND to TRUE
# if all listed variables are TRUE
find_package_handle_standard_args(JeMalloc DEFAULT_MSG
JEMALLOC_LIBRARY JEMALLOC_INCLUDE_DIR)
mark_as_advanced(JEMALLOC_INCLUDE_DIR JEMALLOC_LIBRARY)


@@ -1,22 +0,0 @@
find_package (PkgConfig REQUIRED)
pkg_search_module (libarchive_PC QUIET libarchive>=3.4.3)
if(static)
set(LIBARCHIVE_LIB libarchive.a)
else()
set(LIBARCHIVE_LIB archive)
endif()
find_library (archive
NAMES ${LIBARCHIVE_LIB}
HINTS
${libarchive_PC_LIBDIR}
${libarchive_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (LIBARCHIVE_INCLUDE_DIR
NAMES archive.h
HINTS
${libarchive_PC_INCLUDEDIR}
${libarchive_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)


@@ -1,24 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (lz4_PC QUIET liblz4>=1.9)
endif ()
if(static)
set(LZ4_LIB liblz4.a)
else()
set(LZ4_LIB lz4.so)
endif()
find_library (lz4
NAMES ${LZ4_LIB}
HINTS
${lz4_PC_LIBDIR}
${lz4_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (LZ4_INCLUDE_DIR
NAMES lz4.h
HINTS
${lz4_PC_INCLUDEDIR}
${lz4_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)


@@ -1,24 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (secp256k1_PC QUIET libsecp256k1)
endif ()
if(static)
set(SECP256K1_LIB libsecp256k1.a)
else()
set(SECP256K1_LIB secp256k1)
endif()
find_library(secp256k1
NAMES ${SECP256K1_LIB}
HINTS
${secp256k1_PC_LIBDIR}
${secp256k1_PC_LIBRARY_PATHS}
NO_DEFAULT_PATH)
find_path (SECP256K1_INCLUDE_DIR
NAMES secp256k1.h
HINTS
${secp256k1_PC_INCLUDEDIR}
${secp256k1_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)


@@ -1,24 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (snappy_PC QUIET snappy>=1.1.7)
endif ()
if(static)
set(SNAPPY_LIB libsnappy.a)
else()
set(SNAPPY_LIB libsnappy.so)
endif()
find_library (snappy
NAMES ${SNAPPY_LIB}
HINTS
${snappy_PC_LIBDIR}
${snappy_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (SNAPPY_INCLUDE_DIR
NAMES snappy.h
HINTS
${snappy_PC_INCLUDEDIR}
${snappy_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)


@@ -1,19 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
# TBD - currently no soci pkgconfig
#pkg_search_module (soci_PC QUIET libsoci_core>=3.2)
endif ()
if(static)
set(SOCI_LIB libsoci.a)
else()
set(SOCI_LIB libsoci_core.so)
endif()
find_library (soci
NAMES ${SOCI_LIB})
find_path (SOCI_INCLUDE_DIR
NAMES soci/soci.h)
message("SOCI FOUND AT: ${SOCI_LIB}")


@@ -1,24 +0,0 @@
find_package (PkgConfig)
if (PKG_CONFIG_FOUND)
pkg_search_module (sqlite_PC QUIET sqlite3>=3.26.0)
endif ()
if(static)
set(SQLITE_LIB libsqlite3.a)
else()
set(SQLITE_LIB sqlite3.so)
endif()
find_library (sqlite3
NAMES ${SQLITE_LIB}
HINTS
${sqlite_PC_LIBDIR}
${sqlite_PC_LIBRARY_DIRS}
NO_DEFAULT_PATH)
find_path (SQLITE_INCLUDE_DIR
NAMES sqlite3.h
HINTS
${sqlite_PC_INCLUDEDIR}
${sqlite_PC_INCLUDEDIRS}
NO_DEFAULT_PATH)


@@ -1,163 +0,0 @@
#[===================================================================[
NIH dep: libarchive
#]===================================================================]
option (local_libarchive "use local build of libarchive." OFF)
add_library (archive_lib UNKNOWN IMPORTED GLOBAL)
if (NOT local_libarchive)
if (NOT WIN32)
find_package(libarchive_pc REQUIRED)
endif ()
if (archive)
message (STATUS "Found libarchive using pkg-config. Using ${archive}.")
set_target_properties (archive_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${archive}
IMPORTED_LOCATION_RELEASE
${archive}
INTERFACE_INCLUDE_DIRECTORIES
${LIBARCHIVE_INCLUDE_DIR})
# pkg-config can return extra info for static lib linking
# this is probably needed/useful generally, but apply
# to APPLE for now (mostly for homebrew)
if (APPLE AND static AND libarchive_PC_STATIC_LIBRARIES)
message(STATUS "NOTE: libarchive static libs: ${libarchive_PC_STATIC_LIBRARIES}")
# also, APPLE seems to need iconv...maybe linux does too (TBD)
target_link_libraries (archive_lib
INTERFACE iconv ${libarchive_PC_STATIC_LIBRARIES})
endif ()
else ()
## now try searching using the minimal find module that cmake provides
find_package(LibArchive 3.4.3 QUIET)
if (LibArchive_FOUND)
if (static)
# find module doesn't find static libs currently, so we re-search
get_filename_component(_loc ${LibArchive_LIBRARY} DIRECTORY)
find_library(_la_static
NAMES libarchive.a archive_static.lib archive.lib
PATHS ${_loc})
if (_la_static)
set (_la_lib ${_la_static})
else ()
message (WARNING "unable to find libarchive static lib - switching to local build")
set (local_libarchive ON CACHE BOOL "" FORCE)
endif ()
else ()
set (_la_lib ${LibArchive_LIBRARY})
endif ()
if (NOT local_libarchive)
message (STATUS "Found libarchive using module/config. Using ${_la_lib}.")
set_target_properties (archive_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${_la_lib}
IMPORTED_LOCATION_RELEASE
${_la_lib}
INTERFACE_INCLUDE_DIRECTORIES
${LibArchive_INCLUDE_DIRS})
endif ()
else ()
set (local_libarchive ON CACHE BOOL "" FORCE)
endif ()
endif ()
endif()
if (local_libarchive)
set (lib_post "")
if (MSVC)
set (lib_post "_static")
endif ()
ExternalProject_Add (libarchive
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/libarchive/libarchive.git
GIT_TAG v3.4.3
CMAKE_ARGS
# passing the compiler seems to be needed for windows CI, sadly
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DENABLE_LZ4=ON
-ULZ4_*
-DLZ4_INCLUDE_DIR=$<JOIN:$<TARGET_PROPERTY:lz4_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
# because we are building a static lib, this lz4 library doesn't
# actually matter since you can't generally link static libs to other static
# libs. The include files are needed, but the library itself is not (until
# we link our application, at which point we use the lz4 we built above).
# nonetheless, we need to provide a library to libarchive else it will
# NOT include lz4 support when configuring
-DLZ4_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_RELEASE>>
-DENABLE_WERROR=OFF
-DENABLE_TAR=OFF
-DENABLE_TAR_SHARED=OFF
-DENABLE_INSTALL=ON
-DENABLE_NETTLE=OFF
-DENABLE_OPENSSL=OFF
-DENABLE_LZO=OFF
-DENABLE_LZMA=OFF
-DENABLE_ZLIB=OFF
-DENABLE_BZip2=OFF
-DENABLE_LIBXML2=OFF
-DENABLE_EXPAT=OFF
-DENABLE_PCREPOSIX=OFF
-DENABLE_LibGCC=OFF
-DENABLE_CNG=OFF
-DENABLE_CPIO=OFF
-DENABLE_CPIO_SHARED=OFF
-DENABLE_CAT=OFF
-DENABLE_CAT_SHARED=OFF
-DENABLE_XATTR=OFF
-DENABLE_ACL=OFF
-DENABLE_ICONV=OFF
-DENABLE_TEST=OFF
-DENABLE_COVERAGE=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LIST_SEPARATOR ::
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--target archive_static
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/libarchive/$<CONFIG>/${ep_lib_prefix}archive${lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/libarchive
>
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS lz4_lib
BUILD_BYPRODUCTS
<BINARY_DIR>/libarchive/${ep_lib_prefix}archive${lib_post}${ep_lib_suffix}
<BINARY_DIR>/libarchive/${ep_lib_prefix}archive${lib_post}_d${ep_lib_suffix}
)
ExternalProject_Get_Property (libarchive BINARY_DIR)
ExternalProject_Get_Property (libarchive SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (libarchive)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/libarchive)
set_target_properties (archive_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/libarchive/${ep_lib_prefix}archive${lib_post}_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/libarchive/${ep_lib_prefix}archive${lib_post}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/libarchive
INTERFACE_COMPILE_DEFINITIONS
LIBARCHIVE_STATIC)
endif()
add_dependencies (archive_lib libarchive)
target_link_libraries (archive_lib INTERFACE lz4_lib)
target_link_libraries (ripple_libs INTERFACE archive_lib)
exclude_if_included (libarchive)
exclude_if_included (archive_lib)


@@ -1,79 +0,0 @@
#[===================================================================[
NIH dep: lz4
#]===================================================================]
add_library (lz4_lib STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(lz4)
endif()
if(lz4)
set_target_properties (lz4_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${lz4}
IMPORTED_LOCATION_RELEASE
${lz4}
INTERFACE_INCLUDE_DIRECTORIES
${LZ4_INCLUDE_DIR})
else()
ExternalProject_Add (lz4
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/lz4/lz4.git
GIT_TAG v1.9.2
SOURCE_SUBDIR contrib/cmake_unofficial
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DBUILD_STATIC_LIBS=ON
-DBUILD_SHARED_LIBS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--target lz4_static
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}lz4$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
TEST_COMMAND ""
INSTALL_COMMAND ""
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}lz4${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}lz4_d${ep_lib_suffix}
)
ExternalProject_Get_Property (lz4 BINARY_DIR)
ExternalProject_Get_Property (lz4 SOURCE_DIR)
file (MAKE_DIRECTORY ${SOURCE_DIR}/lz4)
set_target_properties (lz4_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}lz4_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}lz4${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/lib)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (lz4)
endif ()
add_dependencies (lz4_lib lz4)
target_link_libraries (ripple_libs INTERFACE lz4_lib)
exclude_if_included (lz4)
endif()
exclude_if_included (lz4_lib)


@@ -1,31 +0,0 @@
#[===================================================================[
NIH dep: nudb
NuDB is header-only, thus is an INTERFACE lib in CMake.
TODO: move the library definition into NuDB repo and add
proper targets and export/install
#]===================================================================]
if (is_root_project) # NuDB not needed in the case of xrpl_core inclusion build
add_library (nudb INTERFACE)
FetchContent_Declare(
nudb_src
GIT_REPOSITORY https://github.com/CPPAlliance/NuDB.git
GIT_TAG 2.0.5
)
FetchContent_GetProperties(nudb_src)
if(NOT nudb_src_POPULATED)
message (STATUS "Pausing to download NuDB...")
FetchContent_Populate(nudb_src)
endif()
file(TO_CMAKE_PATH "${nudb_src_SOURCE_DIR}" nudb_src_SOURCE_DIR)
# specify as system includes so as to avoid warnings
target_include_directories (nudb SYSTEM INTERFACE ${nudb_src_SOURCE_DIR}/include)
target_link_libraries (nudb
INTERFACE
Boost::thread
Boost::system)
add_library (NIH::nudb ALIAS nudb)
target_link_libraries (ripple_libs INTERFACE NIH::nudb)
endif ()


@@ -1,48 +0,0 @@
#[===================================================================[
NIH dep: openssl
#]===================================================================]
#[===============================================[
OPENSSL_ROOT_DIR is the only variable that
FindOpenSSL honors for locating, so convert any
OPENSSL_ROOT vars to this
#]===============================================]
if (NOT DEFINED OPENSSL_ROOT_DIR)
if (DEFINED ENV{OPENSSL_ROOT})
set (OPENSSL_ROOT_DIR $ENV{OPENSSL_ROOT})
elseif (HOMEBREW)
execute_process (COMMAND ${HOMEBREW} --prefix openssl
OUTPUT_VARIABLE OPENSSL_ROOT_DIR
OUTPUT_STRIP_TRAILING_WHITESPACE)
endif ()
file (TO_CMAKE_PATH "${OPENSSL_ROOT_DIR}" OPENSSL_ROOT_DIR)
endif ()
if (static)
set (OPENSSL_USE_STATIC_LIBS ON)
endif ()
set (OPENSSL_MSVC_STATIC_RT ON)
find_package (OpenSSL 1.1.1 REQUIRED)
target_link_libraries (ripple_libs
INTERFACE
OpenSSL::SSL
OpenSSL::Crypto)
# disable SSLv2...this can also be done when building/configuring OpenSSL
set_target_properties(OpenSSL::SSL PROPERTIES
INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2)
#[=========================================================[
https://gitlab.kitware.com/cmake/cmake/issues/16885
depending on how openssl is built, it might depend
on zlib. In fact, the openssl find package should
figure this out for us, but it does not currently...
so let's add zlib ourselves to the lib list
TODO: investigate linking to static zlib for static
build option
#]=========================================================]
find_package (ZLIB)
set (has_zlib FALSE)
if (TARGET ZLIB::ZLIB)
set_target_properties(OpenSSL::Crypto PROPERTIES
INTERFACE_LINK_LIBRARIES ZLIB::ZLIB)
set (has_zlib TRUE)
endif ()


@@ -1,70 +0,0 @@
if(reporting)
find_package(PostgreSQL)
if(NOT PostgreSQL_FOUND)
message("find_package did not find postgres")
find_library(postgres NAMES pq libpq libpq-dev pq-dev postgresql-devel)
find_path(libpq-fe NAMES libpq-fe.h PATH_SUFFIXES postgresql pgsql include)
if(NOT libpq-fe_FOUND OR NOT postgres_FOUND)
message("No system installed Postgres found. Will build")
add_library(postgres SHARED IMPORTED GLOBAL)
add_library(pgport SHARED IMPORTED GLOBAL)
add_library(pgcommon SHARED IMPORTED GLOBAL)
ExternalProject_Add(postgres_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/postgres/postgres.git
GIT_TAG REL_14_5
CONFIGURE_COMMAND ./configure --without-readline > /dev/null
BUILD_COMMAND ${CMAKE_COMMAND} -E env --unset=MAKELEVEL make
UPDATE_COMMAND ""
BUILD_IN_SOURCE 1
INSTALL_COMMAND ""
BUILD_BYPRODUCTS
<BINARY_DIR>/src/interfaces/libpq/${ep_lib_prefix}pq.a
<BINARY_DIR>/src/common/${ep_lib_prefix}pgcommon.a
<BINARY_DIR>/src/port/${ep_lib_prefix}pgport.a
LOG_BUILD TRUE
)
ExternalProject_Get_Property (postgres_src SOURCE_DIR)
ExternalProject_Get_Property (postgres_src BINARY_DIR)
set (postgres_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${postgres_src_SOURCE_DIR})
list(APPEND INCLUDE_DIRS
${SOURCE_DIR}/src/include
${SOURCE_DIR}/src/interfaces/libpq
)
set_target_properties(postgres PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/src/interfaces/libpq/${ep_lib_prefix}pq.a
INTERFACE_INCLUDE_DIRECTORIES
"${INCLUDE_DIRS}"
)
set_target_properties(pgcommon PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/src/common/${ep_lib_prefix}pgcommon.a
INTERFACE_INCLUDE_DIRECTORIES
"${INCLUDE_DIRS}"
)
set_target_properties(pgport PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/src/port/${ep_lib_prefix}pgport.a
INTERFACE_INCLUDE_DIRECTORIES
"${INCLUDE_DIRS}"
)
add_dependencies(postgres postgres_src)
add_dependencies(pgcommon postgres_src)
add_dependencies(pgport postgres_src)
file(TO_CMAKE_PATH "${postgres_src_SOURCE_DIR}" postgres_src_SOURCE_DIR)
target_link_libraries(ripple_libs INTERFACE postgres pgcommon pgport)
else()
message("Found system installed Postgres via find_libary")
target_include_directories(ripple_libs INTERFACE ${libpq-fe})
target_link_libraries(ripple_libs INTERFACE ${postgres})
endif()
else()
message("Found system installed Postgres via find_package")
target_include_directories(ripple_libs INTERFACE ${PostgreSQL_INCLUDE_DIRS})
target_link_libraries(ripple_libs INTERFACE ${PostgreSQL_LIBRARIES})
endif()
endif()


@@ -1,155 +1,22 @@
#[===================================================================[
import protobuf (lib and compiler) and create a lib
from our proto message definitions. If the system protobuf
is not found, fallback on EP to download and build a version
from official source.
#]===================================================================]
find_package(Protobuf 3.8)
if (static)
set (Protobuf_USE_STATIC_LIBS ON)
endif ()
find_package (Protobuf 3.8)
if (is_multiconfig)
set(protobuf_protoc_lib ${Protobuf_PROTOC_LIBRARIES})
else ()
string(TOUPPER ${CMAKE_BUILD_TYPE} upper_cmake_build_type)
set(protobuf_protoc_lib ${Protobuf_PROTOC_LIBRARY_${upper_cmake_build_type}})
endif ()
if (local_protobuf OR NOT (Protobuf_FOUND AND Protobuf_PROTOC_EXECUTABLE AND protobuf_protoc_lib))
include (GNUInstallDirs)
message (STATUS "using local protobuf build.")
set(protobuf_reqs Protobuf_PROTOC_EXECUTABLE protobuf_protoc_lib)
foreach(lib ${protobuf_reqs})
if(NOT ${lib})
message(STATUS "Couldn't find ${lib}")
endif()
endforeach()
if (WIN32)
# protobuf prepends lib even on windows
set (pbuf_lib_pre "lib")
else ()
set (pbuf_lib_pre ${ep_lib_prefix})
endif ()
# for the external project build of protobuf, we currently ignore the
# static option and always build static libs here. This is consistent
# with our other EP builds. Dynamic libs in an EP would add complexity
# because we'd need to get them into the runtime path, and probably
# install them.
ExternalProject_Add (protobuf_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/protocolbuffers/protobuf.git
GIT_TAG v3.8.0
SOURCE_SUBDIR cmake
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
-DCMAKE_INSTALL_PREFIX=<BINARY_DIR>/_installed_
-Dprotobuf_BUILD_TESTS=OFF
-Dprotobuf_BUILD_EXAMPLES=OFF
-Dprotobuf_BUILD_PROTOC_BINARIES=ON
-Dprotobuf_MSVC_STATIC_RUNTIME=ON
-DBUILD_SHARED_LIBS=OFF
-Dprotobuf_BUILD_SHARED_LIBS=OFF
-DCMAKE_DEBUG_POSTFIX=_d
-Dprotobuf_DEBUG_POSTFIX=_d
-Dprotobuf_WITH_ZLIB=$<IF:$<BOOL:${has_zlib}>,ON,OFF>
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON}>
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E env --unset=DESTDIR ${CMAKE_COMMAND} --build . --config $<CONFIG> --target install
BUILD_BYPRODUCTS
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf${ep_lib_suffix}
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf_d${ep_lib_suffix}
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc${ep_lib_suffix}
<BINARY_DIR>/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc_d${ep_lib_suffix}
<BINARY_DIR>/_installed_/bin/protoc${CMAKE_EXECUTABLE_SUFFIX}
)
ExternalProject_Get_Property (protobuf_src BINARY_DIR)
ExternalProject_Get_Property (protobuf_src SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (protobuf_src)
endif ()
exclude_if_included (protobuf_src)
file(MAKE_DIRECTORY ${CMAKE_BINARY_DIR}/proto_gen)
set(ccbd ${CMAKE_CURRENT_BINARY_DIR})
set(CMAKE_CURRENT_BINARY_DIR ${CMAKE_BINARY_DIR}/proto_gen)
protobuf_generate_cpp(PROTO_SRCS PROTO_HDRS src/ripple/proto/ripple.proto)
set(CMAKE_CURRENT_BINARY_DIR ${ccbd})
if (NOT TARGET protobuf::libprotobuf)
add_library (protobuf::libprotobuf STATIC IMPORTED GLOBAL)
endif ()
file (MAKE_DIRECTORY ${BINARY_DIR}/_installed_/include)
set_target_properties (protobuf::libprotobuf PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protobuf${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (protobuf::libprotobuf protobuf_src)
exclude_if_included (protobuf::libprotobuf)
if (NOT TARGET protobuf::libprotoc)
add_library (protobuf::libprotoc STATIC IMPORTED GLOBAL)
endif ()
set_target_properties (protobuf::libprotoc PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/${CMAKE_INSTALL_LIBDIR}/${pbuf_lib_pre}protoc${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (protobuf::libprotoc protobuf_src)
exclude_if_included (protobuf::libprotoc)
if (NOT TARGET protobuf::protoc)
add_executable (protobuf::protoc IMPORTED)
exclude_if_included (protobuf::protoc)
endif ()
set_target_properties (protobuf::protoc PROPERTIES
IMPORTED_LOCATION "${BINARY_DIR}/_installed_/bin/protoc${CMAKE_EXECUTABLE_SUFFIX}")
add_dependencies (protobuf::protoc protobuf_src)
else ()
if (NOT TARGET protobuf::protoc)
if (EXISTS "${Protobuf_PROTOC_EXECUTABLE}")
add_executable (protobuf::protoc IMPORTED)
set_target_properties (protobuf::protoc PROPERTIES
IMPORTED_LOCATION "${Protobuf_PROTOC_EXECUTABLE}")
else ()
message (FATAL_ERROR "Protobuf import failed")
endif ()
endif ()
endif ()
file (MAKE_DIRECTORY ${CMAKE_BINARY_DIR}/proto_gen)
set (save_CBD ${CMAKE_CURRENT_BINARY_DIR})
set (CMAKE_CURRENT_BINARY_DIR ${CMAKE_BINARY_DIR}/proto_gen)
protobuf_generate_cpp (
PROTO_SRCS
PROTO_HDRS
src/ripple/proto/ripple.proto)
set (CMAKE_CURRENT_BINARY_DIR ${save_CBD})
add_library (pbufs STATIC ${PROTO_SRCS} ${PROTO_HDRS})
target_include_directories (pbufs PRIVATE src)
target_include_directories (pbufs
SYSTEM PUBLIC ${CMAKE_BINARY_DIR}/proto_gen)
target_link_libraries (pbufs protobuf::libprotobuf)
target_compile_options (pbufs
PUBLIC
$<$<BOOL:${is_xcode}>:
--system-header-prefix="google/protobuf"
-Wno-deprecated-dynamic-exception-spec
>)
add_library (Ripple::pbufs ALIAS pbufs)
target_link_libraries (ripple_libs INTERFACE Ripple::pbufs)
exclude_if_included (pbufs)
add_library(pbufs STATIC ${PROTO_SRCS} ${PROTO_HDRS})
target_include_directories(pbufs SYSTEM PUBLIC
${CMAKE_BINARY_DIR}/proto_gen
${CMAKE_BINARY_DIR}/proto_gen/src/ripple/proto
)
target_link_libraries(pbufs protobuf::libprotobuf)
target_compile_options(pbufs
PUBLIC
$<$<BOOL:${XCODE}>:
--system-header-prefix="google/protobuf"
-Wno-deprecated-dynamic-exception-spec
>
)
add_library(Ripple::pbufs ALIAS pbufs)
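The CMAKE_CURRENT_BINARY_DIR save/restore earlier in this file (present in both the old and new versions) exists because protobuf_generate_cpp writes its outputs relative to the current binary directory; swapping it temporarily steers the generated sources into proto_gen. A standalone sketch of the pattern, using the proto path from this file:

file(MAKE_DIRECTORY ${CMAKE_BINARY_DIR}/proto_gen)
set(saved_dir ${CMAKE_CURRENT_BINARY_DIR})
set(CMAKE_CURRENT_BINARY_DIR ${CMAKE_BINARY_DIR}/proto_gen)
protobuf_generate_cpp(PROTO_SRCS PROTO_HDRS src/ripple/proto/ripple.proto)
set(CMAKE_CURRENT_BINARY_DIR ${saved_dir})  # restore so later code is unaffected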

View File

@@ -1,177 +0,0 @@
#[===================================================================[
NIH dep: rocksdb
#]===================================================================]
add_library (rocksdb_lib UNKNOWN IMPORTED GLOBAL)
set_target_properties (rocksdb_lib
PROPERTIES INTERFACE_COMPILE_DEFINITIONS RIPPLE_ROCKSDB_AVAILABLE=1)
option (local_rocksdb "use local build of rocksdb." OFF)
if (NOT local_rocksdb)
find_package (RocksDB 6.27 QUIET CONFIG)
if (TARGET RocksDB::rocksdb)
message (STATUS "Found RocksDB using config.")
get_target_property (_rockslib_l RocksDB::rocksdb IMPORTED_LOCATION_DEBUG)
if (_rockslib_l)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_DEBUG ${_rockslib_l})
endif ()
get_target_property (_rockslib_l RocksDB::rocksdb IMPORTED_LOCATION_RELEASE)
if (_rockslib_l)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_RELEASE ${_rockslib_l})
endif ()
get_target_property (_rockslib_l RocksDB::rocksdb IMPORTED_LOCATION)
if (_rockslib_l)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION ${_rockslib_l})
endif ()
get_target_property (_rockslib_i RocksDB::rocksdb INTERFACE_INCLUDE_DIRECTORIES)
if (_rockslib_i)
set_target_properties (rocksdb_lib PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${_rockslib_i})
endif ()
target_link_libraries (ripple_libs INTERFACE RocksDB::rocksdb)
else ()
# using a find module with rocksdb is difficult because
# you have no idea how it was configured (transitive dependencies).
# the code below will generally find rocksdb using the module, but
# will then result in linker errors for static linkage since the
# transitive dependencies are unknown. force local build here for now, but leave the code as
# a placeholder for future investigation.
if (static)
set (local_rocksdb ON CACHE BOOL "" FORCE)
# TBD if there is some way to extract transitive deps..then:
#set (RocksDB_USE_STATIC ON)
else ()
find_package (RocksDB 6.27 MODULE)
if (ROCKSDB_FOUND)
if (RocksDB_LIBRARY_DEBUG)
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_DEBUG ${RocksDB_LIBRARY_DEBUG})
endif ()
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION_RELEASE ${RocksDB_LIBRARIES})
set_target_properties (rocksdb_lib PROPERTIES IMPORTED_LOCATION ${RocksDB_LIBRARIES})
set_target_properties (rocksdb_lib PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${RocksDB_INCLUDE_DIRS})
else ()
set (local_rocksdb ON CACHE BOOL "" FORCE)
endif ()
endif ()
endif ()
endif ()
if (local_rocksdb)
message (STATUS "Using local build of RocksDB.")
ExternalProject_Add (rocksdb
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/facebook/rocksdb.git
GIT_TAG v6.27.3
PATCH_COMMAND
# only used by windows build
${CMAKE_COMMAND} -E copy_if_different
${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/rocks_thirdparty.inc
<SOURCE_DIR>/thirdparty.inc
COMMAND
# fixup their build version file to keep the values
# from changing always
${CMAKE_COMMAND} -E copy_if_different
${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/rocksdb_build_version.cc.in
<SOURCE_DIR>/util/build_version.cc.in
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DBUILD_SHARED_LIBS=OFF
-DCMAKE_POSITION_INDEPENDENT_CODE=ON
-DWITH_JEMALLOC=$<IF:$<BOOL:${jemalloc}>,ON,OFF>
-DWITH_SNAPPY=ON
-DWITH_LZ4=ON
-DWITH_ZLIB=OFF
-DUSE_RTTI=ON
-DWITH_ZSTD=OFF
-DWITH_GFLAGS=OFF
-DWITH_BZ2=OFF
-ULZ4_*
-Ulz4_*
-Dlz4_INCLUDE_DIRS=$<JOIN:$<TARGET_PROPERTY:lz4_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
-Dlz4_LIBRARIES=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:lz4_lib,IMPORTED_LOCATION_RELEASE>>
-Dlz4_FOUND=ON
-USNAPPY_*
-Usnappy_*
-USnappy_*
-Dsnappy_INCLUDE_DIRS=$<JOIN:$<TARGET_PROPERTY:snappy_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
-Dsnappy_LIBRARIES=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_RELEASE>>
-Dsnappy_FOUND=ON
-DSnappy_INCLUDE_DIRS=$<JOIN:$<TARGET_PROPERTY:snappy_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
-DSnappy_LIBRARIES=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:snappy_lib,IMPORTED_LOCATION_RELEASE>>
-DSnappy_FOUND=ON
-DWITH_MD_LIBRARY=OFF
-DWITH_RUNTIME_DEBUG=$<IF:$<CONFIG:Debug>,ON,OFF>
-DFAIL_ON_WARNINGS=OFF
-DWITH_ASAN=OFF
-DWITH_TSAN=OFF
-DWITH_UBSAN=OFF
-DWITH_NUMA=OFF
-DWITH_TBB=OFF
-DWITH_WINDOWS_UTF8_FILENAMES=OFF
-DWITH_XPRESS=OFF
-DPORTABLE=ON
-DFORCE_SSE42=OFF
-DDISABLE_STALL_NOTIF=OFF
-DOPTDBG=ON
-DROCKSDB_LITE=OFF
-DWITH_FALLOCATE=ON
-DWITH_LIBRADOS=OFF
-DWITH_JNI=OFF
-DROCKSDB_INSTALL_ON_WINDOWS=OFF
-DWITH_TESTS=OFF
-DWITH_TOOLS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -MP /DNDEBUG"
>
$<$<NOT:$<BOOL:${MSVC}>>:
"-DCMAKE_CXX_FLAGS=-DNDEBUG"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}rocksdb$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
LIST_SEPARATOR ::
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS snappy_lib lz4_lib
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}rocksdb${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}rocksdb_d${ep_lib_suffix}
)
ExternalProject_Get_Property (rocksdb BINARY_DIR)
ExternalProject_Get_Property (rocksdb SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (rocksdb)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/include)
set_target_properties (rocksdb_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}rocksdb_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}rocksdb${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies (rocksdb_lib rocksdb)
exclude_if_included (rocksdb)
endif ()
target_link_libraries (rocksdb_lib
INTERFACE
snappy_lib
lz4_lib
$<$<BOOL:${MSVC}>:rpcrt4>)
exclude_if_included (rocksdb_lib)
target_link_libraries (ripple_libs INTERFACE rocksdb_lib)
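One technique worth calling out from the removed file: CMake lists are semicolon-separated, so passing one through CMAKE_ARGS would split it into multiple arguments. The file JOINs each list with a sentinel and declares that sentinel via LIST_SEPARATOR so ExternalProject re-expands it inside the sub-build. A minimal sketch with a hypothetical dependency (the project name, URL, and variable names here are placeholders):

ExternalProject_Add(dep_src                         # hypothetical project
  GIT_REPOSITORY https://example.invalid/dep.git    # placeholder URL
  CMAKE_ARGS
    # JOIN turns "a;b;c" into "a::b::c" so it survives as one argument
    -Ddep_INCLUDE_DIRS=$<JOIN:$<TARGET_PROPERTY:some_lib,INTERFACE_INCLUDE_DIRECTORIES>,::>
  LIST_SEPARATOR ::   # ExternalProject converts "::" back into list separators
)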

View File

@@ -1,58 +0,0 @@
#[===================================================================[
NIH dep: secp256k1
#]===================================================================]
add_library (secp256k1_lib STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(secp256k1)
endif()
if(secp256k1)
set_target_properties (secp256k1_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${secp256k1}
IMPORTED_LOCATION_RELEASE
${secp256k1}
INTERFACE_INCLUDE_DIRECTORIES
${SECP256K1_INCLUDE_DIR})
add_library (secp256k1 ALIAS secp256k1_lib)
add_library (NIH::secp256k1 ALIAS secp256k1_lib)
else()
set(INSTALL_SECP256K1 true)
add_library (secp256k1 STATIC
src/secp256k1/src/secp256k1.c)
target_compile_definitions (secp256k1
PRIVATE
USE_NUM_NONE
USE_FIELD_10X26
USE_FIELD_INV_BUILTIN
USE_SCALAR_8X32
USE_SCALAR_INV_BUILTIN)
target_include_directories (secp256k1
PUBLIC
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
$<INSTALL_INTERFACE:include>
PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src/secp256k1)
target_compile_options (secp256k1
PRIVATE
$<$<BOOL:${MSVC}>:-wd4319>
$<$<NOT:$<BOOL:${MSVC}>>:
-Wno-deprecated-declarations
-Wno-unused-function
>
$<$<BOOL:${is_gcc}>:-Wno-nonnull-compare>)
target_link_libraries (ripple_libs INTERFACE NIH::secp256k1)
#[===========================[
headers installation
#]===========================]
install (
FILES
src/secp256k1/include/secp256k1.h
DESTINATION include/secp256k1/include)
add_library (NIH::secp256k1 ALIAS secp256k1)
endif()
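In the Conan-era CMakeLists.txt later in this changeset, neither branch above survives: the vendored sources are always built in-tree and linked by their namespaced alias. A sketch of the replacement, using the target names that appear there:

add_subdirectory(src/secp256k1)
target_link_libraries(ripple_libs INTERFACE secp256k1::secp256k1)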

View File

@@ -1,77 +0,0 @@
#[===================================================================[
NIH dep: snappy
#]===================================================================]
add_library (snappy_lib STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(snappy)
endif()
if(snappy)
set_target_properties (snappy_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${snappy}
IMPORTED_LOCATION_RELEASE
${snappy}
INTERFACE_INCLUDE_DIRECTORIES
${SNAPPY_INCLUDE_DIR})
else()
ExternalProject_Add (snappy
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/google/snappy.git
GIT_TAG 1.1.7
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DBUILD_SHARED_LIBS=OFF
-DCMAKE_POSITION_INDEPENDENT_CODE=ON
-DSNAPPY_BUILD_TESTS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
"-DCMAKE_CXX_FLAGS_DEBUG=-MTd"
"-DCMAKE_CXX_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}snappy$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E copy_if_different <BINARY_DIR>/config.h <BINARY_DIR>/snappy-stubs-public.h <SOURCE_DIR>
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}snappy${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}snappy_d${ep_lib_suffix}
)
ExternalProject_Get_Property (snappy BINARY_DIR)
ExternalProject_Get_Property (snappy SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (snappy)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/snappy)
set_target_properties (snappy_lib PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}snappy_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}snappy${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR})
endif()
add_dependencies (snappy_lib snappy)
target_link_libraries (ripple_libs INTERFACE snappy_lib)
exclude_if_included (snappy)
exclude_if_included (snappy_lib)

View File

@@ -1,165 +0,0 @@
#[===================================================================[
NIH dep: soci
#]===================================================================]
foreach (_comp core empty sqlite3)
add_library ("soci_${_comp}" STATIC IMPORTED GLOBAL)
endforeach ()
if (NOT WIN32)
find_package(soci)
endif()
if (soci)
foreach (_comp core empty sqlite3)
set_target_properties ("soci_${_comp}" PROPERTIES
IMPORTED_LOCATION_DEBUG
${soci}
IMPORTED_LOCATION_RELEASE
${soci}
INTERFACE_INCLUDE_DIRECTORIES
${SOCI_INCLUDE_DIR})
endforeach ()
else()
set (soci_lib_pre ${ep_lib_prefix})
set (soci_lib_post "")
if (WIN32)
# for some reason soci on windows still prepends lib (non-standard)
set (soci_lib_pre lib)
# this version in the name might change if/when we change versions of soci
set (soci_lib_post "_4_0")
endif ()
get_target_property (_boost_incs Boost::date_time INTERFACE_INCLUDE_DIRECTORIES)
get_target_property (_boost_dt Boost::date_time IMPORTED_LOCATION)
if (NOT _boost_dt)
get_target_property (_boost_dt Boost::date_time IMPORTED_LOCATION_RELEASE)
endif ()
if (NOT _boost_dt)
get_target_property (_boost_dt Boost::date_time IMPORTED_LOCATION_DEBUG)
endif ()
ExternalProject_Add (soci
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/SOCI/soci.git
GIT_TAG 04e1870294918d20761736743bb6136314c42dd5
# We had an issue with soci integer range checking for boost::optional
# and needed to remove the exception that SOCI throws in this case.
# This is *probably* a bug in SOCI, but has never been investigated more
# nor reported to the maintainers.
# This cmake script comments out the lines in question.
# This patch process is likely fragile and should be reviewed carefully
# whenever we update the GIT_TAG above.
PATCH_COMMAND
${CMAKE_COMMAND} -D RIPPLED_SOURCE=${CMAKE_CURRENT_SOURCE_DIR}
-P ${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/soci_patch.cmake
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${CMAKE_TOOLCHAIN_FILE}>:-DCMAKE_TOOLCHAIN_FILE=${CMAKE_TOOLCHAIN_FILE}>
$<$<BOOL:${VCPKG_TARGET_TRIPLET}>:-DVCPKG_TARGET_TRIPLET=${VCPKG_TARGET_TRIPLET}>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
-DCMAKE_PREFIX_PATH=${CMAKE_BINARY_DIR}/sqlite3
-DCMAKE_MODULE_PATH=${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake
-DCMAKE_INCLUDE_PATH=$<JOIN:$<TARGET_PROPERTY:sqlite,INTERFACE_INCLUDE_DIRECTORIES>,::>
-DCMAKE_LIBRARY_PATH=${sqlite_BINARY_DIR}
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DSOCI_CXX_C11=ON
-DSOCI_STATIC=ON
-DSOCI_LIBDIR=lib
-DSOCI_SHARED=OFF
-DSOCI_TESTS=OFF
# hacks to workaround the fact that soci doesn't currently use
# boost imported targets in its cmake. If they switch to
# proper imported targets, this next line can be removed
# (as well as the get_property above that sets _boost_incs)
-DBoost_INCLUDE_DIRS=$<JOIN:${_boost_incs},::>
-DBoost_INCLUDE_DIR=$<JOIN:${_boost_incs},::>
-DBOOST_ROOT=${BOOST_ROOT}
-DWITH_BOOST=ON
-DBoost_FOUND=ON
-DBoost_NO_BOOST_CMAKE=ON
-DBoost_DATE_TIME_FOUND=ON
-DSOCI_HAVE_BOOST=ON
-DSOCI_HAVE_BOOST_DATE_TIME=ON
-DBoost_DATE_TIME_LIBRARY=${_boost_dt}
-DSOCI_DB2=OFF
-DSOCI_FIREBIRD=OFF
-DSOCI_MYSQL=OFF
-DSOCI_ODBC=OFF
-DSOCI_ORACLE=OFF
-DSOCI_POSTGRESQL=OFF
-DSOCI_SQLITE3=ON
-DSQLITE3_INCLUDE_DIR=$<JOIN:$<TARGET_PROPERTY:sqlite,INTERFACE_INCLUDE_DIRECTORIES>,::>
-DSQLITE3_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:sqlite,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:sqlite,IMPORTED_LOCATION_RELEASE>>
$<$<BOOL:${APPLE}>:-DCMAKE_FIND_FRAMEWORK=LAST>
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
"-DCMAKE_CXX_FLAGS_DEBUG=-MTd"
"-DCMAKE_CXX_FLAGS_RELEASE=-MT"
>
$<$<NOT:$<BOOL:${MSVC}>>:
"-DCMAKE_CXX_FLAGS=-Wno-deprecated-declarations"
>
# SEE: https://github.com/SOCI/soci/issues/640
$<$<AND:$<BOOL:${is_gcc}>,$<VERSION_GREATER_EQUAL:${CMAKE_CXX_COMPILER_VERSION},8>>:
"-DCMAKE_CXX_FLAGS=-Wno-deprecated-declarations -Wno-error=format-overflow -Wno-format-overflow -Wno-error=format-truncation"
>
LIST_SEPARATOR ::
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/lib/$<CONFIG>/${soci_lib_pre}soci_core${soci_lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/lib/$<CONFIG>/${soci_lib_pre}soci_empty${soci_lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/lib/$<CONFIG>/${soci_lib_pre}soci_sqlite3${soci_lib_post}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/lib
>
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS sqlite
BUILD_BYPRODUCTS
<BINARY_DIR>/lib/${soci_lib_pre}soci_core${soci_lib_post}${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_core${soci_lib_post}_d${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_empty${soci_lib_post}${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_empty${soci_lib_post}_d${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_sqlite3${soci_lib_post}${ep_lib_suffix}
<BINARY_DIR>/lib/${soci_lib_pre}soci_sqlite3${soci_lib_post}_d${ep_lib_suffix}
)
ExternalProject_Get_Property (soci BINARY_DIR)
ExternalProject_Get_Property (soci SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (soci)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/include)
file (MAKE_DIRECTORY ${BINARY_DIR}/include)
foreach (_comp core empty sqlite3)
set_target_properties ("soci_${_comp}" PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/lib/${soci_lib_pre}soci_${_comp}${soci_lib_post}_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/lib/${soci_lib_pre}soci_${_comp}${soci_lib_post}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
"${SOURCE_DIR}/include;${BINARY_DIR}/include")
add_dependencies ("soci_${_comp}" soci) # something has to depend on the ExternalProject to trigger it
target_link_libraries (ripple_libs INTERFACE "soci_${_comp}")
if (NOT _comp STREQUAL "core")
target_link_libraries ("soci_${_comp}" INTERFACE soci_core)
endif ()
endforeach ()
endif()
foreach (_comp core empty sqlite3)
exclude_if_included ("soci_${_comp}")
endforeach ()
exclude_if_included (soci)
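The Boost::date_time probing above illustrates a general pattern: an imported target may populate only a per-configuration location property, so each candidate is tried in turn until one is set. A minimal sketch of the chain:

get_target_property(_boost_dt Boost::date_time IMPORTED_LOCATION)
if(NOT _boost_dt)
  get_target_property(_boost_dt Boost::date_time IMPORTED_LOCATION_RELEASE)
endif()
if(NOT _boost_dt)
  get_target_property(_boost_dt Boost::date_time IMPORTED_LOCATION_DEBUG)
endif()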

View File

@@ -1,93 +0,0 @@
#[===================================================================[
NIH dep: sqlite
#]===================================================================]
add_library (sqlite STATIC IMPORTED GLOBAL)
if (NOT WIN32)
find_package(sqlite)
endif()
if(sqlite3)
set_target_properties (sqlite PROPERTIES
IMPORTED_LOCATION_DEBUG
${sqlite3}
IMPORTED_LOCATION_RELEASE
${sqlite3}
INTERFACE_INCLUDE_DIRECTORIES
${SQLITE_INCLUDE_DIR})
else()
ExternalProject_Add (sqlite3
PREFIX ${nih_cache_path}
# sqlite doesn't use git, but it provides versioned tarballs
URL https://www.sqlite.org/2018/sqlite-amalgamation-3260000.zip
http://www.sqlite.org/2018/sqlite-amalgamation-3260000.zip
https://www2.sqlite.org/2018/sqlite-amalgamation-3260000.zip
http://www2.sqlite.org/2018/sqlite-amalgamation-3260000.zip
# ^^^ version is apparent in the URL: 3260000 => 3.26.0
URL_HASH SHA256=de5dcab133aa339a4cf9e97c40aa6062570086d6085d8f9ad7bc6ddf8a52096e
# Don't need to worry about MITM attacks too much because the download
# is checked against a strong hash
TLS_VERIFY false
# we wrote a very simple CMake file to build sqlite
# so that's what we copy here so that we can build with
# CMake. sqlite doesn't generally provide a build system
# for the single amalgamation source file.
PATCH_COMMAND
${CMAKE_COMMAND} -E copy_if_different
${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/CMake_sqlite3.txt
<SOURCE_DIR>/CMakeLists.txt
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}sqlite3$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>
>
TEST_COMMAND ""
INSTALL_COMMAND ""
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}sqlite3${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}sqlite3_d${ep_lib_suffix}
)
ExternalProject_Get_Property (sqlite3 BINARY_DIR)
ExternalProject_Get_Property (sqlite3 SOURCE_DIR)
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (sqlite3)
endif ()
set_target_properties (sqlite PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/${ep_lib_prefix}sqlite3_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/${ep_lib_prefix}sqlite3${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR})
add_dependencies (sqlite sqlite3)
exclude_if_included (sqlite3)
endif()
target_link_libraries (sqlite INTERFACE $<$<NOT:$<BOOL:${MSVC}>>:dl>)
target_link_libraries (ripple_libs INTERFACE sqlite)
exclude_if_included (sqlite)
set(sqlite_BINARY_DIR ${BINARY_DIR})
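The Builds/CMake/CMake_sqlite3.txt file copied in by PATCH_COMMAND is not part of this diff; the following is only a plausible minimal sketch of what such a file needs to contain to build the amalgamation, not its actual contents:

cmake_minimum_required(VERSION 3.16)
project(sqlite3 C)
# single-translation-unit amalgamation build
add_library(sqlite3 STATIC sqlite3.c)
target_include_directories(sqlite3 PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})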

View File

@@ -81,4 +81,4 @@ if(XAR_LIBRARY)
else()
message(WARNING "xar library not found... (only important for mac builds)")
endif()
add_library (wasmedge::wasmedge ALIAS wasmedge)
add_library (NIH::WasmEdge ALIAS wasmedge)

View File

@@ -1,167 +0,0 @@
if(reporting)
find_library(cassandra NAMES cassandra)
if(NOT cassandra)
message("System installed Cassandra cpp driver not found. Will build")
find_library(zlib NAMES zlib1g-dev zlib-devel zlib z)
if(NOT zlib)
message("zlib not found. will build")
add_library(zlib STATIC IMPORTED GLOBAL)
ExternalProject_Add(zlib_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/madler/zlib.git
GIT_TAG v1.2.12
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <BINARY_DIR>/${ep_lib_prefix}z.a
LOG_BUILD TRUE
LOG_CONFIGURE TRUE
)
ExternalProject_Get_Property (zlib_src SOURCE_DIR)
ExternalProject_Get_Property (zlib_src BINARY_DIR)
set (zlib_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${zlib_src_SOURCE_DIR}/include)
set_target_properties (zlib PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/${ep_lib_prefix}z.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(zlib zlib_src)
file(TO_CMAKE_PATH "${zlib_src_SOURCE_DIR}" zlib_src_SOURCE_DIR)
endif()
find_library(krb5 NAMES krb5-dev libkrb5-dev)
if(NOT krb5)
message("krb5 not found. will build")
add_library(krb5 STATIC IMPORTED GLOBAL)
ExternalProject_Add(krb5_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/krb5/krb5.git
GIT_TAG krb5-1.20-final
UPDATE_COMMAND ""
CONFIGURE_COMMAND autoreconf src && CFLAGS=-fcommon ./src/configure --enable-static --disable-shared > /dev/null
BUILD_IN_SOURCE 1
BUILD_COMMAND make
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <SOURCE_DIR>/lib/${ep_lib_prefix}krb5.a
LOG_BUILD TRUE
)
ExternalProject_Get_Property (krb5_src SOURCE_DIR)
ExternalProject_Get_Property (krb5_src BINARY_DIR)
set (krb5_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${krb5_src_SOURCE_DIR}/include)
set_target_properties (krb5 PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/lib/${ep_lib_prefix}krb5.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(krb5 krb5_src)
file(TO_CMAKE_PATH "${krb5_src_SOURCE_DIR}" krb5_src_SOURCE_DIR)
endif()
find_library(libuv1 NAMES uv1 libuv1 libuv1-dev libuv1:amd64)
if(NOT libuv1)
message("libuv1 not found, will build")
add_library(libuv1 STATIC IMPORTED GLOBAL)
ExternalProject_Add(libuv_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/libuv/libuv.git
GIT_TAG v1.44.2
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <BINARY_DIR>/${ep_lib_prefix}uv_a.a
LOG_BUILD TRUE
LOG_CONFIGURE TRUE
)
ExternalProject_Get_Property (libuv_src SOURCE_DIR)
ExternalProject_Get_Property (libuv_src BINARY_DIR)
set (libuv_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${libuv_src_SOURCE_DIR}/include)
set_target_properties (libuv1 PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/${ep_lib_prefix}uv_a.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(libuv1 libuv_src)
file(TO_CMAKE_PATH "${libuv_src_SOURCE_DIR}" libuv_src_SOURCE_DIR)
endif()
add_library (cassandra STATIC IMPORTED GLOBAL)
ExternalProject_Add(cassandra_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/datastax/cpp-driver.git
GIT_TAG 2.16.2
CMAKE_ARGS
-DLIBUV_ROOT_DIR=${BINARY_DIR}
-DLIBUV_LIBRARY=${BINARY_DIR}/libuv_a.a
-DLIBUV_INCLUDE_DIR=${SOURCE_DIR}/include
-DCASS_BUILD_STATIC=ON
-DCASS_BUILD_SHARED=OFF
-DOPENSSL_ROOT_DIR=/opt/local/openssl
INSTALL_COMMAND ""
BUILD_BYPRODUCTS <BINARY_DIR>/${ep_lib_prefix}cassandra_static.a
LOG_BUILD TRUE
LOG_CONFIGURE TRUE
)
ExternalProject_Get_Property (cassandra_src SOURCE_DIR)
ExternalProject_Get_Property (cassandra_src BINARY_DIR)
set (cassandra_src_SOURCE_DIR "${SOURCE_DIR}")
file (MAKE_DIRECTORY ${cassandra_src_SOURCE_DIR}/include)
set_target_properties (cassandra PROPERTIES
IMPORTED_LOCATION
${BINARY_DIR}/${ep_lib_prefix}cassandra_static.a
INTERFACE_INCLUDE_DIRECTORIES
${SOURCE_DIR}/include)
add_dependencies(cassandra cassandra_src)
if(NOT libuv1)
ExternalProject_Add_StepDependencies(cassandra_src build libuv1)
target_link_libraries(cassandra INTERFACE libuv1)
else()
target_link_libraries(cassandra INTERFACE ${libuv1})
endif()
if(NOT krb5)
ExternalProject_Add_StepDependencies(cassandra_src build krb5)
target_link_libraries(cassandra INTERFACE krb5)
else()
target_link_libraries(cassandra INTERFACE ${krb5})
endif()
if(NOT zlib)
ExternalProject_Add_StepDependencies(cassandra_src build zlib)
target_link_libraries(cassandra INTERFACE zlib)
else()
target_link_libraries(cassandra INTERFACE ${zlib})
endif()
file(TO_CMAKE_PATH "${cassandra_src_SOURCE_DIR}" cassandra_src_SOURCE_DIR)
target_link_libraries(ripple_libs INTERFACE cassandra)
else()
message("Found system installed cassandra cpp driver")
find_path(cassandra_includes NAMES cassandra.h REQUIRED)
target_link_libraries (ripple_libs INTERFACE ${cassandra})
target_include_directories(ripple_libs INTERFACE ${cassandra_includes})
endif()
exclude_if_included (cassandra)
endif()
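All of the above (zlib, krb5, libuv, and the driver itself) collapses under Conan to a packaged config lookup, as the new top-level CMakeLists.txt later in this changeset shows:

if(reporting)
  find_package(cassandra-cpp-driver REQUIRED)
  target_link_libraries(ripple_libs INTERFACE
    cassandra-cpp-driver::cassandra-cpp-driver)
endif()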

View File

@@ -1,18 +0,0 @@
#[===================================================================[
NIH dep: date
the main library is header-only, thus is an INTERFACE lib in CMake.
NOTE: this has been accepted into c++20 so can likely be replaced
when we update to that standard
#]===================================================================]
find_package (date QUIET)
if (NOT TARGET date::date)
FetchContent_Declare(
hh_date_src
GIT_REPOSITORY https://github.com/HowardHinnant/date.git
GIT_TAG fc4cf092f9674f2670fb9177edcdee870399b829
)
FetchContent_MakeAvailable(hh_date_src)
endif ()
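Under Conan the same header-only dependency comes from the packaged config, so the FetchContent fallback disappears (see the new CMakeLists.txt later in this changeset):

find_package(date REQUIRED)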

View File

@@ -1,324 +1,15 @@
# currently linking to unsecure versions...if we switch, we'll
# need to add ssl as a link dependency to the grpc targets
option (use_secure_grpc "use TLS version of grpc libs." OFF)
if (use_secure_grpc)
set (grpc_suffix "")
else ()
set (grpc_suffix "_unsecure")
endif ()
find_package (gRPC 1.23 CONFIG QUIET)
if (TARGET gRPC::gpr AND NOT local_grpc)
get_target_property (_grpc_l gRPC::gpr IMPORTED_LOCATION_DEBUG)
if (NOT _grpc_l)
get_target_property (_grpc_l gRPC::gpr IMPORTED_LOCATION_RELEASE)
endif ()
if (NOT _grpc_l)
get_target_property (_grpc_l gRPC::gpr IMPORTED_LOCATION)
endif ()
message (STATUS "Found cmake config for gRPC. Using ${_grpc_l}.")
else ()
find_package (PkgConfig QUIET)
if (PKG_CONFIG_FOUND)
pkg_check_modules (grpc QUIET "grpc${grpc_suffix}>=1.25" "grpc++${grpc_suffix}" gpr)
endif ()
if (grpc_FOUND)
message (STATUS "Found gRPC using pkg-config. Using ${grpc_gpr_PREFIX}.")
endif ()
add_executable (gRPC::grpc_cpp_plugin IMPORTED)
exclude_if_included (gRPC::grpc_cpp_plugin)
if (grpc_FOUND AND NOT local_grpc)
# use installed grpc (via pkg-config)
macro (add_imported_grpc libname_)
if (static)
set (_search "${CMAKE_STATIC_LIBRARY_PREFIX}${libname_}${CMAKE_STATIC_LIBRARY_SUFFIX}")
else ()
set (_search "${CMAKE_SHARED_LIBRARY_PREFIX}${libname_}${CMAKE_SHARED_LIBRARY_SUFFIX}")
endif()
find_library(_found_${libname_}
NAMES ${_search}
HINTS ${grpc_LIBRARY_DIRS})
if (_found_${libname_})
message (STATUS "importing ${libname_} as ${_found_${libname_}}")
else ()
message (FATAL_ERROR "using pkg-config for grpc, can't find ${_search}")
endif ()
add_library ("gRPC::${libname_}" STATIC IMPORTED GLOBAL)
set_target_properties ("gRPC::${libname_}" PROPERTIES IMPORTED_LOCATION ${_found_${libname_}})
if (grpc_INCLUDE_DIRS)
set_target_properties ("gRPC::${libname_}" PROPERTIES INTERFACE_INCLUDE_DIRECTORIES ${grpc_INCLUDE_DIRS})
endif ()
target_link_libraries (ripple_libs INTERFACE "gRPC::${libname_}")
exclude_if_included ("gRPC::${libname_}")
endmacro ()
set_target_properties (gRPC::grpc_cpp_plugin PROPERTIES
IMPORTED_LOCATION "${grpc_gpr_PREFIX}/bin/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}")
pkg_check_modules (cares QUIET libcares)
if (cares_FOUND)
if (static)
set (_search "${CMAKE_STATIC_LIBRARY_PREFIX}cares${CMAKE_STATIC_LIBRARY_SUFFIX}")
set (_prefix cares_STATIC)
set (_static STATIC)
else ()
set (_search "${CMAKE_SHARED_LIBRARY_PREFIX}cares${CMAKE_SHARED_LIBRARY_SUFFIX}")
set (_prefix cares)
set (_static)
endif()
find_library(_location NAMES ${_search} HINTS ${cares_LIBRARY_DIRS})
if (NOT _location)
message (FATAL_ERROR "using pkg-config for grpc, can't find c-ares")
endif ()
if(${_location} MATCHES "\\.a$")
add_library(c-ares::cares STATIC IMPORTED GLOBAL)
else()
add_library(c-ares::cares SHARED IMPORTED GLOBAL)
endif()
set_target_properties (c-ares::cares PROPERTIES
IMPORTED_LOCATION ${_location}
INTERFACE_INCLUDE_DIRECTORIES "${${_prefix}_INCLUDE_DIRS}"
INTERFACE_LINK_OPTIONS "${${_prefix}_LDFLAGS}"
)
exclude_if_included (c-ares::cares)
else ()
message (FATAL_ERROR "using pkg-config for grpc, can't find c-ares")
endif ()
else ()
#[===========================[
c-ares (grpc requires)
#]===========================]
ExternalProject_Add (c-ares_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/c-ares/c-ares.git
GIT_TAG cares-1_15_0
CMAKE_ARGS
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DCMAKE_INSTALL_PREFIX=<BINARY_DIR>/_installed_
-DCARES_SHARED=OFF
-DCARES_STATIC=ON
-DCARES_STATIC_PIC=ON
-DCARES_INSTALL=ON
-DCARES_MSVC_STATIC_RUNTIME=ON
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E env --unset=DESTDIR ${CMAKE_COMMAND} --build . --config $<CONFIG> --target install
BUILD_BYPRODUCTS
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}cares${ep_lib_suffix}
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}cares_d${ep_lib_suffix}
)
exclude_if_included (c-ares_src)
ExternalProject_Get_Property (c-ares_src BINARY_DIR)
set (cares_binary_dir "${BINARY_DIR}")
add_library (c-ares::cares STATIC IMPORTED GLOBAL)
file (MAKE_DIRECTORY ${BINARY_DIR}/_installed_/include)
set_target_properties (c-ares::cares PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}cares_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}cares${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (c-ares::cares c-ares_src)
exclude_if_included (c-ares::cares)
if (NOT has_zlib)
#[===========================[
zlib (grpc requires)
#]===========================]
if (MSVC)
set (zlib_debug_postfix "d") # zlib cmake sets this internally for MSVC, so we really don't have a choice
set (zlib_base "zlibstatic")
else ()
set (zlib_debug_postfix "_d")
set (zlib_base "z")
endif ()
ExternalProject_Add (zlib_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/madler/zlib.git
GIT_TAG v1.2.11
CMAKE_ARGS
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
-DCMAKE_DEBUG_POSTFIX=${zlib_debug_postfix}
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DCMAKE_INSTALL_PREFIX=<BINARY_DIR>/_installed_
-DBUILD_SHARED_LIBS=OFF
$<$<BOOL:${MSVC}>:
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
"-DCMAKE_C_FLAGS_DEBUG=-MTd"
"-DCMAKE_C_FLAGS_RELEASE=-MT"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
TEST_COMMAND ""
INSTALL_COMMAND
${CMAKE_COMMAND} -E env --unset=DESTDIR ${CMAKE_COMMAND} --build . --config $<CONFIG> --target install
BUILD_BYPRODUCTS
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}${zlib_base}${ep_lib_suffix}
<BINARY_DIR>/_installed_/lib/${ep_lib_prefix}${zlib_base}${zlib_debug_postfix}${ep_lib_suffix}
)
exclude_if_included (zlib_src)
ExternalProject_Get_Property (zlib_src BINARY_DIR)
set (zlib_binary_dir "${BINARY_DIR}")
add_library (ZLIB::ZLIB STATIC IMPORTED GLOBAL)
file (MAKE_DIRECTORY ${BINARY_DIR}/_installed_/include)
set_target_properties (ZLIB::ZLIB PROPERTIES
IMPORTED_LOCATION_DEBUG
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}${zlib_base}${zlib_debug_postfix}${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${BINARY_DIR}/_installed_/lib/${ep_lib_prefix}${zlib_base}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${BINARY_DIR}/_installed_/include)
add_dependencies (ZLIB::ZLIB zlib_src)
exclude_if_included (ZLIB::ZLIB)
endif ()
#[===========================[
grpc
#]===========================]
ExternalProject_Add (grpc_src
PREFIX ${nih_cache_path}
GIT_REPOSITORY https://github.com/grpc/grpc.git
GIT_TAG v1.25.0
CMAKE_ARGS
-DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
-DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
-DCMAKE_CXX_STANDARD=17
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:-DCMAKE_VERBOSE_MAKEFILE=ON>
$<$<BOOL:${CMAKE_TOOLCHAIN_FILE}>:-DCMAKE_TOOLCHAIN_FILE=${CMAKE_TOOLCHAIN_FILE}>
$<$<BOOL:${VCPKG_TARGET_TRIPLET}>:-DVCPKG_TARGET_TRIPLET=${VCPKG_TARGET_TRIPLET}>
$<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON>
-DCMAKE_DEBUG_POSTFIX=_d
$<$<NOT:$<BOOL:${is_multiconfig}>>:-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}>
-DgRPC_BUILD_TESTS=OFF
-DgRPC_BENCHMARK_PROVIDER=""
-DgRPC_BUILD_CSHARP_EXT=OFF
-DgRPC_MSVC_STATIC_RUNTIME=ON
-DgRPC_INSTALL=OFF
-DgRPC_CARES_PROVIDER=package
-Dc-ares_DIR=${cares_binary_dir}/_installed_/lib/cmake/c-ares
-DgRPC_SSL_PROVIDER=package
-DOPENSSL_ROOT_DIR=${OPENSSL_ROOT_DIR}
-DgRPC_PROTOBUF_PROVIDER=package
-DProtobuf_USE_STATIC_LIBS=$<IF:$<AND:$<BOOL:${Protobuf_FOUND}>,$<NOT:$<BOOL:${static}>>>,OFF,ON>
-DProtobuf_INCLUDE_DIR=$<JOIN:$<TARGET_PROPERTY:protobuf::libprotobuf,INTERFACE_INCLUDE_DIRECTORIES>,:_:>
-DProtobuf_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:protobuf::libprotobuf,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:protobuf::libprotobuf,IMPORTED_LOCATION_RELEASE>>
-DProtobuf_PROTOC_LIBRARY=$<IF:$<CONFIG:Debug>,$<TARGET_PROPERTY:protobuf::libprotoc,IMPORTED_LOCATION_DEBUG>,$<TARGET_PROPERTY:protobuf::libprotoc,IMPORTED_LOCATION_RELEASE>>
-DProtobuf_PROTOC_EXECUTABLE=$<TARGET_PROPERTY:protobuf::protoc,IMPORTED_LOCATION>
-DgRPC_ZLIB_PROVIDER=package
$<$<NOT:$<BOOL:${has_zlib}>>:-DZLIB_ROOT=${zlib_binary_dir}/_installed_>
$<$<BOOL:${MSVC}>:
"-DCMAKE_CXX_FLAGS=-GR -Gd -fp:precise -FS -EHa -MP"
"-DCMAKE_C_FLAGS=-GR -Gd -fp:precise -FS -MP"
>
LOG_BUILD ON
LOG_CONFIGURE ON
BUILD_COMMAND
${CMAKE_COMMAND}
--build .
--config $<CONFIG>
--parallel ${ep_procs}
$<$<BOOL:${is_multiconfig}>:
COMMAND
${CMAKE_COMMAND} -E copy
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}grpc${grpc_suffix}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}grpc++${grpc_suffix}$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}address_sorting$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/${ep_lib_prefix}gpr$<$<CONFIG:Debug>:_d>${ep_lib_suffix}
<BINARY_DIR>/$<CONFIG>/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}
<BINARY_DIR>
>
LIST_SEPARATOR :_:
TEST_COMMAND ""
INSTALL_COMMAND ""
DEPENDS c-ares_src
BUILD_BYPRODUCTS
<BINARY_DIR>/${ep_lib_prefix}grpc${grpc_suffix}${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}grpc${grpc_suffix}_d${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}grpc++${grpc_suffix}${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}grpc++${grpc_suffix}_d${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}address_sorting${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}address_sorting_d${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}gpr${ep_lib_suffix}
<BINARY_DIR>/${ep_lib_prefix}gpr_d${ep_lib_suffix}
<BINARY_DIR>/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}
)
if (TARGET protobuf_src)
ExternalProject_Add_StepDependencies(grpc_src build protobuf_src)
endif ()
exclude_if_included (grpc_src)
ExternalProject_Get_Property (grpc_src BINARY_DIR)
ExternalProject_Get_Property (grpc_src SOURCE_DIR)
set (grpc_binary_dir "${BINARY_DIR}")
set (grpc_source_dir "${SOURCE_DIR}")
if (CMAKE_VERBOSE_MAKEFILE)
print_ep_logs (grpc_src)
endif ()
file (MAKE_DIRECTORY ${SOURCE_DIR}/include)
macro (add_imported_grpc libname_)
add_library ("gRPC::${libname_}" STATIC IMPORTED GLOBAL)
set_target_properties ("gRPC::${libname_}" PROPERTIES
IMPORTED_LOCATION_DEBUG
${grpc_binary_dir}/${ep_lib_prefix}${libname_}_d${ep_lib_suffix}
IMPORTED_LOCATION_RELEASE
${grpc_binary_dir}/${ep_lib_prefix}${libname_}${ep_lib_suffix}
INTERFACE_INCLUDE_DIRECTORIES
${grpc_source_dir}/include)
add_dependencies ("gRPC::${libname_}" grpc_src)
target_link_libraries (ripple_libs INTERFACE "gRPC::${libname_}")
exclude_if_included ("gRPC::${libname_}")
endmacro ()
set_target_properties (gRPC::grpc_cpp_plugin PROPERTIES
IMPORTED_LOCATION "${grpc_binary_dir}/grpc_cpp_plugin${CMAKE_EXECUTABLE_SUFFIX}")
add_dependencies (gRPC::grpc_cpp_plugin grpc_src)
endif ()
add_imported_grpc (gpr)
add_imported_grpc ("grpc${grpc_suffix}")
add_imported_grpc ("grpc++${grpc_suffix}")
add_imported_grpc (address_sorting)
target_link_libraries ("gRPC::grpc${grpc_suffix}" INTERFACE c-ares::cares gRPC::gpr gRPC::address_sorting ZLIB::ZLIB)
target_link_libraries ("gRPC::grpc++${grpc_suffix}" INTERFACE "gRPC::grpc${grpc_suffix}" gRPC::gpr)
endif ()
find_package(gRPC 1.23)
#[=================================[
generate protobuf sources for
grpc defs and bundle into a
static lib
#]=================================]
set (GRPC_GEN_DIR "${CMAKE_BINARY_DIR}/proto_gen_grpc")
file (MAKE_DIRECTORY ${GRPC_GEN_DIR})
set (GRPC_PROTO_SRCS)
set (GRPC_PROTO_HDRS)
set (GRPC_PROTO_ROOT "${CMAKE_CURRENT_SOURCE_DIR}/src/ripple/proto/org")
set(GRPC_GEN_DIR "${CMAKE_BINARY_DIR}/proto_gen_grpc")
file(MAKE_DIRECTORY ${GRPC_GEN_DIR})
set(GRPC_PROTO_SRCS)
set(GRPC_PROTO_HDRS)
set(GRPC_PROTO_ROOT "${CMAKE_CURRENT_SOURCE_DIR}/src/ripple/proto/org")
file(GLOB_RECURSE GRPC_DEFINITION_FILES LIST_DIRECTORIES false "${GRPC_PROTO_ROOT}/*.proto")
foreach(file ${GRPC_DEFINITION_FILES})
get_filename_component(_abs_file ${file} ABSOLUTE)
@@ -329,10 +20,10 @@ foreach(file ${GRPC_DEFINITION_FILES})
get_filename_component(_rel_root_dir ${_rel_root_file} DIRECTORY)
file(RELATIVE_PATH _rel_dir ${CMAKE_CURRENT_SOURCE_DIR} ${_abs_dir})
set (src_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.cc")
set (src_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.cc")
set (hdr_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.h")
set (hdr_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.h")
set(src_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.cc")
set(src_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.cc")
set(hdr_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.h")
set(hdr_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.h")
add_custom_command(
OUTPUT ${src_1} ${src_2} ${hdr_1} ${hdr_2}
COMMAND protobuf::protoc
@@ -350,20 +41,22 @@ foreach(file ${GRPC_DEFINITION_FILES})
list(APPEND GRPC_PROTO_HDRS ${hdr_1} ${hdr_2})
endforeach()
add_library (grpc_pbufs STATIC ${GRPC_PROTO_SRCS} ${GRPC_PROTO_HDRS})
#target_include_directories (grpc_pbufs PRIVATE src)
target_include_directories (grpc_pbufs SYSTEM PUBLIC ${GRPC_GEN_DIR})
target_link_libraries (grpc_pbufs protobuf::libprotobuf "gRPC::grpc++${grpc_suffix}")
target_compile_options (grpc_pbufs
add_library(grpc_pbufs STATIC ${GRPC_PROTO_SRCS} ${GRPC_PROTO_HDRS})
#target_include_directories(grpc_pbufs PRIVATE src)
target_include_directories(grpc_pbufs SYSTEM PUBLIC ${GRPC_GEN_DIR})
target_link_libraries(grpc_pbufs
"gRPC::grpc++"
# libgrpc is missing references.
absl::random_random
)
target_compile_options(grpc_pbufs
PRIVATE
$<$<BOOL:${MSVC}>:-wd4065>
$<$<NOT:$<BOOL:${MSVC}>>:-Wno-deprecated-declarations>
PUBLIC
$<$<BOOL:${MSVC}>:-wd4996>
$<$<BOOL:${is_xcode}>:
$<$<BOOL:${XCODE}>:
--system-header-prefix="google/protobuf"
-Wno-deprecated-dynamic-exception-spec
>)
add_library (Ripple::grpc_pbufs ALIAS grpc_pbufs)
target_link_libraries (ripple_libs INTERFACE Ripple::grpc_pbufs)
exclude_if_included (grpc_pbufs)
add_library(Ripple::grpc_pbufs ALIAS grpc_pbufs)
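The body of the add_custom_command is elided by the hunk above. For orientation only, a typical shape of the generation step; the include path is an assumption, not taken from this diff:

add_custom_command(
  OUTPUT ${src_1} ${src_2} ${hdr_1} ${hdr_2}
  COMMAND protobuf::protoc
  ARGS --grpc_out=${GRPC_GEN_DIR}
       --cpp_out=${GRPC_GEN_DIR}
       --proto_path=${CMAKE_CURRENT_SOURCE_DIR}/src/ripple/proto   # assumed
       --plugin=protoc-gen-grpc=$<TARGET_FILE:gRPC::grpc_cpp_plugin>
       ${_abs_file}
  DEPENDS ${_abs_file} protobuf::protoc gRPC::grpc_cpp_plugin
  VERBATIM)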

View File

@@ -1,15 +0,0 @@
set (THIRDPARTY_LIBS "")
if(WITH_SNAPPY)
add_definitions(-DSNAPPY)
include_directories(${snappy_INCLUDE_DIRS})
set (THIRDPARTY_LIBS ${THIRDPARTY_LIBS} ${snappy_LIBRARIES})
endif()
if(WITH_LZ4)
add_definitions(-DLZ4)
include_directories(${lz4_INCLUDE_DIRS})
set (THIRDPARTY_LIBS ${THIRDPARTY_LIBS} ${lz4_LIBRARIES})
endif()

View File

@@ -1,71 +0,0 @@
// Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
#include <memory>
#include "rocksdb/version.h"
#include "util/string_util.h"
// The build script may replace these values with real values based
// on whether or not GIT is available and the platform settings
static const std::string rocksdb_build_git_sha = "rocksdb_build_git_sha:@GIT_SHA@";
static const std::string rocksdb_build_git_tag = "rocksdb_build_git_tag:@GIT_TAG@";
#define HAS_GIT_CHANGES @GIT_MOD@
#if HAS_GIT_CHANGES == 0
// If HAS_GIT_CHANGES is 0, the GIT date is used.
// Use the time the branch/tag was last modified
static const std::string rocksdb_build_date = "rocksdb_build_date:@GIT_DATE@";
#else
// If HAS_GIT_CHANGES is > 0, the branch/tag has modifications.
// Use the time the build was created.
static const std::string rocksdb_build_date = "rocksdb_build_date:@BUILD_DATE@";
#endif
namespace ROCKSDB_NAMESPACE {
static void AddProperty(std::unordered_map<std::string, std::string> *props, const std::string& name) {
size_t colon = name.find(":");
if (colon != std::string::npos && colon > 0 && colon < name.length() - 1) {
// If we found a "@:", then this property was a build-time substitution that failed. Skip it
size_t at = name.find("@", colon);
if (at != colon + 1) {
// Everything before the colon is the name, after is the value
(*props)[name.substr(0, colon)] = name.substr(colon + 1);
}
}
}
static std::unordered_map<std::string, std::string>* LoadPropertiesSet() {
auto * properties = new std::unordered_map<std::string, std::string>();
AddProperty(properties, rocksdb_build_git_sha);
AddProperty(properties, rocksdb_build_git_tag);
AddProperty(properties, rocksdb_build_date);
return properties;
}
const std::unordered_map<std::string, std::string>& GetRocksBuildProperties() {
static std::unique_ptr<std::unordered_map<std::string, std::string>> props(LoadPropertiesSet());
return *props;
}
std::string GetRocksVersionAsString(bool with_patch) {
std::string version = ToString(ROCKSDB_MAJOR) + "." + ToString(ROCKSDB_MINOR);
if (with_patch) {
return version + "." + ToString(ROCKSDB_PATCH);
} else {
return version;
}
}
std::string GetRocksBuildInfoAsString(const std::string& program, bool verbose) {
std::string info = program + " (RocksDB) " + GetRocksVersionAsString(true);
if (verbose) {
for (const auto& it : GetRocksBuildProperties()) {
info.append("\n ");
info.append(it.first);
info.append(": ");
info.append(it.second);
}
}
return info;
}
} // namespace ROCKSDB_NAMESPACE

View File

@@ -1,49 +0,0 @@
# This patches unsigned-types.h in the soci official sources
# so as to remove type range check exceptions that cause
# us trouble when using boost::optional to select int values
# Soci's CMake setup leaves flags in place that will cause warnings to
# be treated as errors, but some compiler versions throw "new" warnings
# that then cause the build to fail. Simplify that until soci fixes
# those warnings.
if (RIPPLED_SOURCE)
execute_process( COMMAND ${CMAKE_COMMAND} -E copy_if_different
${RIPPLED_SOURCE}/Builds/CMake/SociConfig.cmake.patched
cmake/SociConfig.cmake )
endif ()
# Some versions of CMake erroneously patch external projects on every build.
# If the patch makes no changes, skip it. This workaround can be
# removed once we stop supporting vulnerable versions of CMake.
# https://gitlab.kitware.com/cmake/cmake/-/issues/21086
file (STRINGS include/soci/unsigned-types.h sourcecode)
# Delete the .patched file if it exists, so it doesn't end up duplicated.
# Trying to remove a file that does not exist is not a problem.
file (REMOVE include/soci/unsigned-types.h.patched)
foreach (line_ ${sourcecode})
if (line_ MATCHES "^[ \\t]+throw[ ]+soci_error[ ]*\\([ ]*\"Value outside of allowed.+$")
set (line_ "//${CMAKE_MATCH_0}")
endif ()
file (APPEND include/soci/unsigned-types.h.patched "${line_}\n")
endforeach ()
execute_process( COMMAND ${CMAKE_COMMAND} -E compare_files
include/soci/unsigned-types.h include/soci/unsigned-types.h.patched
RESULT_VARIABLE compare_result
)
if( compare_result EQUAL 0)
message(DEBUG "The soci source and patch files are identical. Make no changes.")
file (REMOVE include/soci/unsigned-types.h.patched)
return()
endif()
file (RENAME include/soci/unsigned-types.h include/soci/unsigned-types.h.orig)
file (RENAME include/soci/unsigned-types.h.patched include/soci/unsigned-types.h)
# also fix Boost.cmake so that it just returns when we override the Boost_FOUND var
file (APPEND cmake/dependencies/Boost.cmake.patched "if (Boost_FOUND)\n")
file (APPEND cmake/dependencies/Boost.cmake.patched " return ()\n")
file (APPEND cmake/dependencies/Boost.cmake.patched "endif ()\n")
file (STRINGS cmake/dependencies/Boost.cmake sourcecode)
foreach (line_ ${sourcecode})
file (APPEND cmake/dependencies/Boost.cmake.patched "${line_}\n")
endforeach ()
file (RENAME cmake/dependencies/Boost.cmake.patched cmake/dependencies/Boost.cmake)
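For reference, this script is not run standalone; the Soci external project earlier in this changeset drives it in CMake script mode:

PATCH_COMMAND
  ${CMAKE_COMMAND} -D RIPPLED_SOURCE=${CMAKE_CURRENT_SOURCE_DIR}
  -P ${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/soci_patch.cmake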

View File

@@ -53,7 +53,7 @@ Loop: test.app test.jtx
test.app > test.jtx
Loop: test.app test.rpc
test.rpc ~= test.app
test.rpc == test.app
Loop: test.jtx test.toplevel
test.toplevel > test.jtx

View File

@@ -1,7 +1,4 @@
cmake_minimum_required (VERSION 3.16)
set(CMAKE_CXX_EXTENSIONS OFF)
set(CMAKE_CXX_STANDARD 20)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
cmake_minimum_required(VERSION 3.16)
if(POLICY CMP0074)
cmake_policy(SET CMP0074 NEW)
@@ -14,11 +11,11 @@ endif()
file(TO_CMAKE_PATH "${CMAKE_MODULE_PATH}" CMAKE_MODULE_PATH)
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake")
if(POLICY CMP0144)
cmake_policy(SET CMP0144 NEW)
endif()
project(rippled)
set(CMAKE_CXX_EXTENSIONS OFF)
set(CMAKE_CXX_STANDARD 20)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
project (rippled)
set(Boost_NO_BOOST_CMAKE ON)
# make GIT_COMMIT_HASH define available to all sources
@@ -39,22 +36,6 @@ if(thread_safety_analysis)
add_link_options("-stdlib=libc++")
endif()
option(USE_CONAN "Use Conan package manager for dependencies" OFF)
# Then, auto-detect if conan_toolchain.cmake is being used
if(CMAKE_TOOLCHAIN_FILE)
# Check if the toolchain file path contains "conan_toolchain"
if(CMAKE_TOOLCHAIN_FILE MATCHES "conan_toolchain")
set(USE_CONAN ON CACHE BOOL "Using Conan detected from toolchain file" FORCE)
message(STATUS "Conan toolchain detected: ${CMAKE_TOOLCHAIN_FILE}")
message(STATUS "Building with Conan dependencies")
endif()
endif()
if (NOT USE_CONAN)
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake")
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/deps")
endif()
include (CheckCXXCompilerFlag)
include (FetchContent)
include (ExternalProject)
@@ -67,9 +48,6 @@ endif ()
include(RippledSanity)
include(RippledVersion)
include(RippledSettings)
if (NOT USE_CONAN)
include(RippledNIH)
endif()
# this check has to remain in the top-level cmake
# because of the early return statement
if (packages_only)
@@ -82,83 +60,66 @@ include(RippledCompiler)
include(RippledInterface)
###
if (NOT USE_CONAN)
add_subdirectory(src/secp256k1)
add_subdirectory(src/ed25519-donna)
include(deps/Boost)
include(deps/OpenSSL)
# include(deps/Secp256k1)
# include(deps/Ed25519-donna)
include(deps/Lz4)
include(deps/Libarchive)
include(deps/Sqlite)
include(deps/Soci)
include(deps/Snappy)
include(deps/Rocksdb)
include(deps/Nudb)
include(deps/date)
include(deps/Protobuf)
include(deps/gRPC)
include(deps/cassandra)
include(deps/Postgres)
include(deps/WasmEdge)
else()
include(conan/Boost)
find_package(OpenSSL 1.1.1 REQUIRED)
set_target_properties(OpenSSL::SSL PROPERTIES
INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2
)
add_subdirectory(src/secp256k1)
add_subdirectory(src/ed25519-donna)
find_package(lz4 REQUIRED)
# Target names with :: are not allowed in a generator expression.
# We need to pull the include directories and imported location properties
# from separate targets.
find_package(LibArchive REQUIRED)
find_package(SOCI REQUIRED)
find_package(SQLite3 REQUIRED)
find_package(Snappy REQUIRED)
find_package(wasmedge REQUIRED)
option(rocksdb "Enable RocksDB" ON)
if(rocksdb)
find_package(RocksDB REQUIRED)
set_target_properties(RocksDB::rocksdb PROPERTIES
INTERFACE_COMPILE_DEFINITIONS RIPPLE_ROCKSDB_AVAILABLE=1
)
target_link_libraries(ripple_libs INTERFACE RocksDB::rocksdb)
endif()
find_package(nudb REQUIRED)
find_package(date REQUIRED)
include(conan/Protobuf)
include(conan/gRPC)
if(TARGET nudb::core)
set(nudb nudb::core)
elseif(TARGET NuDB::nudb)
set(nudb NuDB::nudb)
else()
message(FATAL_ERROR "unknown nudb target")
endif()
target_link_libraries(ripple_libs INTERFACE ${nudb})
if(reporting)
find_package(cassandra-cpp-driver REQUIRED)
find_package(PostgreSQL REQUIRED)
target_link_libraries(ripple_libs INTERFACE
cassandra-cpp-driver::cassandra-cpp-driver
PostgreSQL::PostgreSQL
)
endif()
include(deps/Boost)
find_package(OpenSSL 1.1.1 REQUIRED)
set_target_properties(OpenSSL::SSL PROPERTIES
INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2
)
add_subdirectory(src/secp256k1)
add_subdirectory(src/ed25519-donna)
find_package(lz4 REQUIRED)
# Target names with :: are not allowed in a generator expression.
# We need to pull the include directories and imported location properties
# from separate targets.
find_package(LibArchive REQUIRED)
find_package(SOCI REQUIRED)
find_package(SQLite3 REQUIRED)
find_package(Snappy REQUIRED)
find_package(wasmedge REQUIRED)
option(rocksdb "Enable RocksDB" ON)
if(rocksdb)
find_package(RocksDB REQUIRED)
set_target_properties(RocksDB::rocksdb PROPERTIES
INTERFACE_COMPILE_DEFINITIONS RIPPLE_ROCKSDB_AVAILABLE=1
)
target_link_libraries(ripple_libs INTERFACE RocksDB::rocksdb)
endif()
find_package(nudb REQUIRED)
find_package(date REQUIRED)
include(deps/Protobuf)
include(deps/gRPC)
target_link_libraries(ripple_libs INTERFACE
ed25519::ed25519
LibArchive::LibArchive
lz4::lz4
OpenSSL::Crypto
OpenSSL::SSL
Ripple::grpc_pbufs
Ripple::pbufs
secp256k1::secp256k1
soci::soci
SQLite::SQLite3
)
if(TARGET nudb::core)
set(nudb nudb::core)
elseif(TARGET NuDB::nudb)
set(nudb NuDB::nudb)
else()
message(FATAL_ERROR "unknown nudb target")
endif()
target_link_libraries(ripple_libs INTERFACE ${nudb})
if(reporting)
find_package(cassandra-cpp-driver REQUIRED)
find_package(PostgreSQL REQUIRED)
target_link_libraries(ripple_libs INTERFACE
ed25519::ed25519
LibArchive::LibArchive
lz4::lz4
OpenSSL::Crypto
OpenSSL::SSL
Ripple::grpc_pbufs
Ripple::pbufs
secp256k1::secp256k1
soci::soci
SQLite::SQLite3
cassandra-cpp-driver::cassandra-cpp-driver
PostgreSQL::PostgreSQL
)
endif()
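A typical way to drive this file with Conan (Conan 2 syntax; the profile, folders, and build type are assumptions, not taken from this changeset):

# conan install . --output-folder=build --build=missing
# cmake -B build -DCMAKE_TOOLCHAIN_FILE=build/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release
# cmake --build build --parallel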

View File

@@ -1,4 +1,4 @@
# Xahau
**Note:** Throughout this README, references to "we" or "our" pertain to the community and contributors involved in the Xahau network. It does not imply a legal entity or a specific collection of individuals.
@@ -67,5 +67,5 @@ git-subtree. See those directories' README files for more details.
- [explorer.xahau.network](https://explorer.xahau.network)
- **Testnet & Faucet**: Test applications and obtain test XAH at [xahau-test.net](https://xahau-test.net) and use the testnet explorer at [explorer.xahau.network](https://explorer.xahau.network).
- **Supporting Wallets**: A list of wallets that support XAH and Xahau-based assets.
- [Xaman](https://xaman.app)
- [Crossmark](https://crossmark.io)
- [Xumm](https://xumm.app)
- [Crossmark](https://crossmark.io)

View File

@@ -8,130 +8,6 @@ This document contains the release notes for `rippled`, the reference server imp
Have new ideas? Need help with setting up your node? [Please open an issue here](https://github.com/xrplf/rippled/issues/new/choose).
# Introducing XRP Ledger version 1.11.0
Version 1.11.0 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available.
This release reduces memory usage, introduces the `fixNFTokenRemint` amendment, and adds new features and bug fixes. For example, the new NetworkID field in transactions helps to prevent replay attacks with side-chains.
[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)
<!-- BREAK -->
## Action Required
The `fixNFTokenRemint` amendment is now open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators.
If you operate an XRP Ledger server, upgrade to version 1.11.0 by July 5 to ensure service continuity. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network.
## Install / Upgrade
On supported platforms, see the [instructions on installing or updating `rippled`](https://xrpl.org/install-rippled.html).
## What's Changed
### New Features and Improvements
* Allow port numbers to be specified using either a colon or a space by @RichardAH in https://github.com/XRPLF/rippled/pull/4328
* Eliminate memory allocation from critical path: by @nbougalis in https://github.com/XRPLF/rippled/pull/4353
* Make it easy for projects to depend on libxrpl by @thejohnfreeman in https://github.com/XRPLF/rippled/pull/4449
* Add the ability to mark amendments as obsolete by @ximinez in https://github.com/XRPLF/rippled/pull/4291
* Always create the FeeSettings object in genesis ledger by @ximinez in https://github.com/XRPLF/rippled/pull/4319
* Log exception messages in several locations by @drlongle in https://github.com/XRPLF/rippled/pull/4400
* Parse flags in account_info method by @drlongle in https://github.com/XRPLF/rippled/pull/4459
* Add NFTokenPages to account_objects RPC by @RichardAH in https://github.com/XRPLF/rippled/pull/4352
* add jss fields used by clio `nft_info` by @ledhed2222 in https://github.com/XRPLF/rippled/pull/4320
* Introduce a slab-based memory allocator and optimize SHAMapItem by @nbougalis in https://github.com/XRPLF/rippled/pull/4218
* Add NetworkID field to transactions to help prevent replay attacks on and from side-chains by @RichardAH in https://github.com/XRPLF/rippled/pull/4370
* If present, set quorum based on command line. by @mtrippled in https://github.com/XRPLF/rippled/pull/4489
* API does not accept seed or public key for account by @drlongle in https://github.com/XRPLF/rippled/pull/4404
* Add `nftoken_id`, `nftoken_ids` and `offer_id` meta fields into NFT `Tx` responses by @shawnxie999 in https://github.com/XRPLF/rippled/pull/4447
### Bug Fixes
* fix(gateway_balances): handle overflow exception by @RichardAH in https://github.com/XRPLF/rippled/pull/4355
* fix(ValidatorSite): handle rare null pointer dereference in timeout by @ximinez in https://github.com/XRPLF/rippled/pull/4420
* RPC commands understand markers derived from all ledger object types by @ximinez in https://github.com/XRPLF/rippled/pull/4361
* `fixNFTokenRemint`: prevent NFT re-mint: by @shawnxie999 in https://github.com/XRPLF/rippled/pull/4406
* Fix a case where ripple::Expected returned a json array, not a value by @scottschurr in https://github.com/XRPLF/rippled/pull/4401
* fix: Ledger data returns an empty list (instead of null) when all entries are filtered out by @drlongle in https://github.com/XRPLF/rippled/pull/4398
* Fix unit test ripple.app.LedgerData by @drlongle in https://github.com/XRPLF/rippled/pull/4484
* Fix the fix for std::result_of by @thejohnfreeman in https://github.com/XRPLF/rippled/pull/4496
* Fix errors for Clang 16 by @thejohnfreeman in https://github.com/XRPLF/rippled/pull/4501
* Ensure that switchover vars are initialized before use: by @seelabs in https://github.com/XRPLF/rippled/pull/4527
* Move faulty assert by @ximinez in https://github.com/XRPLF/rippled/pull/4533
* Fix unaligned load and stores: (#4528) by @seelabs in https://github.com/XRPLF/rippled/pull/4531
* fix node size estimation by @dangell7 in https://github.com/XRPLF/rippled/pull/4536
* fix: remove redundant moves by @ckeshava in https://github.com/XRPLF/rippled/pull/4565
### Code Cleanup and Testing
* Replace compare() with the three-way comparison operator in base_uint, Issue and Book by @drlongle in https://github.com/XRPLF/rippled/pull/4411
* Rectify the import paths of boost::function_output_iterator by @ckeshava in https://github.com/XRPLF/rippled/pull/4293
* Expand Linux test matrix by @thejohnfreeman in https://github.com/XRPLF/rippled/pull/4454
* Add patched recipe for SOCI by @thejohnfreeman in https://github.com/XRPLF/rippled/pull/4510
* Switch to self-hosted runners for macOS by @thejohnfreeman in https://github.com/XRPLF/rippled/pull/4511
* [TRIVIAL] Add missing includes by @seelabs in https://github.com/XRPLF/rippled/pull/4555
### Docs
* Refactor build instructions by @thejohnfreeman in https://github.com/XRPLF/rippled/pull/4381
* Add install instructions for package managers by @thejohnfreeman in https://github.com/XRPLF/rippled/pull/4472
* Fix typo by @solmsted in https://github.com/XRPLF/rippled/pull/4508
* Update environment.md by @sappenin in https://github.com/XRPLF/rippled/pull/4498
* Update BUILD.md by @oeggert in https://github.com/XRPLF/rippled/pull/4514
* Trivial: add comments for NFToken-related invariants by @scottschurr in https://github.com/XRPLF/rippled/pull/4558
## New Contributors
* @drlongle made their first contribution in https://github.com/XRPLF/rippled/pull/4411
* @ckeshava made their first contribution in https://github.com/XRPLF/rippled/pull/4293
* @solmsted made their first contribution in https://github.com/XRPLF/rippled/pull/4508
* @sappenin made their first contribution in https://github.com/XRPLF/rippled/pull/4498
* @oeggert made their first contribution in https://github.com/XRPLF/rippled/pull/4514
**Full Changelog**: https://github.com/XRPLF/rippled/compare/1.10.1...1.11.0
### GitHub
The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.
We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.
### Credits
The following people contributed directly to this release:
- Alloy Networks <45832257+alloynetworks@users.noreply.github.com>
- Brandon Wilson <brandon@coil.com>
- Chenna Keshava B S <21219765+ckeshava@users.noreply.github.com>
- David Fuelling <sappenin@gmail.com>
- Denis Angell <dangell@transia.co>
- Ed Hennis <ed@ripple.com>
- Elliot Lee <github.public@intelliot.com>
- John Freeman <jfreeman08@gmail.com>
- Mark Travis <mtrippled@users.noreply.github.com>
- Nik Bougalis <nikb@bougalis.net>
- RichardAH <richard.holland@starstone.co.nz>
- Scott Determan <scott.determan@yahoo.com>
- Scott Schurr <scott@ripple.com>
- Shawn Xie <35279399+shawnxie999@users.noreply.github.com>
- drlongle <drlongle@gmail.com>
- ledhed2222 <ledhed2222@users.noreply.github.com>
- oeggert <117319296+oeggert@users.noreply.github.com>
- solmsted <steven.olm@gmail.com>
Bug Bounties and Responsible Disclosures:
We welcome reviews of the rippled code and urge researchers to
responsibly disclose any issues they may find.
To report a bug, please send a detailed report to:
bugs@xrpl.org
# Introducing XRP Ledger version 1.10.1
Version 1.10.1 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release restores packages for Ubuntu 18.04.


@@ -61,12 +61,13 @@ For these complaints or reports, please [contact our support team](mailto:bugs@x
### The following types of security problems are excluded
1. **In scope**. Only bugs in software under the scope of the program qualify. Currently, that means `xahaud` and `xahau-lib`.
2. **Relevant**. A security issue, posing a danger to user funds, privacy or the operation of the Xahau Ledger.
3. **Original and previously unknown**. Bugs that are already known and discussed in public do not qualify. Previously reported bugs, even if publicly unknown, are not eligible.
4. **Specific**. We welcome general security advice or recommendations, but we cannot pay bounties for that.
5. **Fixable**. There has to be something we can do to permanently fix the problem. Note that bugs in other people's software may still qualify in some cases. For example, if you find a bug in a library that we use which can compromise the security of software that is in scope and we can get it fixed, you may qualify for a bounty.
6. **Unused**. If you use the exploit to attack the Xahau Ledger, you do not qualify for a bounty. If you report a vulnerability used in an ongoing or past attack and there is specific, concrete evidence that suggests you are the attacker, we reserve the right not to pay a bounty.
- (D)DOS attacks
- Error messages or error pages without sensitive data
- Tests & sample data as publicly available in our repositories at Github
- Common issues like browser header warnings or DNS configuration, identified by vulnerability scans
- Vulnerability scan reports for software we publicly use
- Security issues related to outdated OS's, browsers or plugins
- Reports for security problems that we have been notified of before
Please note: Reports that are lacking any proof (such as screenshots or other data), detailed information or details on how to reproduce any unexpected result will be investigated but will not be eligible for any reward.


@@ -1,11 +1,4 @@
#!/bin/bash -u
# We use set -e and bash with -u to bail on first non zero exit code of any
# processes launched or upon any unbound variable.
# We use set -x to print commands before running them to help with
# debugging.
set -ex
set -e
#!/bin/bash
echo "START INSIDE CONTAINER - CORE"
@@ -30,12 +23,12 @@ fi
perl -i -pe "s/^(\\s*)-DBUILD_SHARED_LIBS=OFF/\\1-DBUILD_SHARED_LIBS=OFF\\n\\1-DROCKSDB_BUILD_SHARED=OFF/g" Builds/CMake/deps/Rocksdb.cmake &&
mv Builds/CMake/deps/WasmEdge.cmake Builds/CMake/deps/WasmEdge.old &&
echo "find_package(LLVM REQUIRED CONFIG)
message(STATUS \"Found LLVM \${LLVM_PACKAGE_VERSION}\")
message(STATUS \"Found LLVM ${LLVM_PACKAGE_VERSION}\")
message(STATUS \"Using LLVMConfig.cmake in: \${LLVM_DIR}\")
add_library (wasmedge STATIC IMPORTED GLOBAL)
set_target_properties(wasmedge PROPERTIES IMPORTED_LOCATION \${WasmEdge_LIB})
target_link_libraries (ripple_libs INTERFACE wasmedge)
add_library (wasmedge::wasmedge ALIAS wasmedge)
add_library (NIH::WasmEdge ALIAS wasmedge)
message(\"WasmEdge DONE\")
" > Builds/CMake/deps/WasmEdge.cmake &&
git checkout src/ripple/protocol/impl/BuildInfo.cpp &&

build-full.sh Normal file → Executable file

@@ -1,11 +1,4 @@
#!/bin/bash -u
# We use set -e and bash with -u to bail on first non zero exit code of any
# processes launched or upon any unbound variable.
# We use set -x to print commands before running them to help with
# debugging.
set -ex
set -e
#!/bin/bash
echo "START INSIDE CONTAINER - FULL"
@@ -16,17 +9,8 @@ echo "-- GITHUB_RUN_NUMBER: $4"
umask 0000;
echo "Fixing CentOS 7 EOL"
sed -i 's/mirrorlist/#mirrorlist/g' /etc/yum.repos.d/CentOS-*
sed -i 's|#baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g' /etc/yum.repos.d/CentOS-*
yum clean all
yum-config-manager --disable centos-sclo-sclo
####
cd /io;
mkdir -p src/certs;
mkdir src/certs;
curl --silent -k https://raw.githubusercontent.com/RichardAH/rippled-release-builder/main/ca-bundle/certbundle.h -o src/certs/certbundle.h;
if [ "`grep certbundle.h src/ripple/net/impl/RegisterSSLCerts.cpp | wc -l`" -eq "0" ]
then
@@ -73,8 +57,8 @@ then
#endif/g" src/ripple/net/impl/RegisterSSLCerts.cpp &&
sed -i "s/#include <ripple\/net\/RegisterSSLCerts.h>/\0\n#include <certs\/certbundle.h>/g" src/ripple/net/impl/RegisterSSLCerts.cpp
fi
mkdir -p .nih_c;
mkdir -p .nih_toolchain;
mkdir .nih_c;
mkdir .nih_toolchain;
cd .nih_toolchain &&
yum install -y wget lz4 lz4-devel git llvm13-static.x86_64 llvm13-devel.x86_64 devtoolset-10-binutils zlib-static ncurses-static -y \
devtoolset-7-gcc-c++ \
@@ -97,11 +81,11 @@ echo "-- Install Cmake 3.23.1 --" &&
pwd &&
( wget -nc -q https://github.com/Kitware/CMake/releases/download/v3.23.1/cmake-3.23.1-linux-x86_64.tar.gz; echo "" ) &&
tar -xzf cmake-3.23.1-linux-x86_64.tar.gz -C /hbb/ &&
echo "-- Install Boost 1.86.0 --" &&
echo "-- Install Boost 1.75.0 --" &&
pwd &&
( wget -nc -q https://archives.boost.io/release/1.86.0/source/boost_1_86_0.tar.gz; echo "" ) &&
tar -xzf boost_1_86_0.tar.gz &&
cd boost_1_86_0 && ./bootstrap.sh && ./b2 link=static -j$3 && ./b2 install &&
( wget -nc -q https://boostorg.jfrog.io/artifactory/main/release/1.75.0/source/boost_1_75_0.tar.gz; echo "" ) &&
tar -xzf boost_1_75_0.tar.gz &&
cd boost_1_75_0 && ./bootstrap.sh && ./b2 link=static -j$3 && ./b2 install &&
cd ../ &&
echo "-- Install Protobuf 3.20.0 --" &&
pwd &&
@@ -122,7 +106,7 @@ tar -xf libunwind-13.0.1.src.tar.xz &&
cp -r libunwind-13.0.1.src/include libunwind-13.0.1.src/src lld-13.0.1.src/ &&
cd lld-13.0.1.src &&
rm -rf build CMakeCache.txt &&
mkdir -p build &&
mkdir build &&
cd build &&
cmake .. -DLLVM_LIBRARY_DIR=/usr/lib64/llvm13/lib/ -DCMAKE_INSTALL_PREFIX=/usr/lib64/llvm13/ -DCMAKE_BUILD_TYPE=Release &&
make -j$3 install &&
@@ -132,11 +116,11 @@ cd ../../ &&
echo "-- Build WasmEdge --" &&
( wget -nc -q https://github.com/WasmEdge/WasmEdge/archive/refs/tags/0.11.2.zip; unzip -o 0.11.2.zip; ) &&
cd WasmEdge-0.11.2 &&
( mkdir -p build; echo "" ) &&
( mkdir build; echo "" ) &&
cd build &&
export BOOST_ROOT="/usr/local/src/boost_1_86_0" &&
export BOOST_ROOT="/usr/local/src/boost_1_75_0" &&
export Boost_LIBRARY_DIRS="/usr/local/lib" &&
export BOOST_INCLUDEDIR="/usr/local/src/boost_1_86_0" &&
export BOOST_INCLUDEDIR="/usr/local/src/boost_1_75_0" &&
export PATH=`echo $PATH | sed -E "s/devtoolset-7/devtoolset-9/g"` &&
cmake .. \
-DCMAKE_BUILD_TYPE=Release \


@@ -1056,18 +1056,7 @@
# Cassandra is an alternative backend to be used only with Reporting Mode.
# See the Reporting Mode section for more details about Reporting Mode.
#
# type = RWDB
#
# RWDB is a high-performance memory store written by XRPL-Labs and optimized
# for xahaud. RWDB is NOT persistent and the data will be lost on restart.
# RWDB is recommended for Validator and Peer nodes that are not required to
# store history.
#
# RWDB maintains its high speed regardless of the amount of history
# stored. Online delete should NOT be used; instead, RWDB will use the
# ledger_history config value to determine how many ledgers to keep in memory.
#
# Required keys for NuDB, RWDB and RocksDB:
# Required keys for NuDB and RocksDB:
#
# path Location to store the database
#
@@ -1123,8 +1112,7 @@
# online_delete Minimum value of 256. Enable automatic purging
# of older ledger information. Maintain at least this
# number of ledger records online. Must be greater
# than or equal to ledger_history. If using RWDB
# this value is ignored.
# than or equal to ledger_history.
#
# These keys modify the behavior of online_delete, and thus are only
# relevant if online_delete is defined and non-zero:
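A minimal `[node_db]` stanza putting the RWDB notes above into practice might look like the following (path and history size are placeholders, not recommendations):

[node_db]
type=RWDB
path=/var/lib/xahaud/db
ledger_history=256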


@@ -144,12 +144,4 @@ D686F2538F410C9D0D856788E98E3579595DAF7B38D38887F81ECAC934B06040 HooksUpdate1
86E83A7D2ECE3AD5FA87AB2195AE015C950469ABF0B72EAACED318F74886AE90 CryptoConditionsSuite
3C43D9A973AA4443EF3FC38E42DD306160FBFFDAB901CD8BAA15D09F2597EB87 NonFungibleTokensV1
0285B7E5E08E1A8E4C15636F0591D87F73CB6A7B6452A932AD72BBC8E5D1CBE3 fixNFTokenDirV1
36799EA497B1369B170805C078AEFE6188345F9B3E324C21E9CA3FF574E3C3D6 fixNFTokenNegOffer
4C499D17719BB365B69010A436B64FD1A82AAB199FC1CEB06962EBD01059FB09 fixXahauV1
215181D23BF5C173314B5FDB9C872C92DE6CC918483727DE037C0C13E7E6EE9D fixXahauV2
0D8BF22FF7570D58598D1EF19EBB6E142AD46E59A223FD3816262FBB69345BEA Remit
7CA0426E7F411D39BB014E57CD9E08F61DE1750F0D41FCD428D9FB80BB7596B0 ZeroB2M
4B8466415FAB32FFA89D9DCBE166A42340115771DF611A7160F8D7439C87ECD8 fixNSDelete
EDB4EE4C524E16BDD91D9A529332DED08DCAAA51CC6DC897ACFA1A0ED131C5B6 fix240819
8063140E9260799D6716756B891CEC3E7006C4E4F277AB84670663A88F94B9C4 fixPageCap
88693F108C3CD8A967F3F4253A32DEF5E35F9406ACD2A11B88B11D90865763A9 fix240911
36799EA497B1369B170805C078AEFE6188345F9B3E324C21E9CA3FF574E3C3D6 fixNFTokenNegOffer


@@ -1,4 +1,4 @@
from conan import ConanFile
from conans import ConanFile
from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout
import re
@@ -24,7 +24,7 @@ class Xrpl(ConanFile):
}
requires = [
'boost/1.86.0',
'boost/1.82.0',
'date/3.0.1',
'libarchive/3.6.0',
'lz4/1.9.3',
@@ -109,9 +109,7 @@ class Xrpl(ConanFile):
if self.options.rocksdb:
self.requires('rocksdb/6.27.3')
exports_sources = (
'CMakeLists.txt', 'Builds/*', 'bin/getRippledInfo', 'src/*', 'cfg/*'
)
exports_sources = 'CMakeLists.txt', 'Builds/CMake/*', 'src/*', 'cfg/*'
def layout(self):
cmake_layout(self)
@@ -145,11 +143,8 @@ class Xrpl(ConanFile):
cmake.install()
def package_info(self):
libxrpl = self.cpp_info.components['libxrpl']
libxrpl.libs = [
self.cpp_info.libs = [
'libxrpl_core.a',
'libed25519.a',
'libed25519-donna.a',
'libsecp256k1.a',
]
libxrpl.includedirs = ['include']
libxrpl.requires = ['boost::boost']

docker-unit-tests.sh Executable file → Normal file

@@ -1,11 +1,4 @@
#!/bin/bash -x
#!/bin/bash
BUILD_CORES=$(echo "scale=0 ; `nproc` / 1.337" | bc)
docker run --rm -i -v $(pwd):/io ubuntu sh -c '/io/release-build/xahaud -u'
if [[ "$GITHUB_REPOSITORY" == "" ]]; then
#Default
BUILD_CORES=8
fi
echo "Mounting $(pwd)/io in ubuntu and running unit tests"
docker run --rm -i -v $(pwd):/io -e BUILD_CORES=$BUILD_CORES ubuntu sh -c '/io/release-build/xahaud --unittest-jobs $BUILD_CORES -u'


@@ -1,84 +0,0 @@
Our [build instructions][BUILD.md] assume you have a C++ development
environment complete with Git, Python, Conan, CMake, and a C++ compiler.
This document exists to help readers set one up on any of the Big Three
platforms: Linux, macOS, or Windows.
[BUILD.md]: ../../BUILD.md
## Linux
Package ecosystems vary across Linux distributions,
so there is no one set of instructions that will work for every Linux user.
These instructions are written for Ubuntu 22.04.
They are largely copied from the [script][1] used to configure our Docker
container for continuous integration.
That script handles many more responsibilities.
These instructions are just the bare minimum to build one configuration of
rippled.
You can check that codebase for other Linux distributions and versions.
If you cannot find yours there,
then we hope that these instructions can at least guide you in the right
direction.
```
apt update
apt install --yes curl git libssl-dev python3.10-dev python3-pip make g++-11
curl --location --remote-name \
"https://github.com/Kitware/CMake/releases/download/v3.25.1/cmake-3.25.1.tar.gz"
tar -xzf cmake-3.25.1.tar.gz
rm cmake-3.25.1.tar.gz
cd cmake-3.25.1
./bootstrap --parallel=$(nproc)
make --jobs $(nproc)
make install
cd ..
pip3 install 'conan<2'
```
[1]: https://github.com/thejohnfreeman/rippled-docker/blob/master/ubuntu-22.04/install.sh
## macOS
Open a Terminal and enter the below command to bring up a dialog to install
the command line developer tools.
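```
xcode-select --install
```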
Once it is finished, this command should return a version greater than the
minimum required (see [BUILD.md][]).
```
clang --version
```
The command line developer tools should include Git too:
```
git --version
```
Install [Homebrew][],
use it to install [pyenv][],
use it to install Python,
and use it to install Conan:
[Homebrew]: https://brew.sh/
[pyenv]: https://github.com/pyenv/pyenv
```
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
brew update
brew install xz
brew install pyenv
pyenv install 3.10-dev
pyenv global 3.10-dev
eval "$(pyenv init -)"
pip install 'conan<2'
```
Install CMake with Homebrew too:
```
brew install cmake
```

docs/build/install.md vendored

@@ -1,159 +0,0 @@
This document contains instructions for installing rippled.
The APT package manager is common on Debian-based Linux distributions like
Ubuntu,
while the YUM package manager is common on Red Hat-based Linux distributions
like CentOS.
Installing from source is an option for all platforms,
and the only supported option for installing custom builds.
## From source
From a source build, you can install rippled and libxrpl using CMake's
`--install` mode:
```
cmake --install . --prefix /opt/local
```
The default [prefix][1] is typically `/usr/local` on Linux and macOS and
`C:/Program Files/rippled` on Windows.
[1]: https://cmake.org/cmake/help/latest/variable/CMAKE_INSTALL_PREFIX.html
## With the APT package manager
1. Update repositories:
sudo apt update -y
2. Install utilities:
sudo apt install -y apt-transport-https ca-certificates wget gnupg
3. Add Ripple's package-signing GPG key to your list of trusted keys:
sudo mkdir /usr/local/share/keyrings/
wget -q -O - "https://repos.ripple.com/repos/api/gpg/key/public" | gpg --dearmor > ripple-key.gpg
sudo mv ripple-key.gpg /usr/local/share/keyrings
4. Check the fingerprint of the newly-added key:
gpg /usr/local/share/keyrings/ripple-key.gpg
The output should include an entry for Ripple such as the following:
gpg: WARNING: no command supplied. Trying to guess what you mean ...
pub rsa3072 2019-02-14 [SC] [expires: 2026-02-17]
C0010EC205B35A3310DC90DE395F97FFCCAFD9A2
uid TechOps Team at Ripple <techops+rippled@ripple.com>
sub rsa3072 2019-02-14 [E] [expires: 2026-02-17]
In particular, make sure that the fingerprint matches. (In the above example, the fingerprint is on the third line, starting with `C001`.)
5. Add the appropriate Ripple repository for your operating system version:
echo "deb [signed-by=/usr/local/share/keyrings/ripple-key.gpg] https://repos.ripple.com/repos/rippled-deb focal stable" | \
sudo tee -a /etc/apt/sources.list.d/ripple.list
The above example is appropriate for **Ubuntu 20.04 Focal Fossa**. For other operating systems, replace the word `focal` with one of the following:
- `jammy` for **Ubuntu 22.04 Jammy Jellyfish**
- `bionic` for **Ubuntu 18.04 Bionic Beaver**
- `bullseye` for **Debian 11 Bullseye**
- `buster` for **Debian 10 Buster**
If you want access to development or pre-release versions of `rippled`, use one of the following instead of `stable`:
- `unstable` - Pre-release builds ([`release` branch](https://github.com/ripple/rippled/tree/release))
- `nightly` - Experimental/development builds ([`develop` branch](https://github.com/ripple/rippled/tree/develop))
**Warning:** Unstable and nightly builds may be broken at any time. Do not use these builds for production servers.
6. Fetch the Ripple repository.
sudo apt -y update
7. Install the `rippled` software package:
sudo apt -y install rippled
8. Check the status of the `rippled` service:
systemctl status rippled.service
The `rippled` service should start automatically. If not, you can start it manually:
sudo systemctl start rippled.service
9. Optional: allow `rippled` to bind to privileged ports.
This allows you to serve incoming API requests on port 80 or 443. (If you want to do so, you must also update the config file's port settings.)
sudo setcap 'cap_net_bind_service=+ep' /opt/ripple/bin/rippled
## With the YUM package manager
1. Install the Ripple RPM repository:
Choose the appropriate RPM repository for the stability of releases you want:
- `stable` for the latest production release (`master` branch)
- `unstable` for pre-release builds (`release` branch)
- `nightly` for experimental/development builds (`develop` branch)
*Stable*
cat << REPOFILE | sudo tee /etc/yum.repos.d/ripple.repo
[ripple-stable]
name=XRP Ledger Packages
enabled=1
gpgcheck=0
repo_gpgcheck=1
baseurl=https://repos.ripple.com/repos/rippled-rpm/stable/
gpgkey=https://repos.ripple.com/repos/rippled-rpm/stable/repodata/repomd.xml.key
REPOFILE
*Unstable*
cat << REPOFILE | sudo tee /etc/yum.repos.d/ripple.repo
[ripple-unstable]
name=XRP Ledger Packages
enabled=1
gpgcheck=0
repo_gpgcheck=1
baseurl=https://repos.ripple.com/repos/rippled-rpm/unstable/
gpgkey=https://repos.ripple.com/repos/rippled-rpm/unstable/repodata/repomd.xml.key
REPOFILE
*Nightly*
cat << REPOFILE | sudo tee /etc/yum.repos.d/ripple.repo
[ripple-nightly]
name=XRP Ledger Packages
enabled=1
gpgcheck=0
repo_gpgcheck=1
baseurl=https://repos.ripple.com/repos/rippled-rpm/nightly/
gpgkey=https://repos.ripple.com/repos/rippled-rpm/nightly/repodata/repomd.xml.key
REPOFILE
2. Fetch the latest repo updates:
sudo yum -y update
3. Install the new `rippled` package:
sudo yum install -y rippled
4. Configure the `rippled` service to start on boot:
sudo systemctl enable rippled.service
5. Start the `rippled` service:
sudo systemctl start rippled.service


@@ -1,40 +0,0 @@
sources:
"1.1.10":
url: "https://github.com/google/snappy/archive/1.1.10.tar.gz"
sha256: "49d831bffcc5f3d01482340fe5af59852ca2fe76c3e05df0e67203ebbe0f1d90"
"1.1.9":
url: "https://github.com/google/snappy/archive/1.1.9.tar.gz"
sha256: "75c1fbb3d618dd3a0483bff0e26d0a92b495bbe5059c8b4f1c962b478b6e06e7"
"1.1.8":
url: "https://github.com/google/snappy/archive/1.1.8.tar.gz"
sha256: "16b677f07832a612b0836178db7f374e414f94657c138e6993cbfc5dcc58651f"
"1.1.7":
url: "https://github.com/google/snappy/archive/1.1.7.tar.gz"
sha256: "3dfa02e873ff51a11ee02b9ca391807f0c8ea0529a4924afa645fbf97163f9d4"
patches:
"1.1.10":
- patch_file: "patches/1.1.10-0001-fix-inlining-failure.patch"
patch_description: "disable inlining for compilation error"
patch_type: "portability"
- patch_file: "patches/1.1.9-0002-no-Werror.patch"
patch_description: "disable 'warning as error' options"
patch_type: "portability"
- patch_file: "patches/1.1.10-0003-fix-clobber-list-older-llvm.patch"
patch_description: "disable inline asm on apple-clang"
patch_type: "portability"
- patch_file: "patches/1.1.9-0004-rtti-by-default.patch"
patch_description: "remove 'disable rtti'"
patch_type: "conan"
"1.1.9":
- patch_file: "patches/1.1.9-0001-fix-inlining-failure.patch"
patch_description: "disable inlining for compilation error"
patch_type: "portability"
- patch_file: "patches/1.1.9-0002-no-Werror.patch"
patch_description: "disable 'warning as error' options"
patch_type: "portability"
- patch_file: "patches/1.1.9-0003-fix-clobber-list-older-llvm.patch"
patch_description: "disable inline asm on apple-clang"
patch_type: "portability"
- patch_file: "patches/1.1.9-0004-rtti-by-default.patch"
patch_description: "remove 'disable rtti'"
patch_type: "conan"


@@ -1,89 +0,0 @@
from conan import ConanFile
from conan.tools.build import check_min_cppstd
from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout
from conan.tools.files import apply_conandata_patches, copy, export_conandata_patches, get, rmdir
from conan.tools.scm import Version
import os
required_conan_version = ">=1.54.0"
class SnappyConan(ConanFile):
name = "snappy"
description = "A fast compressor/decompressor"
topics = ("google", "compressor", "decompressor")
url = "https://github.com/conan-io/conan-center-index"
homepage = "https://github.com/google/snappy"
license = "BSD-3-Clause"
package_type = "library"
settings = "os", "arch", "compiler", "build_type"
options = {
"shared": [True, False],
"fPIC": [True, False],
}
default_options = {
"shared": False,
"fPIC": True,
}
def export_sources(self):
export_conandata_patches(self)
def config_options(self):
if self.settings.os == 'Windows':
del self.options.fPIC
def configure(self):
if self.options.shared:
self.options.rm_safe("fPIC")
def layout(self):
cmake_layout(self, src_folder="src")
def validate(self):
if self.settings.compiler.get_safe("cppstd"):
check_min_cppstd(self, 11)
def source(self):
get(self, **self.conan_data["sources"][self.version], strip_root=True)
def generate(self):
tc = CMakeToolchain(self)
tc.variables["SNAPPY_BUILD_TESTS"] = False
if Version(self.version) >= "1.1.8":
tc.variables["SNAPPY_FUZZING_BUILD"] = False
tc.variables["SNAPPY_REQUIRE_AVX"] = False
tc.variables["SNAPPY_REQUIRE_AVX2"] = False
tc.variables["SNAPPY_INSTALL"] = True
if Version(self.version) >= "1.1.9":
tc.variables["SNAPPY_BUILD_BENCHMARKS"] = False
tc.generate()
def build(self):
apply_conandata_patches(self)
cmake = CMake(self)
cmake.configure()
cmake.build()
def package(self):
copy(self, "COPYING", src=self.source_folder, dst=os.path.join(self.package_folder, "licenses"))
cmake = CMake(self)
cmake.install()
rmdir(self, os.path.join(self.package_folder, "lib", "cmake"))
def package_info(self):
self.cpp_info.set_property("cmake_file_name", "Snappy")
self.cpp_info.set_property("cmake_target_name", "Snappy::snappy")
# TODO: back to global scope in conan v2 once cmake_find_package* generators removed
self.cpp_info.components["snappylib"].libs = ["snappy"]
if not self.options.shared:
if self.settings.os in ["Linux", "FreeBSD"]:
self.cpp_info.components["snappylib"].system_libs.append("m")
# TODO: to remove in conan v2 once cmake_find_package* generators removed
self.cpp_info.names["cmake_find_package"] = "Snappy"
self.cpp_info.names["cmake_find_package_multi"] = "Snappy"
self.cpp_info.components["snappylib"].names["cmake_find_package"] = "snappy"
self.cpp_info.components["snappylib"].names["cmake_find_package_multi"] = "snappy"
self.cpp_info.components["snappylib"].set_property("cmake_target_name", "Snappy::snappy")


@@ -1,13 +0,0 @@
diff --git a/snappy-stubs-internal.h b/snappy-stubs-internal.h
index 1548ed7..3b4a9f3 100644
--- a/snappy-stubs-internal.h
+++ b/snappy-stubs-internal.h
@@ -100,7 +100,7 @@
// Inlining hints.
#if HAVE_ATTRIBUTE_ALWAYS_INLINE
-#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE __attribute__((always_inline))
+#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE
#else
#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE
#endif // HAVE_ATTRIBUTE_ALWAYS_INLINE


@@ -1,13 +0,0 @@
diff --git a/snappy.cc b/snappy.cc
index d414718..e4efb59 100644
--- a/snappy.cc
+++ b/snappy.cc
@@ -1132,7 +1132,7 @@ inline size_t AdvanceToNextTagX86Optimized(const uint8_t** ip_p, size_t* tag) {
size_t literal_len = *tag >> 2;
size_t tag_type = *tag;
bool is_literal;
-#if defined(__GCC_ASM_FLAG_OUTPUTS__) && defined(__x86_64__)
+#if defined(__GCC_ASM_FLAG_OUTPUTS__) && defined(__x86_64__) && ( (!defined(__clang__) && !defined(__APPLE__)) || (!defined(__APPLE__) && defined(__clang__) && (__clang_major__ >= 9)) || (defined(__APPLE__) && defined(__clang__) && (__clang_major__ > 11)) )
// TODO clang misses the fact that the (c & 3) already correctly
// sets the zero flag.
asm("and $3, %k[tag_type]\n\t"


@@ -1,14 +0,0 @@
Fixes the following error:
error: inlining failed in call to always_inline size_t snappy::AdvanceToNextTag(const uint8_t**, size_t*): function body can be overwritten at link time
--- snappy-stubs-internal.h
+++ snappy-stubs-internal.h
@@ -100,7 +100,7 @@
// Inlining hints.
#ifdef HAVE_ATTRIBUTE_ALWAYS_INLINE
-#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE __attribute__((always_inline))
+#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE
#else
#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE
#endif


@@ -1,12 +0,0 @@
--- CMakeLists.txt
+++ CMakeLists.txt
@@ -69,7 +69,7 @@
- # Use -Werror for clang only.
+if(0)
if(CMAKE_CXX_COMPILER_ID MATCHES "Clang")
if(NOT CMAKE_CXX_FLAGS MATCHES "-Werror")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Werror")
endif(NOT CMAKE_CXX_FLAGS MATCHES "-Werror")
endif(CMAKE_CXX_COMPILER_ID MATCHES "Clang")
-
+endif()


@@ -1,12 +0,0 @@
asm clobbers do not work for clang < 9 and apple-clang < 11 (found by SpaceIm)
--- snappy.cc
+++ snappy.cc
@@ -1026,7 +1026,7 @@
size_t literal_len = *tag >> 2;
size_t tag_type = *tag;
bool is_literal;
-#if defined(__GNUC__) && defined(__x86_64__)
+#if defined(__GNUC__) && defined(__x86_64__) && ( (!defined(__clang__) && !defined(__APPLE__)) || (!defined(__APPLE__) && defined(__clang__) && (__clang_major__ >= 9)) || (defined(__APPLE__) && defined(__clang__) && (__clang_major__ > 11)) )
// TODO clang misses the fact that the (c & 3) already correctly
// sets the zero flag.
asm("and $3, %k[tag_type]\n\t"


@@ -1,20 +0,0 @@
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -53,8 +53,6 @@ if(CMAKE_CXX_COMPILER_ID STREQUAL "MSVC")
add_definitions(-D_HAS_EXCEPTIONS=0)
# Disable RTTI.
- string(REGEX REPLACE "/GR" "" CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}")
- set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /GR-")
else(CMAKE_CXX_COMPILER_ID STREQUAL "MSVC")
# Use -Wall for clang and gcc.
if(NOT CMAKE_CXX_FLAGS MATCHES "-Wall")
@@ -78,8 +76,6 @@ endif()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fno-exceptions")
# Disable RTTI.
- string(REGEX REPLACE "-frtti" "" CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}")
- set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fno-rtti")
endif(CMAKE_CXX_COMPILER_ID STREQUAL "MSVC")
# BUILD_SHARED_LIBS is a standard CMake variable, but we declare it here to make


@@ -1,12 +0,0 @@
sources:
"4.0.3":
url: "https://github.com/SOCI/soci/archive/v4.0.3.tar.gz"
sha256: "4b1ff9c8545c5d802fbe06ee6cd2886630e5c03bf740e269bb625b45cf934928"
patches:
"4.0.3":
- patch_file: "patches/0001-Remove-hardcoded-INSTALL_NAME_DIR-for-relocatable-li.patch"
patch_description: "Generate relocatable libraries on MacOS"
patch_type: "portability"
- patch_file: "patches/0002-Fix-soci_backend.patch"
patch_description: "Fix variable names for dependencies"
patch_type: "conan"


@@ -1,212 +0,0 @@
from conan import ConanFile
from conan.tools.build import check_min_cppstd
from conan.tools.cmake import CMake, CMakeDeps, CMakeToolchain, cmake_layout
from conan.tools.files import apply_conandata_patches, copy, export_conandata_patches, get, rmdir
from conan.tools.microsoft import is_msvc
from conan.tools.scm import Version
from conan.errors import ConanInvalidConfiguration
import os
required_conan_version = ">=1.55.0"
class SociConan(ConanFile):
name = "soci"
homepage = "https://github.com/SOCI/soci"
url = "https://github.com/conan-io/conan-center-index"
description = "The C++ Database Access Library "
topics = ("mysql", "odbc", "postgresql", "sqlite3")
license = "BSL-1.0"
settings = "os", "arch", "compiler", "build_type"
options = {
"shared": [True, False],
"fPIC": [True, False],
"empty": [True, False],
"with_sqlite3": [True, False],
"with_db2": [True, False],
"with_odbc": [True, False],
"with_oracle": [True, False],
"with_firebird": [True, False],
"with_mysql": [True, False],
"with_postgresql": [True, False],
"with_boost": [True, False],
}
default_options = {
"shared": False,
"fPIC": True,
"empty": False,
"with_sqlite3": False,
"with_db2": False,
"with_odbc": False,
"with_oracle": False,
"with_firebird": False,
"with_mysql": False,
"with_postgresql": False,
"with_boost": False,
}
def export_sources(self):
export_conandata_patches(self)
def layout(self):
cmake_layout(self, src_folder="src")
def config_options(self):
if self.settings.os == "Windows":
self.options.rm_safe("fPIC")
def configure(self):
if self.options.shared:
self.options.rm_safe("fPIC")
def requirements(self):
if self.options.with_sqlite3:
self.requires("sqlite3/3.41.1")
if self.options.with_odbc and self.settings.os != "Windows":
self.requires("odbc/2.3.11")
if self.options.with_mysql:
self.requires("libmysqlclient/8.0.31")
if self.options.with_postgresql:
self.requires("libpq/14.7")
if self.options.with_boost:
self.requires("boost/1.81.0")
@property
def _minimum_compilers_version(self):
return {
"Visual Studio": "14",
"gcc": "4.8",
"clang": "3.8",
"apple-clang": "8.0"
}
def validate(self):
if self.settings.compiler.get_safe("cppstd"):
check_min_cppstd(self, 11)
compiler = str(self.settings.compiler)
compiler_version = Version(self.settings.compiler.version.value)
if compiler not in self._minimum_compilers_version:
self.output.warning("{} recipe lacks information about the {} compiler support.".format(self.name, self.settings.compiler))
elif compiler_version < self._minimum_compilers_version[compiler]:
raise ConanInvalidConfiguration("{} requires a {} version >= {}".format(self.name, compiler, compiler_version))
prefix = "Dependencies for"
message = "not configured in this conan package."
if self.options.with_db2:
# self.requires("db2/0.0.0") # TODO add support for db2
raise ConanInvalidConfiguration("{} DB2 {} ".format(prefix, message))
if self.options.with_oracle:
# self.requires("oracle_db/0.0.0") # TODO add support for oracle
raise ConanInvalidConfiguration("{} ORACLE {} ".format(prefix, message))
if self.options.with_firebird:
# self.requires("firebird/0.0.0") # TODO add support for firebird
raise ConanInvalidConfiguration("{} firebird {} ".format(prefix, message))
def source(self):
get(self, **self.conan_data["sources"][self.version], strip_root=True)
def generate(self):
tc = CMakeToolchain(self)
tc.variables["SOCI_SHARED"] = self.options.shared
tc.variables["SOCI_STATIC"] = not self.options.shared
tc.variables["SOCI_TESTS"] = False
tc.variables["SOCI_CXX11"] = True
tc.variables["SOCI_EMPTY"] = self.options.empty
tc.variables["WITH_SQLITE3"] = self.options.with_sqlite3
tc.variables["WITH_DB2"] = self.options.with_db2
tc.variables["WITH_ODBC"] = self.options.with_odbc
tc.variables["WITH_ORACLE"] = self.options.with_oracle
tc.variables["WITH_FIREBIRD"] = self.options.with_firebird
tc.variables["WITH_MYSQL"] = self.options.with_mysql
tc.variables["WITH_POSTGRESQL"] = self.options.with_postgresql
tc.variables["WITH_BOOST"] = self.options.with_boost
tc.generate()
deps = CMakeDeps(self)
deps.generate()
def build(self):
apply_conandata_patches(self)
cmake = CMake(self)
cmake.configure()
cmake.build()
def package(self):
copy(self, "LICENSE_1_0.txt", dst=os.path.join(self.package_folder, "licenses"), src=self.source_folder)
cmake = CMake(self)
cmake.install()
rmdir(self, os.path.join(self.package_folder, "lib", "cmake"))
def package_info(self):
self.cpp_info.set_property("cmake_file_name", "SOCI")
target_suffix = "" if self.options.shared else "_static"
lib_prefix = "lib" if is_msvc(self) and not self.options.shared else ""
version = Version(self.version)
lib_suffix = "_{}_{}".format(version.major, version.minor) if self.settings.os == "Windows" else ""
# soci_core
self.cpp_info.components["soci_core"].set_property("cmake_target_name", "SOCI::soci_core{}".format(target_suffix))
self.cpp_info.components["soci_core"].libs = ["{}soci_core{}".format(lib_prefix, lib_suffix)]
if self.options.with_boost:
self.cpp_info.components["soci_core"].requires.append("boost::boost")
# soci_empty
if self.options.empty:
self.cpp_info.components["soci_empty"].set_property("cmake_target_name", "SOCI::soci_empty{}".format(target_suffix))
self.cpp_info.components["soci_empty"].libs = ["{}soci_empty{}".format(lib_prefix, lib_suffix)]
self.cpp_info.components["soci_empty"].requires = ["soci_core"]
# soci_sqlite3
if self.options.with_sqlite3:
self.cpp_info.components["soci_sqlite3"].set_property("cmake_target_name", "SOCI::soci_sqlite3{}".format(target_suffix))
self.cpp_info.components["soci_sqlite3"].libs = ["{}soci_sqlite3{}".format(lib_prefix, lib_suffix)]
self.cpp_info.components["soci_sqlite3"].requires = ["soci_core", "sqlite3::sqlite3"]
# soci_odbc
if self.options.with_odbc:
self.cpp_info.components["soci_odbc"].set_property("cmake_target_name", "SOCI::soci_odbc{}".format(target_suffix))
self.cpp_info.components["soci_odbc"].libs = ["{}soci_odbc{}".format(lib_prefix, lib_suffix)]
self.cpp_info.components["soci_odbc"].requires = ["soci_core"]
if self.settings.os == "Windows":
self.cpp_info.components["soci_odbc"].system_libs.append("odbc32")
else:
self.cpp_info.components["soci_odbc"].requires.append("odbc::odbc")
# soci_mysql
if self.options.with_mysql:
self.cpp_info.components["soci_mysql"].set_property("cmake_target_name", "SOCI::soci_mysql{}".format(target_suffix))
self.cpp_info.components["soci_mysql"].libs = ["{}soci_mysql{}".format(lib_prefix, lib_suffix)]
self.cpp_info.components["soci_mysql"].requires = ["soci_core", "libmysqlclient::libmysqlclient"]
# soci_postgresql
if self.options.with_postgresql:
self.cpp_info.components["soci_postgresql"].set_property("cmake_target_name", "SOCI::soci_postgresql{}".format(target_suffix))
self.cpp_info.components["soci_postgresql"].libs = ["{}soci_postgresql{}".format(lib_prefix, lib_suffix)]
self.cpp_info.components["soci_postgresql"].requires = ["soci_core", "libpq::libpq"]
# TODO: to remove in conan v2 once cmake_find_package* generators removed
self.cpp_info.names["cmake_find_package"] = "SOCI"
self.cpp_info.names["cmake_find_package_multi"] = "SOCI"
self.cpp_info.components["soci_core"].names["cmake_find_package"] = "soci_core{}".format(target_suffix)
self.cpp_info.components["soci_core"].names["cmake_find_package_multi"] = "soci_core{}".format(target_suffix)
if self.options.empty:
self.cpp_info.components["soci_empty"].names["cmake_find_package"] = "soci_empty{}".format(target_suffix)
self.cpp_info.components["soci_empty"].names["cmake_find_package_multi"] = "soci_empty{}".format(target_suffix)
if self.options.with_sqlite3:
self.cpp_info.components["soci_sqlite3"].names["cmake_find_package"] = "soci_sqlite3{}".format(target_suffix)
self.cpp_info.components["soci_sqlite3"].names["cmake_find_package_multi"] = "soci_sqlite3{}".format(target_suffix)
if self.options.with_odbc:
self.cpp_info.components["soci_odbc"].names["cmake_find_package"] = "soci_odbc{}".format(target_suffix)
self.cpp_info.components["soci_odbc"].names["cmake_find_package_multi"] = "soci_odbc{}".format(target_suffix)
if self.options.with_mysql:
self.cpp_info.components["soci_mysql"].names["cmake_find_package"] = "soci_mysql{}".format(target_suffix)
self.cpp_info.components["soci_mysql"].names["cmake_find_package_multi"] = "soci_mysql{}".format(target_suffix)
if self.options.with_postgresql:
self.cpp_info.components["soci_postgresql"].names["cmake_find_package"] = "soci_postgresql{}".format(target_suffix)
self.cpp_info.components["soci_postgresql"].names["cmake_find_package_multi"] = "soci_postgresql{}".format(target_suffix)


@@ -1,39 +0,0 @@
From d491bf7b5040d314ffd0c6310ba01f78ff44c85e Mon Sep 17 00:00:00 2001
From: Rasmus Thomsen <rasmus.thomsen@dampsoft.de>
Date: Fri, 14 Apr 2023 09:16:29 +0200
Subject: [PATCH] Remove hardcoded INSTALL_NAME_DIR for relocatable libraries
on MacOS
---
cmake/SociBackend.cmake | 2 +-
src/core/CMakeLists.txt | 1 -
2 files changed, 1 insertion(+), 2 deletions(-)
diff --git a/cmake/SociBackend.cmake b/cmake/SociBackend.cmake
index 5d4ef0df..39fe1f77 100644
--- a/cmake/SociBackend.cmake
+++ b/cmake/SociBackend.cmake
@@ -171,7 +171,7 @@ macro(soci_backend NAME)
set_target_properties(${THIS_BACKEND_TARGET}
PROPERTIES
SOVERSION ${${PROJECT_NAME}_SOVERSION}
- INSTALL_NAME_DIR ${CMAKE_INSTALL_PREFIX}/lib)
+ )
if(APPLE)
set_target_properties(${THIS_BACKEND_TARGET}
diff --git a/src/core/CMakeLists.txt b/src/core/CMakeLists.txt
index 3e7deeae..f9eae564 100644
--- a/src/core/CMakeLists.txt
+++ b/src/core/CMakeLists.txt
@@ -59,7 +59,6 @@ if (SOCI_SHARED)
PROPERTIES
VERSION ${SOCI_VERSION}
SOVERSION ${SOCI_SOVERSION}
- INSTALL_NAME_DIR ${CMAKE_INSTALL_PREFIX}/lib
CLEAN_DIRECT_OUTPUT 1)
endif()
--
2.25.1


@@ -1,24 +0,0 @@
diff --git a/cmake/SociBackend.cmake b/cmake/SociBackend.cmake
index 0a664667..3fa2ed95 100644
--- a/cmake/SociBackend.cmake
+++ b/cmake/SociBackend.cmake
@@ -31,14 +31,13 @@ macro(soci_backend_deps_found NAME DEPS SUCCESS)
if(NOT DEPEND_FOUND)
list(APPEND DEPS_NOT_FOUND ${dep})
else()
- string(TOUPPER "${dep}" DEPU)
- if( ${DEPU}_INCLUDE_DIR )
- list(APPEND DEPS_INCLUDE_DIRS ${${DEPU}_INCLUDE_DIR})
+ if( ${dep}_INCLUDE_DIR )
+ list(APPEND DEPS_INCLUDE_DIRS ${${dep}_INCLUDE_DIR})
endif()
- if( ${DEPU}_INCLUDE_DIRS )
- list(APPEND DEPS_INCLUDE_DIRS ${${DEPU}_INCLUDE_DIRS})
+ if( ${dep}_INCLUDE_DIRS )
+ list(APPEND DEPS_INCLUDE_DIRS ${${dep}_INCLUDE_DIRS})
endif()
- list(APPEND DEPS_LIBRARIES ${${DEPU}_LIBRARIES})
+ list(APPEND DEPS_LIBRARIES ${${dep}_LIBRARIES})
endif()
endforeach()


@@ -1,29 +0,0 @@
#!/bin/bash
RIPPLED_ROOT="../src/ripple"
echo '// For documentation please see: https://xrpl-hooks.readme.io/reference/'
echo '// Generated using generate_sfcodes.sh'
cat $RIPPLED_ROOT/protocol/impl/SField.cpp | grep -E '^CONSTRUCT_' |
sed 's/UINT16,/1,/g' |
sed 's/UINT32,/2,/g' |
sed 's/UINT64,/3,/g' |
sed 's/HASH128,/4,/g' |
sed 's/HASH256,/5,/g' |
sed 's/UINT128,/4,/g' |
sed 's/UINT256,/5,/g' |
sed 's/AMOUNT,/6,/g' |
sed 's/VL,/7,/g' |
sed 's/ACCOUNT,/8,/g' |
sed 's/OBJECT,/14,/g' |
sed 's/ARRAY,/15,/g' |
sed 's/UINT8,/16,/g' |
sed 's/HASH160,/17,/g' |
sed 's/UINT160,/17,/g' |
sed 's/PATHSET,/18,/g' |
sed 's/VECTOR256,/19,/g' |
sed 's/UINT96,/20,/g' |
sed 's/UINT192,/21,/g' |
sed 's/UINT384,/22,/g' |
sed 's/UINT512,/23,/g' |
grep -Eo '"([^"]+)", *([0-9]+), *([0-9]+)' |
sed 's/"//g' | sed 's/ *//g' | sed 's/,/ /g' |
awk '{print ("#define sf"$1" (("$2"U << 16U) + "$3"U)")}'
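Each emitted line packs the field's type code into the high 16 bits and its field index into the low 16 bits, e.g. `#define sfLedgerHash ((5U << 16U) + 1U)`. A self-contained sketch of that layout (the SF_* helper names are hypothetical, not part of the script's output):
```cpp
// Illustrative only: mirrors the ((type << 16U) + field) layout emitted by
// generate_sfcodes.sh. SF_* helper names are hypothetical.
#include <cstdint>
#include <cstdio>

#define SF_CODE(type, field) (((std::uint32_t)(type) << 16U) + (field))
#define SF_TYPE(code) ((code) >> 16U)
#define SF_FIELD(code) ((code) & 0xFFFFU)

int
main()
{
    constexpr std::uint32_t sfLedgerHash = SF_CODE(5, 1);  // HASH256 = type 5
    std::printf(
        "type=%u field=%u\n", SF_TYPE(sfLedgerHash), SF_FIELD(sfLedgerHash));
    return 0;
}
```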


@@ -637,55 +637,43 @@ int64_t hook(uint32_t r)
{
previous_member[0] = 'V';
for (int tbl = 1; GUARD(2), tbl <= 2; ++tbl)
for (int i = 1; GUARD(32), i < 32; ++i)
{
for (int i = 0; GUARD(66), i < 32; ++i)
previous_member[1] = i < 2 ? 'R' : i < 12 ? 'H' : 'S';
previous_member[2] =
i == 0 ? 'R' :
i == 1 ? 'D' :
i < 12 ? i - 2 :
i - 12;
uint8_t vote_key[32];
if (state(SBUF(vote_key), SBUF(previous_member)) == 32)
{
previous_member[1] = i < 2 ? 'R' : i < 12 ? 'H' : 'S';
previous_member[2] =
i == 0 ? 'R' :
i == 1 ? 'D' :
i < 12 ? i - 2 :
i - 12;
previous_member[3] = tbl;
uint8_t vote_count = 0;
uint8_t vote_key[32] = {};
uint8_t ts =
previous_member[1] == 'H' ? 32 : // hook topics are a 32 byte hook hash
previous_member[1] == 'S' ? 20 : // account topics are a 20 byte account ID
8; // reward topics are an 8 byte le xfl
uint8_t padding = 32 - ts;
if (state(vote_key + padding, ts, SBUF(previous_member)) == ts)
// find and decrement the vote counter
vote_key[0] = 'C';
vote_key[1] = previous_member[1];
vote_key[2] = previous_member[2];
if (state(&vote_count, 1, SBUF(vote_key)) == 1)
{
uint8_t vote_count = 0;
// find and decrement the vote counter
vote_key[0] = 'C';
vote_key[1] = previous_member[1];
vote_key[2] = previous_member[2];
vote_key[3] = tbl;
if (state(&vote_count, 1, SBUF(vote_key)) == 1)
// if we're down to 1 vote then delete state
if (vote_count <= 1)
{
// if we're down to 1 vote then delete state
if (vote_count <= 1)
{
ASSERT(state_set(0,0, SBUF(vote_key)) == 0);
trace_num(SBUF("Decrement vote count deleted"), vote_count);
}
else // otherwise decrement
{
vote_count--;
ASSERT(state_set(&vote_count, 1, SBUF(vote_key)) == 1);
trace_num(SBUF("Decrement vote count to"), vote_count);
}
ASSERT(state_set(0,0, SBUF(vote_key)) == 0);
trace_num(SBUF("Decrement vote count deleted"), vote_count);
}
else // otherwise decrement
{
vote_count--;
ASSERT(state_set(&vote_count, 1, SBUF(vote_key)) == 1);
trace_num(SBUF("Decrement vote count to"), vote_count);
}
// delete the vote entry
ASSERT(state_set(0,0, SBUF(previous_member)) == 0);
trace(SBUF("Vote entry deleted"), vote_key, 32, 1);
}
// delete the vote entry
ASSERT(state_set(0,0, SBUF(previous_member)) == 0);
trace(SBUF("Vote entry deleted"), vote_key, 32, 1);
}
}


@@ -1,5 +1,5 @@
/**
* These are helper macros for writing hooks, all of them are optional as is including macro.h at all
* These are helper macros for writing hooks, all of them are optional as is including hookmacro.h at all
*/
#include <stdint.h>


@@ -60,10 +60,7 @@
#define sfBurnedNFTokens ((2U << 16U) + 44U)
#define sfHookStateCount ((2U << 16U) + 45U)
#define sfEmitGeneration ((2U << 16U) + 46U)
#define sfLockCount ((2U << 16U) + 49U)
#define sfFirstNFTokenSequence ((2U << 16U) + 50U)
#define sfXahauActivationLgrSeq ((2U << 16U) + 96U)
#define sfImportSequence ((2U << 16U) + 97U)
#define sfLockCount ((2U << 16U) + 47U)
#define sfRewardTime ((2U << 16U) + 98U)
#define sfRewardLgrFirst ((2U << 16U) + 99U)
#define sfRewardLgrLast ((2U << 16U) + 100U)
@@ -83,15 +80,12 @@
#define sfHookInstructionCount ((3U << 16U) + 17U)
#define sfHookReturnCode ((3U << 16U) + 18U)
#define sfReferenceCount ((3U << 16U) + 19U)
#define sfTouchCount ((3U << 16U) + 97U)
#define sfAccountIndex ((3U << 16U) + 98U)
#define sfAccountCount ((3U << 16U) + 99U)
#define sfRewardAccumulator ((3U << 16U) + 100U)
#define sfEmailHash ((4U << 16U) + 1U)
#define sfTakerPaysCurrency ((17U << 16U) + 1U)
#define sfTakerPaysIssuer ((17U << 16U) + 2U)
#define sfTakerGetsCurrency ((17U << 16U) + 3U)
#define sfTakerGetsIssuer ((17U << 16U) + 4U)
#define sfTakerPaysCurrency ((10U << 16U) + 1U)
#define sfTakerPaysIssuer ((10U << 16U) + 2U)
#define sfTakerGetsCurrency ((10U << 16U) + 3U)
#define sfTakerGetsIssuer ((10U << 16U) + 4U)
#define sfLedgerHash ((5U << 16U) + 1U)
#define sfParentHash ((5U << 16U) + 2U)
#define sfTransactionHash ((5U << 16U) + 3U)
@@ -126,9 +120,6 @@
#define sfOfferID ((5U << 16U) + 34U)
#define sfEscrowID ((5U << 16U) + 35U)
#define sfURITokenID ((5U << 16U) + 36U)
#define sfGovernanceFlags ((5U << 16U) + 99U)
#define sfGovernanceMarks ((5U << 16U) + 98U)
#define sfEmittedTxnID ((5U << 16U) + 97U)
#define sfAmount ((6U << 16U) + 1U)
#define sfBalance ((6U << 16U) + 2U)
#define sfLimitAmount ((6U << 16U) + 3U)
@@ -145,9 +136,6 @@
#define sfNFTokenBrokerFee ((6U << 16U) + 19U)
#define sfHookCallbackFee ((6U << 16U) + 20U)
#define sfLockedBalance ((6U << 16U) + 21U)
#define sfBaseFeeDrops ((6U << 16U) + 22U)
#define sfReserveBaseDrops ((6U << 16U) + 23U)
#define sfReserveIncrementDrops ((6U << 16U) + 24U)
#define sfPublicKey ((7U << 16U) + 1U)
#define sfMessageKey ((7U << 16U) + 2U)
#define sfSigningPubKey ((7U << 16U) + 3U)
@@ -183,13 +171,11 @@
#define sfNFTokenMinter ((8U << 16U) + 9U)
#define sfEmitCallback ((8U << 16U) + 10U)
#define sfHookAccount ((8U << 16U) + 16U)
#define sfInform ((8U << 16U) + 99U)
#define sfIndexes ((19U << 16U) + 1U)
#define sfHashes ((19U << 16U) + 2U)
#define sfAmendments ((19U << 16U) + 3U)
#define sfNFTokenOffers ((19U << 16U) + 4U)
#define sfHookNamespaces ((19U << 16U) + 5U)
#define sfURITokenIDs ((19U << 16U) + 99U)
#define sfPaths ((18U << 16U) + 1U)
#define sfTransactionMetaData ((14U << 16U) + 2U)
#define sfCreatedNode ((14U << 16U) + 3U)
@@ -212,12 +198,6 @@
#define sfHookDefinition ((14U << 16U) + 22U)
#define sfHookParameter ((14U << 16U) + 23U)
#define sfHookGrant ((14U << 16U) + 24U)
#define sfGenesisMint ((14U << 16U) + 96U)
#define sfActiveValidator ((14U << 16U) + 95U)
#define sfImportVLKey ((14U << 16U) + 94U)
#define sfHookEmission ((14U << 16U) + 93U)
#define sfMintURIToken ((14U << 16U) + 92U)
#define sfAmountEntry ((14U << 16U) + 91U)
#define sfSigners ((15U << 16U) + 3U)
#define sfSignerEntries ((15U << 16U) + 4U)
#define sfTemplate ((15U << 16U) + 5U)
@@ -232,8 +212,4 @@
#define sfHookExecutions ((15U << 16U) + 18U)
#define sfHookParameters ((15U << 16U) + 19U)
#define sfHookGrants ((15U << 16U) + 20U)
#define sfGenesisMints ((15U << 16U) + 96U)
#define sfActiveValidators ((15U << 16U) + 95U)
#define sfImportVLKeys ((15U << 16U) + 94U)
#define sfHookEmissions ((15U << 16U) + 93U)
#define sfAmounts ((15U << 16U) + 92U)

File diff suppressed because it is too large.


@@ -1,9 +1,4 @@
#!/bin/bash -u
# We use set -e and bash with -u to bail on first non zero exit code of any
# processes launched or upon any unbound variable.
# We use set -x to print commands before running them to help with
# debugging.
set -ex
#!/bin/bash
echo "START BUILDING (HOST)"
@@ -17,12 +12,7 @@ if [[ "$GITHUB_REPOSITORY" == "" ]]; then
BUILD_CORES=8
fi
# Ensure still works outside of GH Actions by setting these to /dev/null
# GA will run this script and then delete it at the end of the job
JOB_CLEANUP_SCRIPT=${JOB_CLEANUP_SCRIPT:-/dev/null}
NORMALIZED_WORKFLOW=$(echo "$GITHUB_WORKFLOW" | tr -c 'a-zA-Z0-9' '-')
NORMALIZED_REF=$(echo "$GITHUB_REF" | tr -c 'a-zA-Z0-9' '-')
CONTAINER_NAME="xahaud_cached_builder_${NORMALIZED_WORKFLOW}-${NORMALIZED_REF}"
CONTAINER_NAME=xahaud_cached_builder_$(echo "$GITHUB_ACTOR" | awk '{print tolower($0)}')
echo "-- BUILD CORES: $BUILD_CORES"
echo "-- GITHUB_REPOSITORY: $GITHUB_REPOSITORY"
@@ -46,8 +36,7 @@ fi
STATIC_CONTAINER=$(docker ps -a | grep $CONTAINER_NAME |wc -l)
# if [[ "$STATIC_CONTAINER" -gt "0" && "$GITHUB_REPOSITORY" != "" ]]; then
if false; then
if [[ "$STATIC_CONTAINER" -gt "0" && "$GITHUB_REPOSITORY" != "" ]]; then
echo "Static container, execute in static container to have max. cache"
docker start $CONTAINER_NAME
docker exec -i $CONTAINER_NAME /hbb_exe/activate-exec bash -x /io/build-core.sh "$GITHUB_REPOSITORY" "$GITHUB_SHA" "$BUILD_CORES" "$GITHUB_RUN_NUMBER"
@@ -65,8 +54,6 @@ else
# GH Action, runner
echo "GH Action, runner, clean & re-create create persistent container"
docker rm -f $CONTAINER_NAME
echo "echo 'Stopping container: $CONTAINER_NAME'" >> "$JOB_CLEANUP_SCRIPT"
echo "docker stop --time=15 \"$CONTAINER_NAME\" || echo 'Failed to stop container or container not running'" >> "$JOB_CLEANUP_SCRIPT"
docker run -di --user 0:$(id -g) --name $CONTAINER_NAME -v /data/builds:/data/builds -v `pwd`:/io --network host ghcr.io/foobarwidget/holy-build-box-x64 /hbb_exe/activate-exec bash
docker exec -i $CONTAINER_NAME /hbb_exe/activate-exec bash -x /io/build-full.sh "$GITHUB_REPOSITORY" "$GITHUB_SHA" "$BUILD_CORES" "$GITHUB_RUN_NUMBER"
docker stop $CONTAINER_NAME


@@ -134,12 +134,8 @@ RCLConsensus::Adaptor::acquireLedger(LedgerHash const& hash)
acquiringLedger_ = hash;
app_.getJobQueue().addJob(
jtADVANCE,
"getConsensusLedger1",
[id = hash, &app = app_, this]() {
JLOG(j_.debug())
<< "JOB advanceLedger getConsensusLedger1 started";
app.getInboundLedgers().acquireAsync(
jtADVANCE, "getConsensusLedger", [id = hash, &app = app_]() {
app.getInboundLedgers().acquire(
id, 0, InboundLedger::Reason::CONSENSUS);
});
}
@@ -186,7 +182,7 @@ RCLConsensus::Adaptor::share(RCLCxTx const& tx)
if (app_.getHashRouter().shouldRelay(tx.id()))
{
JLOG(j_.debug()) << "Relaying disputed tx " << tx.id();
auto const slice = tx.tx_->slice();
auto const slice = tx.tx_.slice();
protocol::TMTransaction msg;
msg.set_rawtransaction(slice.data(), slice.size());
msg.set_status(protocol::tsNEW);
@@ -330,7 +326,7 @@ RCLConsensus::Adaptor::onClose(
tx.first->add(s);
initialSet->addItem(
SHAMapNodeType::tnTRANSACTION_NM,
make_shamapitem(tx.first->getTransactionID(), s.slice()));
SHAMapItem(tx.first->getTransactionID(), s.slice()));
}
// Add pseudo-transactions to the set
@@ -374,8 +370,7 @@ RCLConsensus::Adaptor::onClose(
RCLCensorshipDetector<TxID, LedgerIndex>::TxIDSeqVec proposed;
initialSet->visitLeaves(
[&proposed,
seq](boost::intrusive_ptr<SHAMapItem const> const& item) {
[&proposed, seq](std::shared_ptr<SHAMapItem const> const& item) {
proposed.emplace_back(item->key(), seq);
});
@@ -498,11 +493,15 @@ RCLConsensus::Adaptor::doAccept(
for (auto const& item : *result.txns.map_)
{
#ifndef DEBUG
try
{
#endif
retriableTxs.insert(
std::make_shared<STTx const>(SerialIter{item.slice()}));
JLOG(j_.debug()) << " Tx: " << item.key();
#ifndef DEBUG
}
catch (std::exception const& ex)
{
@@ -510,6 +509,7 @@ RCLConsensus::Adaptor::doAccept(
JLOG(j_.warn())
<< " Tx: " << item.key() << " throws: " << ex.what();
}
#endif
}
auto built = buildLCL(
@@ -535,7 +535,7 @@ RCLConsensus::Adaptor::doAccept(
std::vector<TxID> accepted;
result.txns.map_->visitLeaves(
[&accepted](boost::intrusive_ptr<SHAMapItem const> const& item) {
[&accepted](std::shared_ptr<SHAMapItem const> const& item) {
accepted.push_back(item->key());
});
@@ -610,7 +610,7 @@ RCLConsensus::Adaptor::doAccept(
<< "Test applying disputed transaction that did"
<< " not get in " << dispute.tx().id();
SerialIter sit(dispute.tx().tx_->slice());
SerialIter sit(dispute.tx().tx_.slice());
auto txn = std::make_shared<STTx const>(sit);
// Disputed pseudo-transactions that were not accepted


@@ -42,7 +42,7 @@ public:
@param txn The transaction to wrap
*/
RCLCxTx(boost::intrusive_ptr<SHAMapItem const> txn) : tx_(std::move(txn))
RCLCxTx(SHAMapItem const& txn) : tx_{txn}
{
}
@@ -50,11 +50,11 @@ public:
ID const&
id() const
{
return tx_->key();
return tx_.key();
}
//! The SHAMapItem that represents the transaction.
boost::intrusive_ptr<SHAMapItem const> tx_;
SHAMapItem const tx_;
};
/** Represents a set of transactions in RCLConsensus.
@@ -90,7 +90,8 @@ public:
bool
insert(Tx const& t)
{
return map_->addItem(SHAMapNodeType::tnTRANSACTION_NM, t.tx_);
return map_->addItem(
SHAMapNodeType::tnTRANSACTION_NM, SHAMapItem{t.tx_});
}
/** Remove a transaction from the set.
@@ -144,7 +145,7 @@ public:
code uses the shared_ptr semantics to know whether the find
was successful and properly creates a Tx as needed.
*/
boost::intrusive_ptr<SHAMapItem const> const&
std::shared_ptr<const SHAMapItem> const&
find(Tx::ID const& entry) const
{
return map_->peekItem(entry);


@@ -135,10 +135,8 @@ RCLValidationsAdaptor::acquire(LedgerHash const& hash)
Application* pApp = &app_;
app_.getJobQueue().addJob(
jtADVANCE, "getConsensusLedger2", [pApp, hash, this]() {
JLOG(j_.debug())
<< "JOB advanceLedger getConsensusLedger2 started";
pApp->getInboundLedgers().acquireAsync(
jtADVANCE, "getConsensusLedger", [pApp, hash]() {
pApp->getInboundLedgers().acquire(
hash, 0, InboundLedger::Reason::CONSENSUS);
});
return std::nullopt;
@@ -154,9 +152,7 @@ void
handleNewValidation(
Application& app,
std::shared_ptr<STValidation> const& val,
std::string const& source,
BypassAccept const bypassAccept,
std::optional<beast::Journal> j)
std::string const& source)
{
auto const& signingKey = val->getSignerPublic();
auto const& hash = val->getLedgerHash();
@@ -181,23 +177,7 @@ handleNewValidation(
if (outcome == ValStatus::current)
{
if (val->isTrusted())
{
// Was: app.getLedgerMaster().checkAccept(hash, seq);
// https://github.com/XRPLF/rippled/commit/fbbea9e6e25795a8a6bd1bf64b780771933a9579
if (bypassAccept == BypassAccept::yes)
{
assert(j.has_value());
if (j.has_value())
{
JLOG(j->trace()) << "Bypassing checkAccept for validation "
<< val->getLedgerHash();
}
}
else
{
app.getLedgerMaster().checkAccept(hash, seq);
}
}
app.getLedgerMaster().checkAccept(hash, seq);
return;
}


@@ -25,16 +25,12 @@
#include <ripple/protocol/Protocol.h>
#include <ripple/protocol/RippleLedgerHash.h>
#include <ripple/protocol/STValidation.h>
#include <optional>
#include <set>
#include <vector>
namespace ripple {
class Application;
enum class BypassAccept : bool { no = false, yes };
/** Wrapper over STValidation for generic Validation code
Wraps an STValidation for compatibility with the generic validation code.
@@ -252,9 +248,7 @@ void
handleNewValidation(
Application& app,
std::shared_ptr<STValidation> const& val,
std::string const& source,
BypassAccept const bypassAccept = BypassAccept::no,
std::optional<beast::Journal> j = std::nullopt);
std::string const& source);
} // namespace ripple


@@ -3,7 +3,6 @@
#include <iostream>
#include <map>
#include <memory>
#include <optional>
#include <ostream>
#include <stack>
#include <string>
@@ -272,19 +271,7 @@ check_guard(
int guard_func_idx,
int last_import_idx,
GuardLog guardLog,
std::string guardLogAccStr,
/* RH NOTE:
* rules version is a bit field, so rule update 1 is 0x01, update 2 is 0x02
* and update 3 is 0x04 ideally at rule version 3 all bits so far are set
* (0b111) so the ruleVersion = 7, however if a specific rule update must be
* rolled back due to unforeseen behaviour then this may no longer be the
* case. using a bit field here leaves us flexible to rollback changes that
* might have unforeseen consequences, without also rolling back further
* changes that are fine.
*/
uint64_t rulesVersion = 0
)
std::string guardLogAccStr)
{
#define MAX_GUARD_CALLS 1024
uint32_t guard_count = 0;
@@ -634,17 +621,11 @@ check_guard(
}
else if (fc_type == 10) // memory.copy
{
if (rulesVersion & 0x02U)
GUARD_ERROR("Memory.copy instruction is not allowed.");
REQUIRE(2);
ADVANCE(2);
}
else if (fc_type == 11) // memory.fill
{
if (rulesVersion & 0x02U)
GUARD_ERROR("Memory.fill instruction is not allowed.");
ADVANCE(1);
}
else if (fc_type <= 7) // numeric instructions
@@ -826,15 +807,6 @@ validateGuards(
std::vector<uint8_t> const& wasm,
GuardLog guardLog,
std::string guardLogAccStr,
/* RH NOTE:
* rules version is a bit field, so rule update 1 is 0x01, update 2 is 0x02
* and update 3 is 0x04 ideally at rule version 3 all bits so far are set
* (0b111) so the ruleVersion = 7, however if a specific rule update must be
* rolled back due to unforeseen behaviour then this may no longer be the
* case. using a bit field here leaves us flexible to rollback changes that
* might have unforeseen consequences, without also rolling back further
* changes that are fine.
*/
uint64_t rulesVersion = 0)
{
uint64_t byteCount = wasm.size();
@@ -1505,8 +1477,7 @@ validateGuards(
guard_import_number,
last_import_number,
guardLog,
guardLogAccStr,
rulesVersion);
guardLogAccStr);
if (!valid)
return {};
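
The RH NOTE deleted above documents `rulesVersion` as a bit field: each guard-rule update owns one bit, so any single update can be rolled back without disturbing the others. A sketch of the scheme with illustrative constant names (only the `0x02` memory.copy/memory.fill check appears in the diff itself):

```cpp
#include <cstdint>
#include <iostream>

// One bit per rule update, so updates compose and roll back independently.
constexpr std::uint64_t RULE_UPDATE_1 = 0x01;
constexpr std::uint64_t RULE_UPDATE_2 = 0x02;  // forbids memory.copy / memory.fill
constexpr std::uint64_t RULE_UPDATE_3 = 0x04;

bool memoryCopyAllowed(std::uint64_t rulesVersion)
{
    // Mirrors the deleted guard: reject memory.copy once update 2 is active.
    return (rulesVersion & RULE_UPDATE_2) == 0;
}

int main()
{
    std::uint64_t v = RULE_UPDATE_1 | RULE_UPDATE_2 | RULE_UPDATE_3;  // 0b111 == 7
    std::cout << std::boolalpha << memoryCopyAllowed(v) << '\n';      // false

    v &= ~RULE_UPDATE_2;  // roll back update 2 alone; 1 and 3 stay in force
    std::cout << memoryCopyAllowed(v) << '\n';                        // true
}
```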


@@ -428,12 +428,6 @@ namespace hook {
bool
canHook(ripple::TxType txType, ripple::uint256 hookOn);
bool
canEmit(ripple::TxType txType, ripple::uint256 hookCanEmit);
ripple::uint256
getHookCanEmit(ripple::STObject const& hookObj, SLE::pointer const& hookDef);
struct HookResult;
HookResult
@@ -442,7 +436,6 @@ apply(
used for caching (one day) */
ripple::uint256 const&
hookHash, /* hash of the actual hook byte code, used for metadata */
ripple::uint256 const& hookCanEmit,
ripple::uint256 const& hookNamespace,
ripple::Blob const& wasm,
std::map<
@@ -479,7 +472,6 @@ struct HookResult
{
ripple::uint256 const hookSetTxnID;
ripple::uint256 const hookHash;
ripple::uint256 const hookCanEmit;
ripple::Keylet const accountKeylet;
ripple::Keylet const ownerDirKeylet;
ripple::Keylet const hookKeylet;
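
`hookOn` and the removed `hookCanEmit` are both 256-bit masks indexed by transaction type, which is why the deleted `canEmit` simply forwards to `canHook` with a different mask. A self-contained sketch using `std::bitset<256>` as a stand-in for `ripple::uint256`:

```cpp
#include <bitset>
#include <iostream>

// Bit i of the mask corresponds to TxType i.
using TxMask = std::bitset<256>;

bool canEmit(std::size_t txType, TxMask const& hookCanEmit)
{
    // Same shape as (mask & UINT256_BIT[txType]) != beast::zero.
    return hookCanEmit.test(txType);
}

int main()
{
    TxMask mask;
    mask.set(0);  // suppose TxType 0 is permitted

    std::cout << canEmit(0, mask) << ' ' << canEmit(3, mask) << '\n';  // 1 0
}
```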


@@ -79,7 +79,7 @@ main(int argc, char** argv)
close(fd);
auto result = validateGuards(hook, std::cout, "", 3);
auto result = validateGuards(hook, std::cout, "", 1);
if (!result)
{


@@ -342,7 +342,8 @@ getTransactionalStakeHolders(STTx const& tx, ReadView const& rv)
case ttOFFER_CANCEL:
case ttTICKET_CREATE:
case ttHOOK_SET:
case ttOFFER_CREATE: {
case ttOFFER_CREATE: // this is handled separately
{
break;
}
@@ -495,12 +496,6 @@ getTransactionalStakeHolders(STTx const& tx, ReadView const& rv)
break;
}
case ttCLAWBACK: {
auto const amount = tx.getFieldAmount(sfAmount);
ADD_TSH(amount.getIssuer(), tshWEAK);
break;
}
default:
return {};
}
@@ -1033,29 +1028,6 @@ hook::canHook(ripple::TxType txType, ripple::uint256 hookOn)
return (hookOn & UINT256_BIT[txType]) != beast::zero;
}
bool
hook::canEmit(ripple::TxType txType, ripple::uint256 hookCanEmit)
{
return hook::canHook(txType, hookCanEmit);
}
ripple::uint256
hook::getHookCanEmit(
ripple::STObject const& hookObj,
SLE::pointer const& hookDef)
{
// default allows all transaction types
uint256 defaultHookCanEmit = UINT256_BIT[ttHOOK_SET];
uint256 hookCanEmit =
(hookObj.isFieldPresent(sfHookCanEmit)
? hookObj.getFieldH256(sfHookCanEmit)
: hookDef->isFieldPresent(sfHookCanEmit)
? hookDef->getFieldH256(sfHookCanEmit)
: defaultHookCanEmit);
return hookCanEmit;
}
// Update HookState ledger objects for the hook... only called after accept()
// assumes the specified acc has already been checked for authorization (hook
// grants)
@@ -1207,7 +1179,6 @@ hook::apply(
used for caching (one day) */
ripple::uint256 const&
hookHash, /* hash of the actual hook byte code, used for metadata */
ripple::uint256 const& hookCanEmit,
ripple::uint256 const& hookNamespace,
ripple::Blob const& wasm,
std::map<
@@ -1235,7 +1206,6 @@ hook::apply(
.result =
{.hookSetTxnID = hookSetTxnID,
.hookHash = hookHash,
.hookCanEmit = hookCanEmit,
.accountKeylet = keylet::account(account),
.ownerDirKeylet = keylet::ownerDir(account),
.hookKeylet = keylet::hook(account),
@@ -1247,10 +1217,9 @@ hook::apply(
.hookParamOverrides = hookParamOverrides,
.hookParams = hookParams,
.hookSkips = {},
.exitType = applyCtx.view().rules().enabled(fixXahauV3)
? hook_api::ExitType::UNSET
: hook_api::ExitType::ROLLBACK, // default is to rollback
// unless hook calls accept()
.exitType =
hook_api::ExitType::ROLLBACK, // default is to rollback unless
// hook calls accept()
.exitReason = std::string(""),
.exitCode = -1,
.hasCallback = hasCallback,
@@ -3300,16 +3269,6 @@ DEFINE_HOOK_FUNCTION(
return EMISSION_FAILURE;
}
ripple::TxType txType = stpTrans->getTxnType();
ripple::uint256 const& hookCanEmit = hookCtx.result.hookCanEmit;
if (!hook::canEmit(txType, hookCanEmit))
{
JLOG(j.trace()) << "HookEmit[" << HC_ACC()
<< "]: Hook cannot emit this txn.";
return EMISSION_FAILURE;
}
// check the emitted txn is valid
/* Emitted TXN rules
* 0. Account must match the hook account
@@ -4658,8 +4617,6 @@ DEFINE_HOOK_FUNCTION(
}
catch (std::exception& e)
{
JLOG(j.trace()) << "HookInfo[" << HC_ACC()
<< "]: etxn_fee_base exception: " << e.what();
return INVALID_TXN;
}
@@ -4831,7 +4788,7 @@ DEFINE_HOOK_FUNCTION(
if (float1 == 0)
{
j.trace() << "HookTrace[" << HC_ACC() << "]: "
j.trace() << "HookTrace[" << HC_ACC() << "]:"
<< (read_len == 0
? ""
: std::string_view(
@@ -5445,7 +5402,7 @@ DEFINE_HOOK_FUNCTION(
const int64_t float_one_internal = make_float(1000000000000000ull, -15, false);
inline int64_t
float_divide_internal(int64_t float1, int64_t float2, bool hasFix)
float_divide_internal(int64_t float1, int64_t float2)
{
RETURN_IF_INVALID_FLOAT(float1);
RETURN_IF_INVALID_FLOAT(float2);
@@ -5498,16 +5455,8 @@ float_divide_internal(int64_t float1, int64_t float2, bool hasFix)
while (man2 > 0)
{
int i = 0;
if (hasFix)
{
for (; man1 >= man2; man1 -= man2, ++i)
;
}
else
{
for (; man1 > man2; man1 -= man2, ++i)
;
}
for (; man1 > man2; man1 -= man2, ++i)
;
man3 *= 10;
man3 += i;
@@ -5527,8 +5476,7 @@ DEFINE_HOOK_FUNCTION(int64_t, float_divide, int64_t float1, int64_t float2)
HOOK_SETUP(); // populates memory_ctx, memory, memory_length, applyCtx,
// hookCtx on current stack
bool const hasFix = view.rules().enabled(fixFloatDivide);
return float_divide_internal(float1, float2, hasFix);
return float_divide_internal(float1, float2);
HOOK_TEARDOWN();
}
@@ -5547,9 +5495,7 @@ DEFINE_HOOK_FUNCTION(int64_t, float_invert, int64_t float1)
return DIVISION_BY_ZERO;
if (float1 == float_one_internal)
return float_one_internal;
bool const fixV3 = view.rules().enabled(fixFloatDivide);
return float_divide_internal(float_one_internal, float1, fixV3);
return float_divide_internal(float_one_internal, float1);
HOOK_TEARDOWN();
}
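
The `float_divide_internal` hunks turn on a one-character difference: with `man1 > man2`, the schoolbook long-division loop stops one subtraction short whenever the running remainder exactly equals the divisor, so an exact quotient such as 4/2 comes out as 1.999…; `man1 >= man2` (the `hasFix` branch) takes the final subtraction and yields 2.000…. A standalone illustration over bare mantissas, assuming `man1 < 10 * man2` so each digit fits in 0–9 (this is the loop in isolation, not the hook API):

```cpp
#include <cstdint>
#include <iostream>
#include <string>

// Emit `digits` quotient digits of man1 / man2 by repeated subtraction.
std::string divideDigits(std::uint64_t man1, std::uint64_t man2,
                         bool hasFix, int digits = 6)
{
    std::string out;
    for (int d = 0; d < digits; ++d)
    {
        int i = 0;
        if (hasFix)
            for (; man1 >= man2; man1 -= man2, ++i)  // exact when remainder == divisor
                ;
        else
            for (; man1 > man2; man1 -= man2, ++i)   // stops one short in that case
                ;
        out += char('0' + i);
        man1 *= 10;  // bring down the next digit
    }
    return out;
}

int main()
{
    std::cout << divideDigits(4, 2, true) << '\n';   // 200000  (4 / 2 == 2.00000)
    std::cout << divideDigits(4, 2, false) << '\n';  // 199999  (off-by-one tail of 9s)
}
```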

File diff suppressed because it is too large


@@ -38,21 +38,10 @@ public:
virtual ~InboundLedgers() = default;
// VFALCO TODO Should this be called findOrAdd ?
// Callers should use this if they possibly need an authoritative
// response immediately.
//
virtual std::shared_ptr<Ledger const>
acquire(uint256 const& hash, std::uint32_t seq, InboundLedger::Reason) = 0;
// Callers should use this if they are known to be executing on the Job
// Queue. TODO review whether all callers of acquire() can use this
// instead. Inbound ledger acquisition is asynchronous anyway.
virtual void
acquireAsync(
uint256 const& hash,
std::uint32_t seq,
InboundLedger::Reason reason) = 0;
virtual std::shared_ptr<InboundLedger>
find(LedgerHash const& hash) = 0;
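
Per the comments above, `acquire` serves callers that may need an authoritative response immediately, while the removed `acquireAsync` pushed the work onto a job-queue thread. A schematic sketch of that split only, with a toy `Inbound` type (the real `acquireAsync` returns `void`, deduplicates in-flight hashes, and runs on the application's JobQueue, not `std::async`):

```cpp
#include <future>
#include <iostream>

struct Inbound
{
    // Blocking: for callers who need an authoritative result right now.
    int acquire(int seq)
    {
        return seq;  // pretend the ledger was fetched
    }

    // Non-blocking: for callers already running on a worker thread.
    std::future<int> acquireAsync(int seq)
    {
        return std::async(std::launch::async, [this, seq] { return acquire(seq); });
    }
};

int main()
{
    Inbound in;
    auto fut = in.acquireAsync(8);                           // schedule, keep going
    std::cout << in.acquire(7) << ' ' << fut.get() << '\n';  // 7 8
}
```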


@@ -119,8 +119,9 @@ public:
sles_type::value_type
dereference() const override
{
SerialIter sit(iter_->slice());
return std::make_shared<SLE const>(sit, iter_->key());
auto const item = *iter_;
SerialIter sit(item.slice());
return std::make_shared<SLE const>(sit, item.key());
}
};
@@ -167,7 +168,7 @@ public:
txs_type::value_type
dereference() const override
{
auto const& item = *iter_;
auto const item = *iter_;
if (metadata_)
return deserializeTxPlusMeta(item);
return {deserializeTx(item), nullptr};
@@ -182,8 +183,8 @@ Ledger::Ledger(
std::vector<uint256> const& amendments,
Family& family)
: mImmutable(false)
, txMap_(SHAMapType::TRANSACTION, family)
, stateMap_(SHAMapType::STATE, family)
, txMap_(std::make_shared<SHAMap>(SHAMapType::TRANSACTION, family))
, stateMap_(std::make_shared<SHAMap>(SHAMapType::STATE, family))
, rules_{config.features}
, j_(beast::Journal(beast::Journal::getNullSink()))
{
@@ -246,7 +247,7 @@ Ledger::Ledger(
rawInsert(sle);
}
stateMap_.flushDirty(hotACCOUNT_NODE);
stateMap_->flushDirty(hotACCOUNT_NODE);
setImmutable();
}
@@ -258,8 +259,12 @@ Ledger::Ledger(
Family& family,
beast::Journal j)
: mImmutable(true)
, txMap_(SHAMapType::TRANSACTION, info.txHash, family)
, stateMap_(SHAMapType::STATE, info.accountHash, family)
, txMap_(std::make_shared<SHAMap>(
SHAMapType::TRANSACTION,
info.txHash,
family))
, stateMap_(
std::make_shared<SHAMap>(SHAMapType::STATE, info.accountHash, family))
, rules_(config.features)
, info_(info)
, j_(j)
@@ -267,7 +272,7 @@ Ledger::Ledger(
loaded = true;
if (info_.txHash.isNonZero() &&
!txMap_.fetchRoot(SHAMapHash{info_.txHash}, nullptr))
!txMap_->fetchRoot(SHAMapHash{info_.txHash}, nullptr))
{
if (config.reporting())
{
@@ -279,7 +284,7 @@ Ledger::Ledger(
}
if (info_.accountHash.isNonZero() &&
!stateMap_.fetchRoot(SHAMapHash{info_.accountHash}, nullptr))
!stateMap_->fetchRoot(SHAMapHash{info_.accountHash}, nullptr))
{
if (config.reporting())
{
@@ -290,8 +295,8 @@ Ledger::Ledger(
JLOG(j.warn()) << "Don't have state data root for ledger" << info_.seq;
}
txMap_.setImmutable();
stateMap_.setImmutable();
txMap_->setImmutable();
stateMap_->setImmutable();
defaultFees(config);
if (!setup())
@@ -305,25 +310,13 @@ Ledger::Ledger(
}
}
Ledger::Ledger(
LedgerInfo& info,
Config const& config,
Family& family,
SHAMap const& baseState)
: mImmutable(false)
, txMap_(SHAMapType::TRANSACTION, family)
, stateMap_(baseState, true)
, rules_{config.features}
, info_(info)
, j_(beast::Journal(beast::Journal::getNullSink()))
{
}
// Create a new ledger that follows this one
Ledger::Ledger(Ledger const& prevLedger, NetClock::time_point closeTime)
: mImmutable(false)
, txMap_(SHAMapType::TRANSACTION, prevLedger.txMap_.family())
, stateMap_(prevLedger.stateMap_, true)
, txMap_(std::make_shared<SHAMap>(
SHAMapType::TRANSACTION,
prevLedger.stateMap_->family()))
, stateMap_(prevLedger.stateMap_->snapShot(true))
, fees_(prevLedger.fees_)
, rules_(prevLedger.rules_)
, j_(beast::Journal(beast::Journal::getNullSink()))
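
Both sides of this constructor hunk create the child ledger's state map as a copy-on-write snapshot of the parent's (a value copy with the copy flag before, `snapShot(true)` after), so the parent's tree is never mutated through the child. A toy sketch of copy-on-write semantics, assuming a whole-map clone where the real SHAMap shares and copies individual tree nodes:

```cpp
#include <iostream>
#include <map>
#include <memory>

class CowMap
{
    std::shared_ptr<std::map<int, int>> data_ =
        std::make_shared<std::map<int, int>>();

public:
    // Snapshots share the underlying storage; no copying happens yet.
    CowMap snapShot() const { return *this; }

    void set(int k, int v)
    {
        if (data_.use_count() > 1)  // still shared with a snapshot?
            data_ = std::make_shared<std::map<int, int>>(*data_);  // clone first
        (*data_)[k] = v;
    }

    int get(int k) const
    {
        auto it = data_->find(k);
        return it == data_->end() ? 0 : it->second;
    }
};

int main()
{
    CowMap parent;
    parent.set(1, 100);

    CowMap child = parent.snapShot();
    child.set(1, 200);  // triggers the clone; the parent is untouched

    std::cout << parent.get(1) << ' ' << child.get(1) << '\n';  // 100 200
}
```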
@@ -352,8 +345,12 @@ Ledger::Ledger(Ledger const& prevLedger, NetClock::time_point closeTime)
Ledger::Ledger(LedgerInfo const& info, Config const& config, Family& family)
: mImmutable(true)
, txMap_(SHAMapType::TRANSACTION, info.txHash, family)
, stateMap_(SHAMapType::STATE, info.accountHash, family)
, txMap_(std::make_shared<SHAMap>(
SHAMapType::TRANSACTION,
info.txHash,
family))
, stateMap_(
std::make_shared<SHAMap>(SHAMapType::STATE, info.accountHash, family))
, rules_{config.features}
, info_(info)
, j_(beast::Journal(beast::Journal::getNullSink()))
@@ -367,8 +364,8 @@ Ledger::Ledger(
Config const& config,
Family& family)
: mImmutable(false)
, txMap_(SHAMapType::TRANSACTION, family)
, stateMap_(SHAMapType::STATE, family)
, txMap_(std::make_shared<SHAMap>(SHAMapType::TRANSACTION, family))
, stateMap_(std::make_shared<SHAMap>(SHAMapType::STATE, family))
, rules_{config.features}
, j_(beast::Journal(beast::Journal::getNullSink()))
{
@@ -386,32 +383,19 @@ Ledger::setImmutable(bool rehash)
// place the hash transitions to valid
if (!mImmutable && rehash)
{
info_.txHash = txMap_.getHash().as_uint256();
info_.accountHash = stateMap_.getHash().as_uint256();
info_.txHash = txMap_->getHash().as_uint256();
info_.accountHash = stateMap_->getHash().as_uint256();
}
if (rehash)
info_.hash = calculateLedgerHash(info_);
mImmutable = true;
txMap_.setImmutable();
stateMap_.setImmutable();
txMap_->setImmutable();
stateMap_->setImmutable();
setup();
}
// raw setters for catalogue
void
Ledger::setCloseFlags(int closeFlags)
{
info_.closeFlags = closeFlags;
}
void
Ledger::setDrops(uint64_t drops)
{
info_.drops = drops;
}
void
Ledger::setAccepted(
NetClock::time_point closeTime,
@@ -431,8 +415,8 @@ bool
Ledger::addSLE(SLE const& sle)
{
auto const s = sle.getSerializer();
return stateMap_.addItem(
SHAMapNodeType::tnACCOUNT_STATE, make_shamapitem(sle.key(), s.slice()));
SHAMapItem item(sle.key(), s.slice());
return stateMap_->addItem(SHAMapNodeType::tnACCOUNT_STATE, std::move(item));
}
//------------------------------------------------------------------------------
@@ -467,20 +451,20 @@ bool
Ledger::exists(Keylet const& k) const
{
// VFALCO NOTE Perhaps check the type for debug builds?
return stateMap_.hasItem(k.key);
return stateMap_->hasItem(k.key);
}
bool
Ledger::exists(uint256 const& key) const
{
return stateMap_.hasItem(key);
return stateMap_->hasItem(key);
}
std::optional<uint256>
Ledger::succ(uint256 const& key, std::optional<uint256> const& last) const
{
auto item = stateMap_.upper_bound(key);
if (item == stateMap_.end())
auto item = stateMap_->upper_bound(key);
if (item == stateMap_->end())
return std::nullopt;
if (last && item->key() >= last)
return std::nullopt;
@@ -495,7 +479,7 @@ Ledger::read(Keylet const& k) const
assert(false);
return nullptr;
}
auto const& item = stateMap_.peekItem(k.key);
auto const& item = stateMap_->peekItem(k.key);
if (!item)
return nullptr;
auto sle = std::make_shared<SLE>(SerialIter{item->slice()}, item->key());
@@ -509,44 +493,45 @@ Ledger::read(Keylet const& k) const
auto
Ledger::slesBegin() const -> std::unique_ptr<sles_type::iter_base>
{
return std::make_unique<sles_iter_impl>(stateMap_.begin());
return std::make_unique<sles_iter_impl>(stateMap_->begin());
}
auto
Ledger::slesEnd() const -> std::unique_ptr<sles_type::iter_base>
{
return std::make_unique<sles_iter_impl>(stateMap_.end());
return std::make_unique<sles_iter_impl>(stateMap_->end());
}
auto
Ledger::slesUpperBound(uint256 const& key) const
-> std::unique_ptr<sles_type::iter_base>
{
return std::make_unique<sles_iter_impl>(stateMap_.upper_bound(key));
return std::make_unique<sles_iter_impl>(stateMap_->upper_bound(key));
}
auto
Ledger::txsBegin() const -> std::unique_ptr<txs_type::iter_base>
{
return std::make_unique<txs_iter_impl>(!open(), txMap_.begin());
return std::make_unique<txs_iter_impl>(!open(), txMap_->begin());
}
auto
Ledger::txsEnd() const -> std::unique_ptr<txs_type::iter_base>
{
return std::make_unique<txs_iter_impl>(!open(), txMap_.end());
return std::make_unique<txs_iter_impl>(!open(), txMap_->end());
}
bool
Ledger::txExists(uint256 const& key) const
{
return txMap_.hasItem(key);
return txMap_->hasItem(key);
}
auto
Ledger::txRead(key_type const& key) const -> tx_type
{
auto const& item = txMap_.peekItem(key);
assert(txMap_);
auto const& item = txMap_->peekItem(key);
if (!item)
return {};
if (!open())
@@ -563,7 +548,7 @@ Ledger::digest(key_type const& key) const -> std::optional<digest_type>
SHAMapHash digest;
// VFALCO Unfortunately this loads the item
// from the NodeStore needlessly.
if (!stateMap_.peekItem(key, digest))
if (!stateMap_->peekItem(key, digest))
return std::nullopt;
return digest.as_uint256();
}
@@ -573,14 +558,14 @@ Ledger::digest(key_type const& key) const -> std::optional<digest_type>
void
Ledger::rawErase(std::shared_ptr<SLE> const& sle)
{
if (!stateMap_.delItem(sle->key()))
if (!stateMap_->delItem(sle->key()))
LogicError("Ledger::rawErase: key not found");
}
void
Ledger::rawErase(uint256 const& key)
{
if (!stateMap_.delItem(key))
if (!stateMap_->delItem(key))
LogicError("Ledger::rawErase: key not found");
}
@@ -589,9 +574,9 @@ Ledger::rawInsert(std::shared_ptr<SLE> const& sle)
{
Serializer ss;
sle->add(ss);
if (!stateMap_.addGiveItem(
if (!stateMap_->addGiveItem(
SHAMapNodeType::tnACCOUNT_STATE,
make_shamapitem(sle->key(), ss.slice())))
std::make_shared<SHAMapItem const>(sle->key(), ss.slice())))
LogicError("Ledger::rawInsert: key already exists");
}
@@ -600,9 +585,9 @@ Ledger::rawReplace(std::shared_ptr<SLE> const& sle)
{
Serializer ss;
sle->add(ss);
if (!stateMap_.updateGiveItem(
if (!stateMap_->updateGiveItem(
SHAMapNodeType::tnACCOUNT_STATE,
make_shamapitem(sle->key(), ss.slice())))
std::make_shared<SHAMapItem const>(sle->key(), ss.slice())))
LogicError("Ledger::rawReplace: key not found");
}
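
`rawInsert` and `rawReplace` differ across this hunk only in how the immutable item is built: a `make_shamapitem` factory versus a direct `std::make_shared<SHAMapItem const>`. A factory gives one place to change the allocation strategy later (pooling, intrusive reference counts) without touching every call site; a hedged sketch with stand-in types:

```cpp
#include <iostream>
#include <memory>
#include <vector>

struct Item
{
    std::vector<unsigned char> data;
};

// Factory: call sites stay stable if the allocation strategy changes.
std::shared_ptr<Item const>
make_item(std::vector<unsigned char> data)
{
    return std::make_shared<Item const>(Item{std::move(data)});
}

int main()
{
    auto item = make_item({0xDE, 0xAD});
    std::cout << item->data.size() << '\n';  // 2
}
```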
@@ -618,8 +603,9 @@ Ledger::rawTxInsert(
Serializer s(txn->getDataLength() + metaData->getDataLength() + 16);
s.addVL(txn->peekData());
s.addVL(metaData->peekData());
if (!txMap_.addGiveItem(
SHAMapNodeType::tnTRANSACTION_MD, make_shamapitem(key, s.slice())))
if (!txMap().addGiveItem(
SHAMapNodeType::tnTRANSACTION_MD,
std::make_shared<SHAMapItem const>(key, s.slice())))
LogicError("duplicate_tx: " + to_string(key));
}
@@ -635,9 +621,9 @@ Ledger::rawTxInsertWithHash(
Serializer s(txn->getDataLength() + metaData->getDataLength() + 16);
s.addVL(txn->peekData());
s.addVL(metaData->peekData());
auto item = make_shamapitem(key, s.slice());
auto item = std::make_shared<SHAMapItem const>(key, s.slice());
auto hash = sha512Half(HashPrefix::txNode, item->slice(), item->key());
if (!txMap_.addGiveItem(SHAMapNodeType::tnTRANSACTION_MD, std::move(item)))
if (!txMap().addGiveItem(SHAMapNodeType::tnTRANSACTION_MD, std::move(item)))
LogicError("duplicate_tx: " + to_string(key));
return hash;
@@ -737,7 +723,7 @@ Ledger::defaultFees(Config const& config)
std::shared_ptr<SLE>
Ledger::peek(Keylet const& k) const
{
auto const& value = stateMap_.peekItem(k.key);
auto const& value = stateMap_->peekItem(k.key);
if (!value)
return nullptr;
auto sle = std::make_shared<SLE>(SerialIter{value->slice()}, value->key());
@@ -859,8 +845,8 @@ Ledger::walkLedger(beast::Journal j, bool parallel) const
std::vector<SHAMapMissingNode> missingNodes1;
std::vector<SHAMapMissingNode> missingNodes2;
if (stateMap_.getHash().isZero() && !info_.accountHash.isZero() &&
!stateMap_.fetchRoot(SHAMapHash{info_.accountHash}, nullptr))
if (stateMap_->getHash().isZero() && !info_.accountHash.isZero() &&
!stateMap_->fetchRoot(SHAMapHash{info_.accountHash}, nullptr))
{
missingNodes1.emplace_back(
SHAMapType::STATE, SHAMapHash{info_.accountHash});
@@ -868,9 +854,9 @@ Ledger::walkLedger(beast::Journal j, bool parallel) const
else
{
if (parallel)
return stateMap_.walkMapParallel(missingNodes1, 32);
return stateMap_->walkMapParallel(missingNodes1, 32);
else
stateMap_.walkMap(missingNodes1, 32);
stateMap_->walkMap(missingNodes1, 32);
}
if (!missingNodes1.empty())
@@ -882,15 +868,15 @@ Ledger::walkLedger(beast::Journal j, bool parallel) const
}
}
if (txMap_.getHash().isZero() && info_.txHash.isNonZero() &&
!txMap_.fetchRoot(SHAMapHash{info_.txHash}, nullptr))
if (txMap_->getHash().isZero() && info_.txHash.isNonZero() &&
!txMap_->fetchRoot(SHAMapHash{info_.txHash}, nullptr))
{
missingNodes2.emplace_back(
SHAMapType::TRANSACTION, SHAMapHash{info_.txHash});
}
else
{
txMap_.walkMap(missingNodes2, 32);
txMap_->walkMap(missingNodes2, 32);
}
if (!missingNodes2.empty())
@@ -907,9 +893,9 @@ Ledger::walkLedger(beast::Journal j, bool parallel) const
bool
Ledger::assertSensible(beast::Journal ledgerJ) const
{
if (info_.hash.isNonZero() && info_.accountHash.isNonZero() &&
(info_.accountHash == stateMap_.getHash().as_uint256()) &&
(info_.txHash == txMap_.getHash().as_uint256()))
if (info_.hash.isNonZero() && info_.accountHash.isNonZero() && stateMap_ &&
txMap_ && (info_.accountHash == stateMap_->getHash().as_uint256()) &&
(info_.txHash == txMap_->getHash().as_uint256()))
{
return true;
}
@@ -1071,14 +1057,15 @@ pendSaveValidated(
return true;
}
JobType const jobType{isCurrent ? jtPUBLEDGER : jtPUBOLDLEDGER};
char const* const jobName{
isCurrent ? "Ledger::pendSave" : "Ledger::pendOldSave"};
// See if we can use the JobQueue.
if (!isSynchronous &&
app.getJobQueue().addJob(
isCurrent ? jtPUBLEDGER : jtPUBOLDLEDGER,
std::to_string(ledger->seq()),
[&app, ledger, isCurrent]() {
saveValidatedLedger(app, ledger, isCurrent);
}))
app.getJobQueue().addJob(jobType, jobName, [&app, ledger, isCurrent]() {
saveValidatedLedger(app, ledger, isCurrent);
}))
{
return true;
}
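
Both versions of `pendSaveValidated` keep the same control flow: unless the caller demanded synchronous behaviour, try to hand the save to the job queue, and fall back to saving inline if the queue refuses (for example during shutdown). A schematic sketch of that dispatch pattern with placeholder types:

```cpp
#include <functional>
#include <iostream>

// Stand-in for JobQueue::addJob: returns false when the job is not accepted.
bool addJob(std::function<void()> job, bool queueAccepting)
{
    if (!queueAccepting)
        return false;
    job();  // a real queue would run this later, on a worker thread
    return true;
}

bool pendSave(bool isSynchronous, bool queueAccepting)
{
    auto save = [] { std::cout << "saved\n"; };

    if (!isSynchronous && addJob(save, queueAccepting))
        return true;  // queued successfully

    save();           // synchronous fallback
    return true;
}

int main()
{
    pendSave(false, false);  // queue refused: falls back to inline save
}
```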
@@ -1090,15 +1077,15 @@ pendSaveValidated(
void
Ledger::unshare() const
{
stateMap_.unshare();
txMap_.unshare();
stateMap_->unshare();
txMap_->unshare();
}
void
Ledger::invariants() const
{
stateMap_.invariants();
txMap_.invariants();
stateMap_->invariants();
txMap_->invariants();
}
//------------------------------------------------------------------------------


@@ -83,10 +83,6 @@ public:
Ledger&
operator=(Ledger const&) = delete;
Ledger(Ledger&&) = delete;
Ledger&
operator=(Ledger&&) = delete;
/** Create the Genesis ledger.
The Genesis ledger contains a single account whose
@@ -121,13 +117,6 @@ public:
Family& family,
beast::Journal j);
// used when loading ledgers from catalogue files
Ledger(
LedgerInfo& info,
Config const& config,
Family& family,
SHAMap const& baseState);
/** Create a new ledger following a previous ledger
The ledger will have the sequence number that
@@ -282,12 +271,6 @@ public:
void
setImmutable(bool rehash = true);
void
setCloseFlags(int closeFlags);
void
setDrops(uint64_t drops);
bool
isImmutable() const
{
@@ -307,10 +290,10 @@ public:
void
setFull() const
{
txMap_.setFull();
txMap_.setLedgerSeq(info_.seq);
stateMap_.setFull();
stateMap_.setLedgerSeq(info_.seq);
txMap_->setFull();
stateMap_->setFull();
txMap_->setLedgerSeq(info_.seq);
stateMap_->setLedgerSeq(info_.seq);
}
void
@@ -322,25 +305,25 @@ public:
SHAMap const&
stateMap() const
{
return stateMap_;
return *stateMap_;
}
SHAMap&
stateMap()
{
return stateMap_;
return *stateMap_;
}
SHAMap const&
txMap() const
{
return txMap_;
return *txMap_;
}
SHAMap&
txMap()
{
return txMap_;
return *txMap_;
}
// returns false on error
@@ -418,11 +401,8 @@ private:
bool mImmutable;
// A SHAMap containing the transactions associated with this ledger.
SHAMap mutable txMap_;
// A SHAMap containing the state objects for this ledger.
SHAMap mutable stateMap_;
std::shared_ptr<SHAMap> txMap_;
std::shared_ptr<SHAMap> stateMap_;
// Protects fee variables
std::mutex mutable mutex_;


@@ -51,9 +51,7 @@ LedgerHistory::LedgerHistory(
}
bool
LedgerHistory::insert(
std::shared_ptr<Ledger const> const& ledger,
bool validated)
LedgerHistory::insert(std::shared_ptr<Ledger const> ledger, bool validated)
{
if (!ledger->isImmutable())
LogicError("mutable Ledger in insert");
@@ -74,9 +72,12 @@ LedgerHash
LedgerHistory::getLedgerHash(LedgerIndex index)
{
std::unique_lock sl(m_ledgers_by_hash.peekMutex());
if (auto it = mLedgersByIndex.find(index); it != mLedgersByIndex.end())
auto it = mLedgersByIndex.find(index);
if (it != mLedgersByIndex.end())
return it->second;
return {};
return uint256();
}
std::shared_ptr<Ledger const>
@@ -166,19 +167,19 @@ log_metadata_difference(
uint256 const& tx,
beast::Journal j)
{
auto getMeta = [](ReadView const& ledger, uint256 const& txID) {
std::optional<TxMeta> ret;
if (auto meta = ledger.txRead(txID).second)
ret.emplace(txID, ledger.seq(), *meta);
return ret;
auto getMeta = [](ReadView const& ledger,
uint256 const& txID) -> std::shared_ptr<TxMeta> {
auto meta = ledger.txRead(txID).second;
if (!meta)
return {};
return std::make_shared<TxMeta>(txID, ledger.seq(), *meta);
};
auto validMetaData = getMeta(validLedger, tx);
auto builtMetaData = getMeta(builtLedger, tx);
assert(validMetaData != nullptr || builtMetaData != nullptr);
assert(validMetaData || builtMetaData);
if (validMetaData && builtMetaData)
if (validMetaData != nullptr && builtMetaData != nullptr)
{
auto const& validNodes = validMetaData->getNodes();
auto const& builtNodes = builtMetaData->getNodes();
@@ -279,21 +280,17 @@ log_metadata_difference(
<< validNodes.getJson(JsonOptions::none);
}
}
return;
}
if (validMetaData)
else if (validMetaData != nullptr)
{
JLOG(j.error()) << "MISMATCH on TX " << tx
<< ": Metadata Difference. Valid=\n"
<< ": Metadata Difference (built has none)\n"
<< validMetaData->getJson(JsonOptions::none);
}
if (builtMetaData)
else // builtMetaData != nullptr
{
JLOG(j.error()) << "MISMATCH on TX " << tx
<< ": Metadata Difference. Built=\n"
<< ": Metadata Difference (valid has none)\n"
<< builtMetaData->getJson(JsonOptions::none);
}
}
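
The `getMeta` lambda flips between two ways of expressing "maybe a TxMeta": `std::optional` keeps the value inline with no heap allocation, while `std::shared_ptr` heap-allocates but is cheap to copy around; both null-check the same way at the call sites below the lambda. A side-by-side sketch with a stand-in `TxMeta`:

```cpp
#include <iostream>
#include <memory>
#include <optional>
#include <string>

struct TxMeta
{
    std::string json;
};

std::optional<TxMeta> getMetaOpt(bool found)
{
    std::optional<TxMeta> ret;
    if (found)
        ret.emplace(TxMeta{"{...}"});  // constructed in place, no allocation
    return ret;
}

std::shared_ptr<TxMeta> getMetaPtr(bool found)
{
    if (!found)
        return {};                     // empty shared_ptr plays the null role
    return std::make_shared<TxMeta>(TxMeta{"{...}"});
}

int main()
{
    std::cout << std::boolalpha << getMetaOpt(true).has_value() << ' '
              << (getMetaPtr(false) != nullptr) << '\n';  // true false
}
```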


@@ -44,7 +44,7 @@ public:
@return `true` if the ledger was already tracked
*/
bool
insert(std::shared_ptr<Ledger const> const& ledger, bool validated);
insert(std::shared_ptr<Ledger const> ledger, bool validated);
/** Get the ledgers_by_hash cache hit rate
@return the hit rate
@@ -70,6 +70,8 @@ public:
LedgerHash
getLedgerHash(LedgerIndex ledgerIndex);
/** Remove stale cache entries
*/
void
sweep()
{


@@ -128,7 +128,7 @@ public:
getEarliestFetch();
bool
storeLedger(std::shared_ptr<Ledger const> ledger, bool pin = false);
storeLedger(std::shared_ptr<Ledger const> ledger);
void
setFullLedger(
@@ -152,15 +152,6 @@ public:
std::string
getCompleteLedgers();
std::string
getPinnedLedgers();
RangeSet<std::uint32_t>
getCompleteLedgersRangeSet();
RangeSet<std::uint32_t>
getPinnedLedgersRangeSet();
/** Apply held transactions to the open ledger
This is normally called as we close the ledger.
The open ledger remains open to handle new transactions
@@ -206,10 +197,7 @@ public:
getLedgerByHash(uint256 const& hash);
void
setLedgerRangePresent(
std::uint32_t minV,
std::uint32_t maxV,
bool pin = false /* if true, do not let these ledgers be removed */);
setLedgerRangePresent(std::uint32_t minV, std::uint32_t maxV);
std::optional<NetClock::time_point>
getCloseTimeBySeq(LedgerIndex ledgerIndex);
@@ -382,7 +370,6 @@ private:
std::recursive_mutex mCompleteLock;
RangeSet<std::uint32_t> mCompleteLedgers;
RangeSet<std::uint32_t> mPinnedLedgers; // Track pinned ledger ranges
// Publish thread is running.
bool mAdvanceThread{false};
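
The hunks above strip the pinned-ledger bookkeeping (`mPinnedLedgers`, `getPinnedLedgers`, the `pin` parameters): ranges marked as pinned were exempt from removal. A toy sketch of the idea, using a flat set of closed ranges where the real code uses an interval-based `RangeSet`:

```cpp
#include <cstdint>
#include <iostream>
#include <set>
#include <utility>

// Closed [lo, hi] ranges; a stand-in for RangeSet<std::uint32_t>.
using Ranges = std::set<std::pair<std::uint32_t, std::uint32_t>>;

bool isPinned(Ranges const& pinned, std::uint32_t seq)
{
    for (auto const& [lo, hi] : pinned)
        if (seq >= lo && seq <= hi)
            return true;  // pinned ledgers must not be removed
    return false;
}

int main()
{
    Ranges pinned{{100, 199}};
    std::cout << std::boolalpha << isPinned(pinned, 150) << ' '
              << isPinned(pinned, 250) << '\n';  // true false
}
```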

Some files were not shown because too many files have changed in this diff