Move away from Bazel (#2202)
(for upgrading users, please see the notes at the bottom)

Bazel brought a lot of nice things to the table, such as rebuilds based on content changes instead of modification times, caching of build products, detection of incorrect build rules via a sandbox, and so on. Rewriting the build in Bazel was also an opportunity to improve on the Makefile-based build we had prior, which was pretty poor: most dependencies were external or not pinned, and the build graph was poorly defined and mostly serialized. It was not uncommon for fresh checkouts to fail due to floating dependencies, or for things to break when trying to switch to an older commit.

For day-to-day development, I think Bazel served us reasonably well - we could generally switch between branches while being confident that builds would be correct and reasonably fast, and not require full rebuilds (except on Windows, where the lack of a sandbox and the TS rules would cause build breakages when TS files were renamed/removed).

Bazel achieves that reliability by defining rules for each programming language that describe how source files should be turned into outputs. For the rules to work with Bazel's sandboxing approach, they often have to reimplement or partially bypass the standard tools that each programming language provides. The Rust rules call Rust's compiler directly, for example, instead of using Cargo, and the Python rules extract each PyPI package into a separate folder that gets added to sys.path. These separate language rules allow proper declaration of inputs and outputs, and offer some advantages such as caching of build products and fine-grained dependency installation. But they also bring some downsides:

- The rules don't always support use-cases/platforms that the standard language tools do, meaning they need to be patched to be used. I've had to contribute a number of patches to the Rust, Python and JS rules to unblock various issues.
- The dependencies we use with each language sometimes make assumptions that do not hold in Bazel, meaning they either need to be pinned or patched, or the language rules need to be adjusted to accommodate them.

I was hopeful that after the initial setup work, things would be relatively smooth sailing. Unfortunately, that has not proved to be the case. Things frequently broke when dependencies or the language rules were updated, and I began to get frustrated at the amount of Anki development time I was instead spending on build system upkeep.

It's now about 2 years since switching to Bazel, and I think it's time to cut our losses and switch to something else that's a better fit. The new build system is based on a small build tool called Ninja, and some custom Rust code in build/. This means that to build Anki, Bazel is no longer required, but Ninja and Rust need to be installed on your system. Python and Node toolchains are automatically downloaded like in Bazel.

This new build system should result in faster builds in some cases:

- Because we're using cargo to build now, Rust builds are able to take advantage of pipelining and incremental debug builds, which we didn't have with Bazel. It's also easier to override the default linker on Linux/macOS, which can further improve speeds.
- External Rust crates are now built with opt=1, which improves performance of debug builds.
- Esbuild is now used to transpile TypeScript, instead of invoking the TypeScript compiler. This results in faster builds, by deferring typechecking to test/check time, and by allowing more work to happen in parallel.

As an example of the differences, when testing with the mold linker on Linux, adding a new message to tags.proto (which triggers a recompile of the bulk of the Rust and TypeScript code) results in a compile that goes from about 22s on Bazel to about 7s in the new system. With the standard linker, it's about 9s.
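The content-based rebuild detection credited to Bazel above (hashing inputs rather than trusting modification times, so branch switches don't force spurious rebuilds) can be sketched in a few lines. This is only an illustration of the idea, not how Bazel or Ninja are actually implemented, and the function names here are invented for the example.

```python
import hashlib
from pathlib import Path


def content_hash(path: Path) -> str:
    """Digest a file's bytes: unchanged content gives an unchanged hash,
    even if the modification time has been bumped."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def record_manifest(inputs: list[Path]) -> dict[str, str]:
    """Snapshot the hashes of a target's inputs after a successful build."""
    return {str(p): content_hash(p) for p in inputs}


def needs_rebuild(inputs: list[Path], manifest: dict[str, str]) -> bool:
    """Rebuild only when some input's content differs from the snapshot."""
    return any(manifest.get(str(p)) != content_hash(p) for p in inputs)
```

Touching a file without changing its bytes leaves `needs_rebuild` false, which is what makes switching between branches cheap compared to a mtime-only scheme.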
Some other changes of note:

- Our Rust workspace now uses cargo-hakari to ensure all packages agree on available features, preventing unnecessary rebuilds.
- pylib/anki is now a PEP 420 implicit namespace package, avoiding the need to merge source files and generated files into a single folder for running. By telling VSCode about the extra search path, code completion now works with generated files without needing to symlink them into the source folder.
- qt/aqt can't use PEP 420, as it's difficult to get rid of aqt/__init__.py. Instead, the generated files are now placed in a separate _aqt package that's added to the path.
- ts/lib is now exposed as @tslib, so the source code and generated code can be provided under the same namespace without a merging step.
- MyPy and PyLint are now invoked once for the entire codebase.
- dprint will be used to format TypeScript/JSON files in the future, instead of the slower Prettier (it is currently turned off to avoid causing conflicts). It can automatically defer to Prettier when formatting Svelte files.
- svelte-check is now used for typechecking our Svelte code, which revealed a few typing issues that went undetected with the old system.
- The Jest unit tests now work on Windows as well.

If you're upgrading from Bazel, updated usage instructions are in docs/development.md and docs/build.md. A summary of the changes:

- please remove node_modules and .bazel
- install rustup (https://rustup.rs/)
- install rsync if not already installed (on Windows, use pacman - see docs/windows.md)
- install Ninja (unzip from https://github.com/ninja-build/ninja/releases/tag/v1.11.1 and place on your path, or from your distro/homebrew if it's 1.10+)
- update .vscode/settings.json from .vscode.dist
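The PEP 420 change above is easiest to see with a toy example. The directory names below are hypothetical rather than Anki's actual layout; the point is that two roots can contribute modules to one package without any `__init__.py`, and therefore without a merge or symlink step between source files and generated files.

```python
import sys
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
src = root / "src"        # stands in for hand-written sources
gen = root / "generated"  # stands in for build products

# Neither pkg/ directory contains an __init__.py, which is what makes
# "pkg" a PEP 420 implicit namespace package.
for base in (src, gen):
    (base / "pkg").mkdir(parents=True)
(src / "pkg" / "handwritten.py").write_text("VALUE = 'source'\n")
(gen / "pkg" / "machine_made.py").write_text("VALUE = 'generated'\n")

# With both roots on sys.path, Python merges them into a single package.
sys.path[:0] = [str(src), str(gen)]

import pkg
from pkg import handwritten, machine_made
print(handwritten.VALUE, machine_made.VALUE)  # source generated
```

An editor pointed at both roots (via extra search paths) sees the same merged namespace, which is why code completion works against generated files without symlinks.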
parent 88362ba1f7
commit 5e0a761b87
@@ -1,2 +0,0 @@
node_modules
.bazel
.bazelrc
@@ -1,47 +0,0 @@
common --enable_platform_specific_config
common --experimental_repository_cache_hardlinks

# runfiles are off by default on Windows, and we need them
build --enable_runfiles

# skip the slow zip step on Windows, as we have symlinks
build:windows --build_python_zip=false

# record version/build hash
build --workspace_status_command='bash ./tools/status.sh'

# support macOS 10.13+
build:macos --action_env="MACOSX_DEPLOYMENT_TARGET=10.13"
build:macos --macos_minimum_os=10.13

# run clippy when compiling rust in test mode
test --aspects=@rules_rust//rust:defs.bzl%rust_clippy_aspect --output_groups=+clippy_checks

# print output when test fails
test --test_output=errors

# stop after one test failure
test --notest_keep_going

# don't add empty __init__.py files
build --incompatible_default_to_explicit_init_py

# custom output for CI
build:ci --show_timestamps --isatty=0 --color=yes --show_progress_rate_limit=5

# 'opt' config is an alias for building with optimizations
build:opt -c opt

# the TypeScript workers on Windows choke when deps are changed while they're
# still running, so shut them down at the end of the build.
build:windows --worker_quit_after_build

# place convenience symlinks inside a single folder for easier exclusion in IDEs
build --symlink_prefix=.bazel/

# if (auto-created) windows.bazelrc exists, import it
try-import %workspace%/windows.bazelrc

# allow extra user customizations in a separate file
# (see .user.bazelrc for an example)
try-import %workspace%/user.bazelrc
@@ -1 +0,0 @@
5.1.1
@@ -44,14 +44,13 @@ RUN apt-get update && apt install --yes gnupg ca-certificates && \
    portaudio19-dev \
    python3-dev \
    rsync \
    unzip \
    zstd \
    && rm -rf /var/lib/apt/lists/*


RUN curl -L https://github.com/bazelbuild/bazelisk/releases/download/v1.7.4/bazelisk-linux-amd64 \
    -o /usr/local/bin/bazel \
    && chmod +x /usr/local/bin/bazel

RUN ln -sf /usr/bin/python3 /usr/bin/python

RUN mkdir -p /etc/buildkite-agent/hooks && chown -R user /etc/buildkite-agent
@@ -59,6 +58,19 @@ RUN mkdir -p /etc/buildkite-agent/hooks && chown -R user /etc/buildkite-agent
COPY buildkite.cfg /etc/buildkite-agent/buildkite-agent.cfg
COPY environment /etc/buildkite-agent/hooks/environment

RUN curl -LO https://github.com/ninja-build/ninja/releases/download/v1.11.1/ninja-linux.zip \
    && unzip ninja-linux.zip \
    && chmod +x ninja \
    && mv ninja /usr/bin \
    && rm ninja-linux.zip

RUN mkdir /state/rust && chown user /state/rust

USER user

ENV CARGO_HOME=/state/rust/cargo
ENV RUSTUP_HOME=/state/rust/rustup
RUN curl https://sh.rustup.rs -sSf | sh -s -- -y --no-modify-path --default-toolchain none

WORKDIR /code/buildkite
ENTRYPOINT ["/usr/bin/buildkite-agent", "start"]
@@ -1,7 +1,7 @@
FROM debian:11-slim

ARG DEBIAN_FRONTEND="noninteractive"
ENV PYTHON_SITE_PACKAGES=/usr/lib/python3/dist-packages/
ENV PYTHONPATH=/usr/lib/python3/dist-packages

RUN useradd -d /state -m -u 998 user

@@ -46,13 +46,13 @@ RUN apt-get update && apt install --yes gnupg ca-certificates && \
    python3-dev \
    rsync \
    # -- begin only required for arm64/debian11
    ninja-build \
    clang-format \
    python-is-python3 \
    python3-pyqt5.qtwebengine \
    # -- end only required for arm64/debian11
    && rm -rf /var/lib/apt/lists/*


RUN curl -L https://github.com/bazelbuild/bazelisk/releases/download/v1.10.1/bazelisk-linux-arm64 \
    -o /usr/local/bin/bazel \
    && chmod +x /usr/local/bin/bazel
@@ -64,6 +64,13 @@ RUN mkdir -p /etc/buildkite-agent/hooks && chown -R user /etc/buildkite-agent
COPY buildkite.cfg /etc/buildkite-agent/buildkite-agent.cfg
COPY environment /etc/buildkite-agent/hooks/environment

RUN mkdir /state/rust && chown user /state/rust

USER user

ENV CARGO_HOME=/state/rust/cargo
ENV RUSTUP_HOME=/state/rust/rustup
RUN curl https://sh.rustup.rs -sSf | sh -s -- -y --no-modify-path --default-toolchain none

WORKDIR /code/buildkite
ENTRYPOINT ["/usr/bin/buildkite-agent", "start"]
@@ -4,3 +4,4 @@ build-path="/state/build"
hooks-path="/etc/buildkite-agent/hooks"
no-plugins=true
no-local-hooks=true
no-git-submodules=true
@@ -6,19 +6,11 @@ set -e
echo "--- Checking CONTRIBUTORS"
.buildkite/linux/check_contributors

BAZEL="bazel --output_user_root=/state/bazel --output_base=/state/bazel/anki"
BUILDARGS="--config=ci --disk_cache=/state/bazel/disk --repository_cache=/state/bazel/repo"

echo "+++ Building and testing"
ln -sf out/node_modules .

# move existing node_modules into tree
test -e /state/node_modules && mv /state/node_modules .
export PATH="$PATH:/state/rust/cargo/bin"
export BUILD_ROOT=/state/build
export ONLINE_TESTS=1

$BAZEL test $BUILDARGS ... //rslib/linkchecker

echo "--- Running lints"
python tools/copyright_headers.py

echo "--- Cleanup"
# if tests succeed, back up node_modules folder
mv node_modules /state/
./ninja pylib/anki qt/aqt check
@@ -2,17 +2,13 @@

set -e

# move existing node_modules into tree
test -e /state/node_modules && mv /state/node_modules .
export PATH="$PATH:/state/rust/cargo/bin"
export BUILD_ROOT=/state/build
export RELEASE=1
ln -sf out/node_modules .

if [ $(uname -m) = "aarch64" ]; then
    ./tools/build
    ./ninja wheels:anki
else
    ./tools/bundle
    ./ninja bundle
fi

rm -rf /state/dist
mv .bazel/out/dist /state

# if tests succeed, back up node_modules folder
mv node_modules /state/
@@ -4,18 +4,7 @@ set -e

STATE=$(pwd)/../state/anki-ci
mkdir -p $STATE
BAZEL="bazel --output_user_root=$STATE/bazel --output_base=$STATE/bazel/anki"
BUILDARGS="--config=ci --experimental_convenience_symlinks=ignore"

echo "+++ Building and testing"

# move existing node_modules into tree
test -e $STATE/node_modules && mv $STATE/node_modules .

$BAZEL test $BUILDARGS ...

echo "--- Building wheels"
$BAZEL build wheels

# if tests succeed, back up node_modules folder
mv node_modules $STATE/
ln -sf out/node_modules .
BUILD_ROOT=$STATE/build ./ninja pylib/anki qt/aqt wheels check
@@ -1,22 +1,16 @@
set BAZEL=\bazel\bazel.exe --output_user_root=\bazel\ankici --output_base=\bazel\ankici\base
set BUILDARGS=--config=ci
set PATH=c:\cargo\bin;%PATH%

echo +++ Building and testing

if exist \bazel\node_modules (
    move \bazel\node_modules .\node_modules
if exist \buildkite\state\out (
    move \buildkite\state\out .
)
if exist \buildkite\state\node_modules (
    move \buildkite\state\node_modules .
)

call %BAZEL% test %BUILDARGS% ...
IF %ERRORLEVEL% NEQ 0 (
    echo checking ts build
    call %BAZEL% build //ts/... || (
        echo ts build failed, cleaning up build products
        call %BAZEL% run tools:cleanup_js
    )

    exit /B 1
)
call tools\ninja build pylib/anki qt/aqt check || exit /b 1

echo --- Cleanup
move node_modules \bazel\node_modules
move out \buildkite\state\
move node_modules \buildkite\state\
.cargo/config.toml (new file)
@@ -0,0 +1,9 @@
[env]
STRINGS_JSON = { value = "out/rslib/i18n/strings.json", relative = true }
# build script will append .exe if necessary
PROTOC = { value = "out/extracted/protoc/bin/protoc", relative = true }
PYO3_NO_PYTHON = "1"
MACOSX_DEPLOYMENT_TARGET = "10.13.4"

[term]
color = "always"
.config/hakari.toml (new file)
@@ -0,0 +1,8 @@
hakari-package = "workspace-hack"
dep-format-version = "2"
resolver = "2"

[traversal-excludes]
third-party = [
    { name = "reqwest", git = "https://github.com/ankitects/reqwest.git", rev = "7591444614de02b658ddab125efba7b2bb4e2335" }
]
.config/nextest.toml (new file)
@@ -0,0 +1,2 @@
[store]
dir = "out/tests/nextest"
.dprint.json
@@ -14,7 +14,20 @@
    "semi": true
  },
  "includes": ["**/*.{ts,tsx,js,jsx,cjs,mjs,json,md,toml,svelte}"],
  "excludes": ["**/node_modules", "**/*-lock.json", "**/*.{ts,json,md,svelte}", "qt/aqt/data/web/js/vendor/*.js"],
  "excludes": [
    "**/node_modules",
    "out/**",
    "**/*-lock.json",
    "**/*.{ts,toml,svelte}",
    "qt/aqt/data/web/js/vendor/*.js",
    "ftl/qt-repo",
    "ftl/core-repo",
    "ftl/usage",
    "licenses.json",
    ".dmypy.json",
    "qt/bundle/PyOxidizer",
    "target"
  ],
  "plugins": [
    "https://plugins.dprint.dev/typescript-0.77.0.wasm",
    "https://plugins.dprint.dev/json-0.16.0.wasm",
@@ -1,4 +1,5 @@
module.exports = {
    root: true,
    extends: ["eslint:recommended", "plugin:compat/recommended"],
    parser: "@typescript-eslint/parser",
    plugins: [
@@ -21,6 +22,7 @@ module.exports = {
        "simple-import-sort/exports": "warn",
        "prefer-const": "warn",
        "no-nested-ternary": "warn",
        "@typescript-eslint/consistent-type-imports": "error",
    },
    overrides: [
        {
@@ -43,7 +45,7 @@ module.exports = {
        },
    ],
    env: { browser: true },
    ignorePatterns: ["backend_proto.d.ts", "*.svelte.d.ts"],
    ignorePatterns: ["backend_proto.d.ts", "*.svelte.d.ts", "vendor"],
    globals: {
        globalThis: false,
        NodeListOf: false,
.github/ISSUE_TEMPLATE/bug-report.md (vendored)
@@ -6,17 +6,17 @@ labels: ""
assignees: ""
---

- Have a question or feature suggestion?
- Problems building/running on your system?
- Not 100% sure you've found a bug?
- Have a question or feature suggestion?
- Problems building/running on your system?
- Not 100% sure you've found a bug?

If so, please post on https://forums.ankiweb.net/ instead. This issue tracker is
intended primarily to track development tasks, and it is easier to provide support
over on the forums. Please make sure you read the following pages before
you post there:

- https://faqs.ankiweb.net/when-problems-occur.html
- https://faqs.ankiweb.net/getting-help.html
- https://faqs.ankiweb.net/when-problems-occur.html
- https://faqs.ankiweb.net/getting-help.html

If you post questions, suggestions, or vague bug reports here, please do not be
offended if we close your ticket without replying. If in doubt, please post on
.gitignore (vendored)
@@ -4,8 +4,9 @@ anki.prof
target
/user.bazelrc
.dmypy.json
node_modules
/.idea/
/.vscode/
/.bazel
/windows.bazelrc
/out
node_modules
.gitmodules (vendored, new file)
@@ -0,0 +1,11 @@
[submodule "ftl/core-repo"]
    path = ftl/core-repo
    url = https://github.com/ankitects/anki-core-i18n.git
    shallow = true
[submodule "ftl/qt-repo"]
    path = ftl/qt-repo
    url = https://github.com/ankitects/anki-desktop-ftl.git
    shallow = true
[submodule "qt/bundle/PyOxidizer"]
    path = qt/bundle/PyOxidizer
    url = https://github.com/ankitects/PyOxidizer.git
@@ -1,10 +1,10 @@
[settings]
skip=aqt/forms,backend_pb2.py,backend_pb2.pyi,fluent_pb2.py,fluent_pb2.pyi,rsbackend_gen.py,generated.py,hooks_gen.py,genbackend.py
profile=black
multi_line_output=3
include_trailing_comma=True
force_grid_wrap=0
use_parentheses=True
line_length=88
ensure_newline_before_comments=true
known_first_party=tests,anki
force_grid_wrap=0
include_trailing_comma=True
known_first_party=anki,aqt,tests
line_length=88
multi_line_output=3
profile=black
skip=
use_parentheses=True
@@ -1,25 +1,47 @@
[mypy]
python_version = 3.9
pretty = false
no_strict_optional = true
show_error_codes = true
check_untyped_defs = true
disallow_untyped_decorators = True
warn_redundant_casts = True
warn_unused_configs = True
check_untyped_defs = true
disallow_untyped_defs = True
strict_equality = true
namespace_packages = true
explicit_package_bases = true
mypy_path =
    pylib,
    out/pylib,
    qt,
    out/qt,
    ftl,
    pylib/tools,
    python
exclude = (qt/bundle/PyOxidizer|pylib/anki/_vendor)

[mypy-anki.*]
disallow_untyped_defs = True
[mypy-anki.importing.*]
disallow_untyped_defs = False
[mypy-anki.exporting]
disallow_untyped_defs = False
[mypy-aqt.operations.*]
no_strict_optional = false

[mypy-aqt.winpaths]
disallow_untyped_defs=false
[mypy-anki.scheduler.base]
no_strict_optional = false
[mypy-anki._backend.rsbridge]
ignore_missing_imports = True
[mypy-anki._vendor.stringcase]
disallow_untyped_defs = False
[mypy-stringcase]
ignore_missing_imports = True
[mypy-aqt.mpv]
disallow_untyped_defs=false
ignore_errors=true
[mypy-aqt.winpaths]
disallow_untyped_defs=false

[mypy-pyaudio]
ignore_missing_imports = True
[mypy-win32file]
ignore_missing_imports = True
[mypy-win32pipe]
@@ -38,40 +60,24 @@ ignore_missing_imports = True
ignore_missing_imports = True
[mypy-bs4]
ignore_missing_imports = True
[mypy-pythoncom]
[mypy-fluent.*]
ignore_missing_imports = True
[mypy-win32com]
ignore_missing_imports = True
[mypy-send2trash]
ignore_missing_imports = True
[mypy-markdown]
ignore_missing_imports = True
[mypy-jsonschema.*]
ignore_missing_imports = True
[mypy-anki._rsbridge]
ignore_missing_imports = True
[mypy-PyQt5.sip]
[mypy-compare_locales.*]
ignore_missing_imports = True
[mypy-PyQt5.*]
ignore_errors = True
ignore_missing_imports = True
[mypy-win32com.client]
[mypy-send2trash]
ignore_missing_imports = True
[mypy-darkdetect]
[mypy-win32com.*]
ignore_missing_imports = True
[mypy-jsonschema.*]
ignore_missing_imports = True
[mypy-socks]
ignore_missing_imports = True
[mypy-stringcase]
[mypy-pythoncom]
ignore_missing_imports = True
[mypy-certifi]
[mypy-snakeviz.*]
ignore_missing_imports = True

[mypy-aqt.forms.*]
disallow_untyped_defs = false

[mypy-anki.*]
disallow_untyped_defs=false

[mypy-PyQt6.*]
ignore_errors = True
[mypy-wheel.*]
ignore_missing_imports = True
@@ -1,7 +1,8 @@
[MASTER]
ignore-patterns=.*_pb2.*
persistent = no
extension-pkg-whitelist=orjson
extension-pkg-whitelist=orjson,PyQt6
init-hook="import sys; sys.path.extend(['pylib/anki/_vendor', 'out/qt'])"

[REPORTS]
output-format=colorized
@@ -32,6 +33,7 @@ disable=
    arguments-differ,
    arguments-renamed,
    consider-using-f-string,
    invalid-name

[BASIC]
good-names =
@@ -42,4 +44,4 @@ good-names =
    ip,

[IMPORTS]
ignored-modules = anki.*_pb2, anki.sync_pb2
ignored-modules = anki.*_pb2, anki.sync_pb2, win32file,pywintypes,socket,win32pipe,winrt,pyaudio,anki.scheduler_pb2
@@ -2,3 +2,5 @@
# useful for manual invocation with 'cargo +nightly fmt'
imports_granularity = "Crate"
group_imports = "StdExternalCrate"
# imports_granularity = "Item"
# imports_layout = "Vertical"
@@ -1,13 +0,0 @@
# Copy the desired parts of this file into user.bazelrc in the repo dir
# if you'd like to customize the build. It will be ignored by git.

## specify custom python path
# build --action_env="PYO3_PYTHON=/usr/local/bin/python3.9"

## Cache build products for faster builds when switching between branches.
## Is not automatically pruned, so you need to manually remove it occasionally
## to free up disk space.
build --disk_cache=~/.cache/bazel/disk

## keep Bazel server in memory for 2 days for faster builds
startup --max_idle_secs=172800
@@ -1,9 +1,8 @@
{
    "recommendations": [
        "dprint.dprint",
        "esbenp.prettier-vscode",
        "bazelbuild.vscode-bazel",
        "ms-python.python",
        "ms-python.black-formatter",
        "rust-lang.rust-analyzer",
        "svelte.svelte-vscode",
        "zxh404.vscode-proto3",
@@ -11,7 +11,13 @@
        "**/node_modules/*/**": true,
        ".bazel/**": true
    },
    "python.analysis.extraPaths": ["./pylib"],
    "python.analysis.extraPaths": [
        "./pylib",
        "out/pylib",
        "./pylib/anki/_vendor",
        "out/qt",
        "qt"
    ],
    "python.formatting.provider": "black",
    "python.linting.mypyEnabled": false,
    "python.analysis.diagnosticSeverityOverrides": {
@@ -25,7 +31,9 @@
    "rust-analyzer.rustfmt.extraArgs": ["+nightly"],
    "search.exclude": {
        "**/node_modules": true,
        ".bazel/**": true
        ".bazel/**": true,
        "qt/bundle/PyOxidizer": true
    },
    "rust-analyzer.cargo.buildScripts.enable": true
    "rust-analyzer.cargo.buildScripts.enable": true,
    "python.analysis.typeCheckingMode": "off"
}
BUILD.bazel
@@ -1,39 +0,0 @@
load("@bazel_tools//tools/build_defs/pkg:pkg.bzl", "pkg_tar")

config_setting(
    name = "release",
    values = {
        "compilation_mode": "opt",
    },
)

genrule(
    name = "buildinfo",
    srcs = ["//:defs.bzl"],
    outs = ["buildinfo.txt"],
    cmd = select({
        "release": "$(location //tools:buildinfo) $(location //:defs.bzl) bazel-out/stable-status.txt release > $@",
        "//conditions:default": "$(location //tools:buildinfo) $(location //:defs.bzl) bazel-out/stable-status.txt devel > $@",
    }),
    stamp = 1,
    tools = [
        "//tools:buildinfo",
    ],
    visibility = ["//visibility:public"],
)

pkg_tar(
    name = "wheels",
    srcs = [
        "//pylib/anki:wheel",
        "//qt/aqt:wheel",
    ],
    mode = "0644",
    tags = ["manual"],
)

exports_files([
    "defs.bzl",
    "package.json",
    ".prettierrc",
])
Cargo.lock (generated, 879 changed lines): file diff suppressed because it is too large.
Cargo.toml
@@ -1,65 +1,51 @@
[package]
name = "anki_workspace"
[workspace.package]
version = "0.0.0"
authors = ["Ankitects Pty Ltd and contributors"]
authors = ["Ankitects Pty Ltd and contributors <https://help.ankiweb.net>"]
license = "AGPL-3.0-or-later"
rust-version = "1.64"
edition = "2021"

[workspace]
members = ["rslib", "rslib/i18n", "rslib/i18n_helpers", "rslib/linkchecker", "pylib/rsbridge"]
exclude = ["qt/bundle"]

[lib]
# dummy top level for tooling
name = "anki"
path = "rslib/empty.rs"

[package.metadata.raze]
workspace_path = "//cargo"
package_aliases_dir = "cargo"
rust_rules_workspace_name = "rules_rust"

# pull requests that add other targets (eg Arm Linux, FreeBSD) welcome - you'll
# need to update platforms/, BUILD.request.bazel and pylib/anki/BUILD.bazel as
# well.
targets = [
    "x86_64-apple-darwin",
    "x86_64-apple-ios",
    "x86_64-pc-windows-msvc",
    "x86_64-unknown-linux-gnu",
    "aarch64-apple-darwin",
    "aarch64-apple-ios",
    "aarch64-apple-ios-sim",
    "aarch64-unknown-linux-gnu",
members = [
    "rslib",
    "rslib/i18n",
    "rslib/i18n_helpers",
    "rslib/linkchecker",
    "pylib/rsbridge",
    "build/configure",
    "build/ninja_gen",
    "build/archives",
    "build/runner",
    "ftl",
    "tools/workspace-hack",
    "qt/bundle/win",
    "qt/bundle/mac",
]
genmode = "Remote"
default_gen_buildrs = true

[package.metadata.raze.crates.pyo3.'*']
compile_data_attr = "glob([\"**/*.md\"])"

[package.metadata.raze.crates.prost.'*']
compile_data_attr = "glob([\"**/*.md\"])"

[package.metadata.raze.crates.ring.'*']
compile_data_attr = "glob([\"src/**/*.der\"])"

[package.metadata.raze.crates.webpki.'*']
compile_data_attr = "glob([\"src/**/*.der\"])"

[package.metadata.raze.crates.unic-ucd-version.'*']
compile_data_attr = "glob([\"**/*.rsv\"])"

[package.metadata.raze.crates.unic-ucd-category.'*']
compile_data_attr = "glob([\"**/*.rsv\"])"

[package.metadata.raze.crates.bstr.'*']
compile_data_attr = "glob([\"**/*.dfa\"])"

[package.metadata.raze.crates.snafu.'*']
compile_data_attr = "glob([\"**/*.md\"])"

[package.metadata.raze.crates.pyo3-build-config.'*']
buildrs_additional_environment_variables = { "PYO3_NO_PYTHON" = "1" }
exclude = ["qt/bundle"]
resolver = "2"

[patch.crates-io]
# If updating rev, hakari.toml needs updating too.
reqwest = { git = "https://github.com/ankitects/reqwest.git", rev = "7591444614de02b658ddab125efba7b2bb4e2335" }

# Apply mild optimizations to our dependencies in dev mode, which among other things
# improves sha2 performance by about 21x. Opt 1 chosen due to
# https://doc.rust-lang.org/cargo/reference/profiles.html#overrides-and-generics. This
# applies to the dependencies of unit tests as well.
[profile.dev.package."*"]
opt-level = 1
debug = 0

[profile.dev.package.anki_i18n]
opt-level = 1
debug = 0

[profile.dev.package.runner]
opt-level = 1

# Debug info off by default, which speeds up incremental builds and produces a considerably
# smaller library.
[profile.dev.package.anki]
debug = 0
[profile.dev.package.rsbridge]
debug = 0
WORKSPACE
@@ -1,18 +0,0 @@
workspace(
    name = "ankidesktop",
    managed_directories = {"@npm": [
        "node_modules",
    ]},
)

load(":repos.bzl", "register_repos")

register_repos()

load(":defs.bzl", "setup_deps")

setup_deps()

load(":late_deps.bzl", "setup_late_deps")

setup_late_deps()
build/archives/Cargo.toml (new file)
@@ -0,0 +1,24 @@
[package]
name = "archives"
version = "0.0.0"
authors = ["Ankitects Pty Ltd"]
edition = "2021"

[dependencies]
camino = "1.1.1"
flate2 = "1.0.24"
sha2 = { version = "0.10.6" }
tar = "0.4.38"
tokio = { version = "1.21.2", features = ["macros", "rt-multi-thread"] }
workspace-hack = { version = "0.1", path = "../../tools/workspace-hack" }
xz2 = "0.1.7"
zip = "0.6.3"
zstd = "0.11.2"

[features]
rustls = ["reqwest/rustls-tls", "reqwest/rustls-tls-native-roots"]
native-tls = ["reqwest/native-tls"]

[dependencies.reqwest]
version = "0.11.3"
default-features = false
build/archives/src/main.rs (new file, 130 lines):

```rust
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{error::Error, fs, io::Read};

use camino::Utf8Path;
use sha2::Digest;

type Result<T> = std::result::Result<T, Box<dyn Error>>;

#[tokio::main]
async fn main() -> Result<()> {
    let args: Vec<_> = std::env::args().collect();

    let action = args[1].as_str();
    match action {
        "download" => {
            let archive_url = &args[2];
            let checksum = &args[3];
            let output_path = &args[4];
            download_and_check(archive_url, checksum, output_path).await?;
        }
        "extract" => {
            let archive_path = &args[2];
            let output_folder = &args[3];
            extract(archive_path, output_folder)?;
        }
        _ => panic!("unexpected action"),
    }

    Ok(())
}

async fn download_and_check(archive_url: &str, checksum: &str, output_path: &str) -> Result<()> {
    let response = reqwest::get(archive_url).await?.error_for_status()?;
    let data = response.bytes().await?.to_vec();
    let mut digest = sha2::Sha256::new();
    digest.update(&data);
    let result = digest.finalize();
    let actual_checksum = format!("{:x}", result);
    if actual_checksum != checksum {
        println!("expected {checksum}, got {actual_checksum}");
        std::process::exit(1);
    }
    fs::write(output_path, data)?;

    Ok(())
}

enum CompressionKind {
    Zstd,
    Gzip,
    Lzma,
    /// handled by archive
    Internal,
}

enum ArchiveKind {
    Tar,
    Zip,
}

fn extract(archive_path: &str, output_folder: &str) -> Result<()> {
    let archive_path = Utf8Path::new(archive_path);
    let archive_filename = archive_path.file_name().unwrap();
    let mut components = archive_filename.rsplit('.');
    let last_component = components.next().unwrap();
    let (compression, archive_suffix) = match last_component {
        "zst" | "zstd" => (CompressionKind::Zstd, components.next().unwrap()),
        "gz" => (CompressionKind::Gzip, components.next().unwrap()),
        "xz" => (CompressionKind::Lzma, components.next().unwrap()),
        "tgz" => (CompressionKind::Gzip, last_component),
        "zip" => (CompressionKind::Internal, last_component),
        other => panic!("unexpected compression: {other}"),
    };
    let archive = match archive_suffix {
        "tar" | "tgz" => ArchiveKind::Tar,
        "zip" => ArchiveKind::Zip,
        other => panic!("unexpected archive kind: {other}"),
    };

    let reader = fs::File::open(archive_path)?;
    let uncompressed_data = match compression {
        CompressionKind::Zstd => zstd::decode_all(&reader)?,
        CompressionKind::Gzip => {
            let mut buf = Vec::new();
            let mut decoder = flate2::read::GzDecoder::new(&reader);
            decoder.read_to_end(&mut buf)?;
            buf
        }
        CompressionKind::Lzma => {
            let mut buf = Vec::new();
            let mut decoder = xz2::read::XzDecoder::new(&reader);
            decoder.read_to_end(&mut buf)?;
            buf
        }
        CompressionKind::Internal => {
            vec![]
        }
    };

    let output_folder = Utf8Path::new(output_folder);
    if output_folder.exists() {
        fs::remove_dir_all(output_folder)?;
    }
    // extract into a temporary folder
    let output_tmp =
        output_folder.with_file_name(format!("{}.tmp", output_folder.file_name().unwrap()));
    match archive {
        ArchiveKind::Tar => {
            let mut archive = tar::Archive::new(&uncompressed_data[..]);
            archive.set_preserve_mtime(false);
            archive.unpack(&output_tmp)?;
        }
        ArchiveKind::Zip => {
            let mut archive = zip::ZipArchive::new(reader)?;
            archive.extract(&output_tmp)?;
        }
    }
    // if the output folder contains a single folder (eg foo-1.2), move it up a level
    let mut entries: Vec<_> = output_tmp.read_dir_utf8()?.take(2).collect();
    let first_entry = entries.pop().unwrap()?;
    if entries.is_empty() && first_entry.metadata()?.is_dir() {
        fs::rename(first_entry.path(), output_folder)?;
        fs::remove_dir_all(output_tmp)?;
    } else {
        fs::rename(output_tmp, output_folder)?;
    }
    Ok(())
}
```
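As a rough illustration of how `extract` decides what to do with a file, here is a standalone sketch that mirrors its suffix matching: the last extension selects the decompressor, and the one before it (if any) selects the archive format. The `classify` function and its string return values are illustrative only, not part of the build code:

```rust
/// Mirrors the suffix logic in `extract`: the final extension picks the
/// decompressor, the preceding one (when present) picks the archive kind.
fn classify(filename: &str) -> (&'static str, &'static str) {
    let mut parts = filename.rsplit('.');
    let last = parts.next().unwrap();
    let (compression, archive_suffix) = match last {
        "zst" | "zstd" => ("zstd", parts.next().unwrap()),
        "gz" => ("gzip", parts.next().unwrap()),
        "xz" => ("lzma", parts.next().unwrap()),
        // .tgz is gzip-compressed tar in a single extension
        "tgz" => ("gzip", last),
        // .zip handles its own compression internally
        "zip" => ("internal", last),
        other => panic!("unexpected compression: {other}"),
    };
    let archive = match archive_suffix {
        "tar" | "tgz" => "tar",
        "zip" => "zip",
        other => panic!("unexpected archive kind: {other}"),
    };
    (compression, archive)
}

fn main() {
    assert_eq!(classify("node-v18.tar.zst"), ("zstd", "tar"));
    assert_eq!(classify("python.tgz"), ("gzip", "tar"));
    assert_eq!(classify("protoc.zip"), ("internal", "zip"));
    println!("ok");
}
```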
build/configure/Cargo.toml (new file, 12 lines):

```toml
[package]
name = "configure"

version.workspace = true
authors.workspace = true
license.workspace = true
edition.workspace = true
rust-version.workspace = true

[dependencies]
ninja_gen = { path = "../ninja_gen" }
workspace-hack = { version = "0.1", path = "../../tools/workspace-hack" }
```
build/configure/src/aqt.rs (new file, 362 lines):

```rust
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use ninja_gen::{
    action::BuildAction,
    command::RunCommand,
    copy::CopyFiles,
    glob, hashmap, inputs,
    node::{CompileSass, EsbuildScript, TypescriptCheck},
    python::{python_format, PythonTest},
    Build, Result, Utf8Path, Utf8PathBuf,
};

use crate::{
    anki_version,
    python::BuildWheel,
    web::{copy_mathjax, eslint},
};

pub fn build_and_check_aqt(build: &mut Build) -> Result<()> {
    build_forms(build)?;
    build_generated_sources(build)?;
    build_data_folder(build)?;
    build_macos_helper(build)?;
    build_wheel(build)?;
    check_python(build)?;
    Ok(())
}

fn build_forms(build: &mut Build) -> Result<()> {
    let ui_files = glob!["qt/aqt/forms/*.ui"];
    let outdir = Utf8PathBuf::from("qt/_aqt/forms");
    let mut py_files = vec![];
    for path in ui_files.resolve() {
        let outpath = outdir.join(path.file_name().unwrap()).into_string();
        py_files.push(outpath.replace(".ui", "_qt5.py"));
        py_files.push(outpath.replace(".ui", "_qt6.py"));
    }
    build.add(
        "qt/aqt:forms",
        RunCommand {
            command: ":pyenv:bin",
            args: "$script $first_form",
            inputs: hashmap! {
                "script" => inputs!["qt/tools/build_ui.py"],
                "" => inputs![ui_files],
            },
            outputs: hashmap! {
                "first_form" => vec![py_files[0].as_str()],
                "" => py_files.iter().skip(1).map(|s| s.as_str()).collect(),
            },
        },
    )
}

/// For legacy reasons, we can not easily split sources and generated files
/// up with a PEP420 namespace, as aqt/__init__.py exports a bunch of things.
/// To allow code to run/typecheck without having to merge source and generated
/// files into a separate folder, the generated files are exported as a separate
/// _aqt module.
fn build_generated_sources(build: &mut Build) -> Result<()> {
    build.add(
        "qt/aqt:hooks.py",
        RunCommand {
            command: ":pyenv:bin",
            args: "$script $out",
            inputs: hashmap! {
                "script" => inputs!["qt/tools/genhooks_gui.py"],
                "" => inputs!["pylib/anki/_vendor/stringcase.py", "pylib/tools/hookslib.py"]
            },
            outputs: hashmap! {
                "out" => vec!["qt/_aqt/hooks.py"]
            },
        },
    )?;
    build.add(
        "qt/aqt:sass_vars",
        RunCommand {
            command: ":pyenv:bin",
            args: "$script $root_scss $out",
            inputs: hashmap! {
                "script" => inputs!["qt/tools/extract_sass_vars.py"],
                "root_scss" => inputs![":css:_root-vars"],
            },
            outputs: hashmap! {
                "out" => vec![
                    "qt/_aqt/colors.py",
                    "qt/_aqt/props.py"
                ]
            },
        },
    )?;
    Ok(())
}

fn build_data_folder(build: &mut Build) -> Result<()> {
    build_css(build)?;
    build_imgs(build)?;
    build_js(build)?;
    build_pages(build)?;
    build_icons(build)?;
    Ok(())
}

fn build_css(build: &mut Build) -> Result<()> {
    let scss_files = build.expand_inputs(inputs![glob!["qt/aqt/data/web/css/*.scss"]]);
    let out_dir = Utf8Path::new("qt/_aqt/data/web/css");
    for scss in scss_files {
        let stem = Utf8Path::new(&scss).file_stem().unwrap();
        let mut out_path = out_dir.join(stem);
        out_path.set_extension("css");

        build.add(
            "qt/aqt:data/web/css",
            CompileSass {
                input: scss.into(),
                output: out_path.as_str(),
                deps: inputs![":sass"],
                load_paths: vec![".", "node_modules"],
            },
        )?;
    }
    let other_ts_css = build.inputs_with_suffix(
        inputs![
            ":ts:editor",
            ":ts:pages:editable",
            ":ts:reviewer:reviewer.css"
        ],
        ".css",
    );
    build.add(
        "qt/aqt:data/web/css",
        CopyFiles {
            inputs: other_ts_css.into(),
            output_folder: "qt/_aqt/data/web/css",
        },
    )
}

fn build_imgs(build: &mut Build) -> Result<()> {
    build.add(
        "qt/aqt:data/web/imgs",
        CopyFiles {
            inputs: inputs![glob!["qt/aqt/data/web/imgs/*"]],
            output_folder: "qt/_aqt/data/web/imgs",
        },
    )
}

fn build_js(build: &mut Build) -> Result<()> {
    for ts_file in &["deckbrowser", "webview", "toolbar", "reviewer-bottom"] {
        build.add(
            "qt/aqt:data/web/js",
            EsbuildScript {
                script: "ts/transform_ts.mjs".into(),
                entrypoint: format!("qt/aqt/data/web/js/{ts_file}.ts").into(),
                deps: inputs![],
                output_stem: &format!("qt/_aqt/data/web/js/{ts_file}"),
                extra_exts: &[],
            },
        )?;
    }
    let files = inputs![glob!["qt/aqt/data/web/js/*"]];
    eslint(build, "aqt", "qt/aqt/data/web/js", files.clone())?;
    build.add(
        "check:typescript:aqt",
        TypescriptCheck {
            tsconfig: "qt/aqt/data/web/js/tsconfig.json".into(),
            inputs: files,
        },
    )?;
    let files_from_ts = build.inputs_with_suffix(
        inputs![":ts:editor", ":ts:reviewer:reviewer.js", ":ts:mathjax"],
        ".js",
    );
    build.add(
        "qt/aqt:data/web/js",
        CopyFiles {
            inputs: files_from_ts.into(),
            output_folder: "qt/_aqt/data/web/js",
        },
    )?;
    build_vendor_js(build)
}

fn build_vendor_js(build: &mut Build) -> Result<()> {
    build.add("qt/aqt:data/web/js/vendor:mathjax", copy_mathjax())?;
    build.add(
        "qt/aqt:data/web/js/vendor",
        CopyFiles {
            inputs: inputs![
                ":node_modules:jquery",
                ":node_modules:jquery-ui",
                ":node_modules:css-browser-selector",
                ":node_modules:bootstrap-dist"
            ],
            output_folder: "qt/_aqt/data/web/js/vendor",
        },
    )
}

fn build_pages(build: &mut Build) -> Result<()> {
    build.add(
        "qt/aqt:data/web/pages",
        CopyFiles {
            inputs: inputs![":ts:pages"],
            output_folder: "qt/_aqt/data/web/pages",
        },
    )?;
    Ok(())
}

fn build_icons(build: &mut Build) -> Result<()> {
    build_themed_icons(build)?;
    build.add(
        "qt/aqt:data/qt/icons:mdi_unthemed",
        CopyFiles {
            inputs: inputs![":node_modules:mdi_unthemed"],
            output_folder: "qt/_aqt/data/qt/icons",
        },
    )?;
    build.add(
        "qt/aqt:data/qt/icons:from_src",
        CopyFiles {
            inputs: inputs![glob!["qt/aqt/data/qt/icons/*.{png,svg}"]],
            output_folder: "qt/_aqt/data/qt/icons",
        },
    )?;
    build.add(
        "qt/aqt:data/qt/icons",
        RunCommand {
            command: ":pyenv:bin",
            args: "$script $out $in",
            inputs: hashmap! {
                "script" => inputs!["qt/tools/build_qrc.py"],
                "in" => inputs![
                    ":qt/aqt:data/qt/icons:mdi_unthemed",
                    ":qt/aqt:data/qt/icons:mdi_themed",
                    ":qt/aqt:data/qt/icons:from_src",
                ]
            },
            outputs: hashmap! {
                "out" => vec!["qt/_aqt/data/qt/icons.qrc"]
            },
        },
    )?;
    Ok(())
}

fn build_themed_icons(build: &mut Build) -> Result<()> {
    let themed_icons_with_extra = hashmap! {
        "chevron-up" => &["FG_DISABLED"],
        "chevron-down" => &["FG_DISABLED"],
        "drag-vertical" => &["FG_SUBTLE"],
        "drag-horizontal" => &["FG_SUBTLE"],
    };
    for icon_path in build.expand_inputs(inputs![":node_modules:mdi_themed"]) {
        let path = Utf8Path::new(&icon_path);
        let stem = path.file_stem().unwrap();
        let mut colors = vec!["FG"];
        if let Some(&extra) = themed_icons_with_extra.get(stem) {
            colors.extend(extra);
        }
        build.add(
            "qt/aqt:data/qt/icons:mdi_themed",
            BuildThemedIcon {
                src_icon: path,
                colors,
            },
        )?;
    }
    Ok(())
}

struct BuildThemedIcon<'a> {
    src_icon: &'a Utf8Path,
    colors: Vec<&'a str>,
}

impl BuildAction for BuildThemedIcon<'_> {
    fn command(&self) -> &str {
        "$pyenv_bin $script $in $colors $out"
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        let stem = self.src_icon.file_stem().unwrap();
        // eg foo-light.svg, foo-dark.svg, foo-FG_SUBTLE-light.svg, foo-FG_SUBTLE-dark.svg
        let outputs: Vec<_> = self
            .colors
            .iter()
            .flat_map(|&color| {
                let variant = if color == "FG" {
                    "".into()
                } else {
                    format!("-{color}")
                };
                [
                    format!("qt/_aqt/data/qt/icons/{stem}{variant}-light.svg"),
                    format!("qt/_aqt/data/qt/icons/{stem}{variant}-dark.svg"),
                ]
            })
            .collect();

        build.add_inputs("pyenv_bin", inputs![":pyenv:bin"]);
        build.add_inputs("script", inputs!["qt/tools/color_svg.py"]);
        build.add_inputs("in", inputs![self.src_icon.as_str()]);
        build.add_inputs("", inputs![":qt/aqt:sass_vars"]);
        build.add_variable("colors", self.colors.join(":"));
        build.add_outputs("out", outputs);
    }
}

fn build_macos_helper(build: &mut Build) -> Result<()> {
    if cfg!(target_os = "macos") {
        build.add(
            "qt/aqt:data/lib:libankihelper",
            RunCommand {
                command: ":pyenv:bin",
                args: "$script $out $in",
                inputs: hashmap! {
                    "script" => inputs!["qt/mac/helper_build.py"],
                    "in" => inputs![glob!["qt/mac/*.swift"]],
                },
                outputs: hashmap! {
                    "out" => vec!["qt/_aqt/data/lib/libankihelper.dylib"],
                },
            },
        )?;
    }
    Ok(())
}

fn build_wheel(build: &mut Build) -> Result<()> {
    build.add(
        "wheels:aqt",
        BuildWheel {
            name: "aqt",
            version: anki_version(),
            src_folder: "qt/aqt",
            gen_folder: "$builddir/qt/_aqt",
            platform: None,
            deps: inputs![":qt/aqt", glob!("qt/aqt/**"), "python/requirements.aqt.in"],
        },
    )
}

fn check_python(build: &mut Build) -> Result<()> {
    python_format(
        build,
        "qt",
        inputs![glob!("qt/**/*.py", "qt/bundle/PyOxidizer/**")],
    )?;

    build.add(
        "check:pytest:aqt",
        PythonTest {
            folder: "qt/tests",
            python_path: &["pylib", "$builddir/pylib", "$builddir/qt"],
            deps: inputs![":pylib/anki", ":qt/aqt", glob!["qt/tests/**"]],
        },
    )
}
```
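The output naming in `BuildThemedIcon::files` above is easy to misread, so here is a standalone sketch of just that part: every colour except plain `FG` contributes a suffix, and each colour yields a light and a dark variant. The `icon_outputs` helper is illustrative, not part of the build code:

```rust
/// Mirrors the output naming in BuildThemedIcon: plain FG gets no suffix,
/// other colours do, and each produces -light and -dark variants.
fn icon_outputs(stem: &str, colors: &[&str]) -> Vec<String> {
    colors
        .iter()
        .flat_map(|&color| {
            let variant = if color == "FG" {
                String::new()
            } else {
                format!("-{color}")
            };
            [
                format!("{stem}{variant}-light.svg"),
                format!("{stem}{variant}-dark.svg"),
            ]
        })
        .collect()
}

fn main() {
    // chevron-up is one of the icons with an extra FG_DISABLED colour
    assert_eq!(
        icon_outputs("chevron-up", &["FG", "FG_DISABLED"]),
        [
            "chevron-up-light.svg",
            "chevron-up-dark.svg",
            "chevron-up-FG_DISABLED-light.svg",
            "chevron-up-FG_DISABLED-dark.svg",
        ]
    );
    println!("ok");
}
```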
build/configure/src/bundle.rs (new file, 481 lines):

```rust
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use ninja_gen::{
    action::BuildAction,
    archives::{download_and_extract, empty_manifest, with_exe, OnlineArchive},
    cargo::{CargoBuild, RustOutput},
    git::SyncSubmodule,
    glob,
    input::BuildInput,
    inputs,
    python::PythonEnvironment,
    Build, Result, Utf8Path,
};

use crate::{
    anki_version,
    platform::{overriden_python_target_platform, overriden_rust_target_triple},
};

#[derive(Debug, PartialEq, Eq)]
enum DistKind {
    Standard,
    Alternate,
}

impl DistKind {
    fn folder_name(&self) -> &'static str {
        match self {
            DistKind::Standard => "std",
            DistKind::Alternate => "alt",
        }
    }

    fn name(&self) -> &'static str {
        match self {
            DistKind::Standard => "standard",
            DistKind::Alternate => "alternate",
        }
    }
}

pub fn build_bundle(build: &mut Build, python_binary: &BuildInput) -> Result<()> {
    // install into venv
    setup_primary_venv(build, python_binary)?;
    install_anki_wheels(build)?;

    // bundle venv into output binary + extra_files
    build_pyoxidizer(build)?;
    build_artifacts(build)?;
    build_binary(build)?;

    // package up outputs with Qt/other deps
    download_dist_folder_deps(build)?;
    build_dist_folder(build, DistKind::Standard)?;

    // repeat for Qt5
    if !targetting_macos_arm() {
        if !cfg!(target_os = "macos") {
            setup_qt5_venv(build, python_binary)?;
        }
        build_dist_folder(build, DistKind::Alternate)?;
    }

    build_packages(build)?;

    Ok(())
}

fn targetting_macos_arm() -> bool {
    cfg!(all(target_os = "macos", target_arch = "aarch64"))
        && overriden_python_target_platform().is_none()
}

const WIN_AUDIO: OnlineArchive = OnlineArchive {
    url: "https://github.com/ankitects/anki-bundle-extras/releases/download/anki-2022-02-09/audio-win-amd64.tar.gz",
    sha256: "0815a601baba05e03bc36b568cdc2332b1cf4aa17125fc33c69de125f8dd687f",
};

const MAC_ARM_AUDIO: OnlineArchive = OnlineArchive {
    url: "https://github.com/ankitects/anki-bundle-extras/releases/download/anki-2022-05-26/audio-mac-arm64.tar.gz",
    sha256: "f6c4af9be59ae1c82a16f5c6307f13cbf31b49ad7b69ce1cb6e0e7b403cfdb8f",
};

const MAC_AMD_AUDIO: OnlineArchive = OnlineArchive {
    url: "https://github.com/ankitects/anki-bundle-extras/releases/download/anki-2022-05-26/audio-mac-amd64.tar.gz",
    sha256: "ecbb3c878805cdd58b1a0b8e3fd8c753b8ce3ad36c8b5904a79111f9db29ff42",
};

const MAC_ARM_QT6: OnlineArchive = OnlineArchive {
    url: "https://github.com/ankitects/anki-bundle-extras/releases/download/anki-2022-10-10/pyqt6.4-mac-arm64.tar.gz",
    sha256: "96f5b3e64f3eeebbb8c60f85d547bbe21a3e8dfbc1135286fcd37482c8c4d87b",
};

const MAC_AMD_QT6: OnlineArchive = OnlineArchive {
    url: "https://github.com/ankitects/anki-bundle-extras/releases/download/anki-2022-10-10/pyqt6.4-mac-amd64.tar.gz",
    sha256: "6da02be0ffbbbdb5db80c1c65d01bdbf0207c04378019fcf6109796adc97916e",
};

const MAC_AMD_QT5: OnlineArchive = OnlineArchive {
    url: "https://github.com/ankitects/anki-bundle-extras/releases/download/anki-2022-02-09/pyqt5.14-mac-amd64.tar.gz",
    sha256: "474951bed79ddb9570ee4c5a6079041772551ea77e77171d9e33d6f5e7877ec1",
};

const LINUX_QT_PLUGINS: OnlineArchive = OnlineArchive {
    url: "https://github.com/ankitects/anki-bundle-extras/releases/download/anki-2022-02-09/qt-plugins-linux-amd64.tar.gz",
    sha256: "cbfb41fb750ae19b381f8137bd307e1167fdc68420052977f6e1887537a131b0",
};

fn download_dist_folder_deps(build: &mut Build) -> Result<()> {
    let mut bundle_deps = vec![":wheels"];
    if cfg!(windows) {
        download_and_extract(build, "win_amd64_audio", WIN_AUDIO, empty_manifest())?;
        bundle_deps.push(":extract:win_amd64_audio");
    } else if cfg!(target_os = "macos") {
        if targetting_macos_arm() {
            download_and_extract(build, "mac_arm_audio", MAC_ARM_AUDIO, empty_manifest())?;
            download_and_extract(build, "mac_arm_qt6", MAC_ARM_QT6, empty_manifest())?;
            bundle_deps.extend([":extract:mac_arm_audio", ":extract:mac_arm_qt6"]);
        } else {
            download_and_extract(build, "mac_amd_audio", MAC_AMD_AUDIO, empty_manifest())?;
            download_and_extract(build, "mac_amd_qt6", MAC_AMD_QT6, empty_manifest())?;
            download_and_extract(build, "mac_amd_qt5", MAC_AMD_QT5, empty_manifest())?;
            bundle_deps.extend([
                ":extract:mac_amd_audio",
                ":extract:mac_amd_qt6",
                ":extract:mac_amd_qt5",
            ]);
        }
    } else {
        download_and_extract(
            build,
            "linux_qt_plugins",
            LINUX_QT_PLUGINS,
            empty_manifest(),
        )?;
        bundle_deps.extend([":extract:linux_qt_plugins"]);
    }
    build.add_inputs_to_group(
        "bundle:deps",
        inputs![bundle_deps
            .iter()
            .map(ToString::to_string)
            .collect::<Vec<_>>()],
    );
    Ok(())
}

struct Venv {
    label: &'static str,
    path_without_builddir: &'static str,
}

impl Venv {
    fn label_as_target(&self, suffix: &str) -> String {
        format!(":{}{suffix}", self.label)
    }
}

const PRIMARY_VENV: Venv = Venv {
    label: "bundle:pyenv",
    path_without_builddir: "bundle/pyenv",
};

/// Only used for copying Qt libs on Windows/Linux.
const QT5_VENV: Venv = Venv {
    label: "bundle:pyenv-qt5",
    path_without_builddir: "bundle/pyenv-qt5",
};

fn setup_primary_venv(build: &mut Build, python_binary: &BuildInput) -> Result<()> {
    let mut qt6_reqs = inputs![
        "python/requirements.bundle.txt",
        "python/requirements.qt6.txt",
    ];
    if cfg!(windows) {
        qt6_reqs = inputs![qt6_reqs, "python/requirements.win.txt"];
    }
    build.add(
        PRIMARY_VENV.label,
        PythonEnvironment {
            folder: PRIMARY_VENV.path_without_builddir,
            base_requirements_txt: "python/requirements.base.txt".into(),
            requirements_txt: qt6_reqs,
            python_binary,
            extra_binary_exports: &[],
        },
    )?;
    Ok(())
}

fn setup_qt5_venv(build: &mut Build, python_binary: &BuildInput) -> Result<()> {
    let qt5_reqs = inputs![
        "python/requirements.base.txt",
        if cfg!(target_os = "macos") {
            "python/requirements.qt5_14.txt"
        } else {
            "python/requirements.qt5_15.txt"
        }
    ];
    build.add(
        QT5_VENV.label,
        PythonEnvironment {
            folder: QT5_VENV.path_without_builddir,
            base_requirements_txt: "python/requirements.base.txt".into(),
            requirements_txt: qt5_reqs,
            python_binary,
            extra_binary_exports: &[],
        },
    )
}

struct InstallAnkiWheels {
    venv: Venv,
}

impl BuildAction for InstallAnkiWheels {
    fn command(&self) -> &str {
        "$pip install --force-reinstall --no-deps $in"
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        build.add_inputs("pip", inputs![self.venv.label_as_target(":pip")]);
        build.add_inputs("in", inputs![":wheels"]);
        build.add_output_stamp("bundle/wheels.stamp");
    }
}

fn install_anki_wheels(build: &mut Build) -> Result<()> {
    build.add(
        "bundle:add_wheels:qt6",
        InstallAnkiWheels { venv: PRIMARY_VENV },
    )?;
    Ok(())
}

fn build_pyoxidizer(build: &mut Build) -> Result<()> {
    build.add(
        "bundle:pyoxidizer:repo",
        SyncSubmodule {
            path: "qt/bundle/PyOxidizer",
        },
    )?;
    build.add(
        "bundle:pyoxidizer:bin",
        CargoBuild {
            inputs: inputs![":bundle:pyoxidizer:repo", glob!["qt/bundle/PyOxidizer/**"]],
            // can't use ::Binary() here, as we're in a separate workspace
            outputs: &[RustOutput::Data(
                "bin",
                &with_exe("bundle/rust/release/pyoxidizer"),
            )],
            target: None,
            extra_args: &format!(
                "--manifest-path={} --target-dir={} -p pyoxidizer",
                "qt/bundle/PyOxidizer/Cargo.toml", "$builddir/bundle/rust"
            ),
            release_override: Some(true),
        },
    )?;
    Ok(())
}

struct BuildArtifacts {}

impl BuildAction for BuildArtifacts {
    fn command(&self) -> &str {
        "$runner build-artifacts $bundle_root $pyoxidizer_bin"
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        build.add_inputs("pyoxidizer_bin", inputs![":bundle:pyoxidizer:bin"]);
        build.add_inputs("", inputs![PRIMARY_VENV.label_as_target("")]);
        build.add_inputs("", inputs![":bundle:add_wheels:qt6", glob!["qt/bundle/**"]]);
        build.add_variable("bundle_root", "$builddir/bundle");
        build.add_outputs_ext(
            "pyo3_config",
            vec!["bundle/artifacts/pyo3-build-config-file.txt"],
            true,
        );
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}

fn build_artifacts(build: &mut Build) -> Result<()> {
    build.add("bundle:artifacts", BuildArtifacts {})
}

struct BuildBundle {}

impl BuildAction for BuildBundle {
    fn command(&self) -> &str {
        "$runner build-bundle-binary"
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        build.add_inputs("", inputs![":bundle:artifacts", glob!["qt/bundle/**"]]);
        build.add_outputs(
            "",
            vec![RustOutput::Binary("anki").path(
                Utf8Path::new("$builddir/bundle/rust"),
                overriden_rust_target_triple(),
                true,
            )],
        );
    }
}

fn build_binary(build: &mut Build) -> Result<()> {
    build.add("bundle:binary", BuildBundle {})
}

struct BuildDistFolder {
    kind: DistKind,
    deps: BuildInput,
}

impl BuildAction for BuildDistFolder {
    fn command(&self) -> &str {
        "$runner build-dist-folder $kind $out_folder "
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        build.add_inputs("", &self.deps);
        build.add_variable("kind", self.kind.name());
        let folder = match self.kind {
            DistKind::Standard => "bundle/std",
            DistKind::Alternate => "bundle/alt",
        };
        build.add_outputs("out_folder", vec![folder]);
        build.add_outputs("stamp", vec![format!("{folder}.stamp")]);
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}

fn build_dist_folder(build: &mut Build, kind: DistKind) -> Result<()> {
    let mut deps = inputs![":bundle:deps", ":bundle:binary", glob!["qt/bundle/**"]];
    if kind == DistKind::Alternate && !cfg!(target_os = "macos") {
        deps = inputs![deps, QT5_VENV.label_as_target("")];
    }
    let group = match kind {
        DistKind::Standard => "bundle:folder:std",
        DistKind::Alternate => "bundle:folder:alt",
    };
    build.add(group, BuildDistFolder { kind, deps })
}

fn build_packages(build: &mut Build) -> Result<()> {
    if cfg!(windows) {
        build_windows_installers(build)
    } else if cfg!(target_os = "macos") {
        build_mac_app(build, DistKind::Standard)?;
        if !targetting_macos_arm() {
            build_mac_app(build, DistKind::Alternate)?;
        }
        build_dmgs(build)
    } else {
        build_tarball(build, DistKind::Standard)?;
        build_tarball(build, DistKind::Alternate)
    }
}

struct BuildTarball {
    kind: DistKind,
}

impl BuildAction for BuildTarball {
    fn command(&self) -> &str {
        "chmod -R a+r $folder && tar -I '$zstd' --transform $transform -cf $tarball -C $folder ."
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        let input_folder_name = self.kind.folder_name();
        let input_folder_target = format!(":bundle:folder:{input_folder_name}");
        let input_folder_path = format!("$builddir/bundle/{input_folder_name}");

        let version = anki_version();
        let qt = match self.kind {
            DistKind::Standard => "qt6",
            DistKind::Alternate => "qt5",
        };
        let output_folder_base = format!("anki-{version}-linux-{qt}");
        let output_tarball = format!("bundle/package/{output_folder_base}.tar.zst");

        build.add_inputs("", inputs![input_folder_target]);
        build.add_variable("zstd", "zstd -c --long -T0 -18");
        build.add_variable("transform", format!("s%^.%{output_folder_base}%"));
        build.add_variable("folder", input_folder_path);
        build.add_outputs("tarball", vec![output_tarball]);
    }
}

fn build_tarball(build: &mut Build, kind: DistKind) -> Result<()> {
    let name = kind.folder_name();
    build.add(format!("bundle:package:{name}"), BuildTarball { kind })
}

struct BuildWindowsInstallers {}

impl BuildAction for BuildWindowsInstallers {
    fn command(&self) -> &str {
        "cargo run -p makeinstall --target-dir=out/rust -- $version $src_root $bundle_root $out"
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        let version = anki_version();
        let outputs = ["qt6", "qt5"].iter().map(|qt| {
            let output_base = format!("anki-{version}-windows-{qt}");
            format!("bundle/package/{output_base}.exe")
        });

        build.add_inputs("", inputs![":bundle:folder:std", ":bundle:folder:alt"]);
        build.add_variable("version", version);
        build.add_variable("bundle_root", "$builddir/bundle");
        build.add_outputs("out", outputs);
    }
}

fn build_windows_installers(build: &mut Build) -> Result<()> {
    build.add("bundle:package", BuildWindowsInstallers {})
}

struct BuildMacApp {
    kind: DistKind,
}

impl BuildAction for BuildMacApp {
    fn command(&self) -> &str {
        "cargo run -p makeapp --target-dir=out/rust -- build-app $version $kind $stamp"
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        let folder_name = self.kind.folder_name();
        build.add_inputs("", inputs![format!(":bundle:folder:{folder_name}")]);
        build.add_variable("version", anki_version());
        build.add_variable("kind", self.kind.name());
        build.add_outputs("stamp", vec![format!("bundle/app/{folder_name}.stamp")]);
    }
}

fn build_mac_app(build: &mut Build, kind: DistKind) -> Result<()> {
    build.add(format!("bundle:app:{}", kind.name()), BuildMacApp { kind })
}

struct BuildDmgs {}

impl BuildAction for BuildDmgs {
    fn command(&self) -> &str {
        "cargo run -p makeapp --target-dir=out/rust -- build-dmgs $dmgs"
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        let version = anki_version();
        let platform = if targetting_macos_arm() {
            "apple"
        } else {
            "intel"
        };
        let qt = if targetting_macos_arm() {
            &["qt6"][..]
        } else {
            &["qt6", "qt5"]
        };
        let dmgs = qt
            .iter()
            .map(|qt| format!("bundle/dmg/anki-{version}-mac-{platform}-{qt}.dmg"));

        build.add_inputs("", inputs![":bundle:app"]);
        build.add_outputs("dmgs", dmgs);
    }
}

fn build_dmgs(build: &mut Build) -> Result<()> {
    build.add("bundle:dmg", BuildDmgs {})
}
```
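To make the packaging naming scheme above concrete, this small sketch mirrors how `BuildTarball` derives the Linux artifact path from the Anki version and the dist kind (`standard` maps to qt6, `alternate` to qt5). The `tarball_name` helper and its string-typed `kind` parameter are illustrative only:

```rust
/// Mirrors the output naming in BuildTarball: standard builds ship Qt6,
/// alternate builds ship Qt5, and both land under bundle/package/.
fn tarball_name(version: &str, kind: &str) -> String {
    let qt = match kind {
        "standard" => "qt6",
        "alternate" => "qt5",
        other => panic!("unknown kind: {other}"),
    };
    format!("bundle/package/anki-{version}-linux-{qt}.tar.zst")
}

fn main() {
    assert_eq!(
        tarball_name("2.1.55", "standard"),
        "bundle/package/anki-2.1.55-linux-qt6.tar.zst"
    );
    assert_eq!(
        tarball_name("2.1.55", "alternate"),
        "bundle/package/anki-2.1.55-linux-qt5.tar.zst"
    );
    println!("ok");
}
```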
build/configure/src/main.rs (new file, 52 lines):

```rust
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

mod aqt;
mod bundle;
mod platform;
mod proto;
mod pylib;
mod python;
mod rust;
mod web;

use aqt::build_and_check_aqt;
use bundle::build_bundle;
use ninja_gen::{Build, Result};
use pylib::{build_pylib, check_pylib};
use python::{check_copyright, check_python, setup_python, setup_venv};
use rust::{build_rust, check_rust};
use web::{build_and_check_web, check_sql};

use crate::proto::check_proto;

fn anki_version() -> &'static str {
    include_str!("../../../.version").trim()
}

fn main() -> Result<()> {
    let mut build = Build::new()?;
    let build = &mut build;

    let python_binary = setup_python(build)?;
    setup_venv(build, &python_binary)?;

    build_rust(build)?;
    build_pylib(build)?;
    build_and_check_web(build)?;
    build_and_check_aqt(build)?;
    build_bundle(build, &python_binary)?;

    check_rust(build)?;
    check_pylib(build)?;
    check_python(build)?;
    check_proto(build)?;
    check_sql(build)?;
    check_copyright(build)?;

    build.trailing_text = "default pylib/anki qt/aqt\n".into();

    build.write_build_file();

    Ok(())
}
```
build/configure/src/platform.rs (new file, +22 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::env;

use ninja_gen::archives::Platform;

/// Usually None to use the host architecture; can be overridden by setting
/// MAC_X86 to build for x86_64 on Apple Silicon.
pub fn overriden_rust_target_triple() -> Option<&'static str> {
    overriden_python_target_platform().map(|p| p.as_rust_triple())
}

/// Usually None to use the host architecture; can be overridden by setting
/// MAC_X86 to build for x86_64 on Apple Silicon.
pub fn overriden_python_target_platform() -> Option<Platform> {
    if env::var("MAC_X86").is_ok() {
        Some(Platform::MacX64)
    } else {
        None
    }
}
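The override above keys purely on whether the MAC_X86 environment variable is set, not on its value. A minimal self-contained mirror of that behaviour, with a local `Platform` enum standing in for `ninja_gen::archives::Platform`:

```rust
use std::env;

// Local stand-in for ninja_gen::archives::Platform, for illustration only.
#[derive(Debug, PartialEq)]
enum Platform {
    MacX64,
}

// Mirrors overriden_python_target_platform(): any value works, since only
// the variable's presence is checked.
fn overridden_target() -> Option<Platform> {
    if env::var("MAC_X86").is_ok() {
        Some(Platform::MacX64)
    } else {
        None
    }
}

fn main() {
    env::remove_var("MAC_X86");
    assert_eq!(overridden_target(), None);
    env::set_var("MAC_X86", "1");
    assert_eq!(overridden_target(), Some(Platform::MacX64));
    println!("ok");
}
```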
build/configure/src/proto.rs (new file, +39 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use ninja_gen::{
    archives::{download_and_extract, with_exe},
    glob, hashmap, inputs,
    protobuf::{protoc_archive, ClangFormat},
    Build, Result,
};

pub fn download_protoc(build: &mut Build) -> Result<()> {
    download_and_extract(
        build,
        "protoc",
        protoc_archive(build.host_platform),
        hashmap! {
            "bin" => [with_exe("bin/protoc")]
        },
    )?;
    Ok(())
}

pub fn check_proto(build: &mut Build) -> Result<()> {
    build.add(
        "check:format:proto",
        ClangFormat {
            inputs: inputs![glob!["proto/**/*.proto"]],
            check_only: true,
        },
    )?;
    build.add(
        "format:proto",
        ClangFormat {
            inputs: inputs![glob!["proto/**/*.proto"]],
            check_only: false,
        },
    )?;
    Ok(())
}
build/configure/src/pylib.rs (new file, +131 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use ninja_gen::{
    action::BuildAction,
    archives::Platform,
    command::RunCommand,
    copy::LinkFile,
    glob, hashmap, inputs,
    python::{python_format, PythonTest},
    Build, Result,
};

use crate::{
    platform::overriden_python_target_platform,
    python::{BuildWheel, GenPythonProto},
};

pub fn build_pylib(build: &mut Build) -> Result<()> {
    // generated files
    build.add(
        "pylib/anki:proto",
        GenPythonProto {
            proto_files: inputs![glob!["proto/anki/*.proto"]],
        },
    )?;
    build.add(
        "pylib/anki:_backend_generated.py",
        RunCommand {
            command: ":pyenv:bin",
            args: "$script $out",
            inputs: hashmap! {
                "script" => inputs!["pylib/tools/genbackend.py"],
                "" => inputs!["pylib/anki/_vendor/stringcase.py", ":pylib/anki:proto"]
            },
            outputs: hashmap! {
                "out" => vec!["pylib/anki/_backend_generated.py"]
            },
        },
    )?;
    build.add(
        "pylib/anki:_fluent.py",
        RunCommand {
            command: ":pyenv:bin",
            args: "$script $strings $out",
            inputs: hashmap! {
                "script" => inputs!["pylib/tools/genfluent.py"],
                "strings" => inputs![":rslib/i18n:strings.json"],
                "" => inputs!["pylib/anki/_vendor/stringcase.py"]
            },
            outputs: hashmap! {
                "out" => vec!["pylib/anki/_fluent.py"]
            },
        },
    )?;
    build.add(
        "pylib/anki:hooks_gen.py",
        RunCommand {
            command: ":pyenv:bin",
            args: "$script $out",
            inputs: hashmap! {
                "script" => inputs!["pylib/tools/genhooks.py"],
                "" => inputs!["pylib/anki/_vendor/stringcase.py", "pylib/tools/hookslib.py"]
            },
            outputs: hashmap! {
                "out" => vec!["pylib/anki/hooks_gen.py"]
            },
        },
    )?;
    build.add(
        "pylib/anki:_rsbridge",
        LinkFile {
            input: inputs![":pylib/rsbridge"],
            output: &format!(
                "pylib/anki/_rsbridge.{}",
                match build.host_platform {
                    Platform::WindowsX64 => "pyd",
                    _ => "so",
                }
            ),
        },
    )?;
    build.add("pylib/anki:buildinfo.py", GenBuildInfo {})?;

    // wheel
    build.add(
        "wheels:anki",
        BuildWheel {
            name: "anki",
            version: include_str!("../../../.version").trim(),
            src_folder: "pylib/anki",
            gen_folder: "$builddir/pylib/anki",
            platform: overriden_python_target_platform().or(Some(build.host_platform)),
            deps: inputs![
                ":pylib/anki",
                glob!("pylib/anki/**"),
                "python/requirements.anki.in"
            ],
        },
    )?;
    Ok(())
}

pub fn check_pylib(build: &mut Build) -> Result<()> {
    python_format(build, "pylib", inputs![glob!("pylib/**/*.py")])?;

    build.add(
        "check:pytest:pylib",
        PythonTest {
            folder: "pylib/tests",
            python_path: &["$builddir/pylib"],
            deps: inputs![":pylib/anki", glob!["pylib/{anki,tests}/**"]],
        },
    )
}

pub struct GenBuildInfo {}

impl BuildAction for GenBuildInfo {
    fn command(&self) -> &str {
        "$pyenv_bin $script $version_file $buildhash_file $out"
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        build.add_inputs("pyenv_bin", inputs![":pyenv:bin"]);
        build.add_inputs("script", inputs!["pylib/tools/genbuildinfo.py"]);
        build.add_inputs("version_file", inputs![".version"]);
        build.add_inputs("buildhash_file", inputs!["$builddir/buildhash"]);
        build.add_outputs("out", vec!["pylib/anki/buildinfo.py"]);
    }
}
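The `LinkFile` rule for `pylib/anki:_rsbridge` picks the extension-module suffix per platform: CPython loads `.pyd` files on Windows and `.so` files elsewhere. A trivial sketch of that naming (the `rsbridge_output` helper is hypothetical, introduced only to isolate the suffix choice):

```rust
// Sketch of the extension-module naming used by the LinkFile rule:
// CPython imports pylib/anki/_rsbridge.pyd on Windows, _rsbridge.so elsewhere.
fn rsbridge_output(windows: bool) -> String {
    format!(
        "pylib/anki/_rsbridge.{}",
        if windows { "pyd" } else { "so" }
    )
}

fn main() {
    assert_eq!(rsbridge_output(true), "pylib/anki/_rsbridge.pyd");
    assert_eq!(rsbridge_output(false), "pylib/anki/_rsbridge.so");
    println!("ok");
}
```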
build/configure/src/python.rs (new file, +334 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::env;

use ninja_gen::{
    action::BuildAction,
    archives::{download_and_extract, OnlineArchive, Platform},
    build::FilesHandle,
    command::RunCommand,
    glob, hashmap,
    input::BuildInput,
    inputs,
    python::{python_format, PythonEnvironment, PythonLint, PythonTypecheck},
    rsync::RsyncFiles,
    Build, Result, Utf8Path,
};

fn python_archive(platform: Platform) -> OnlineArchive {
    match platform {
        Platform::LinuxX64 => {
            OnlineArchive {
                url: "https://github.com/indygreg/python-build-standalone/releases/download/20221106/cpython-3.9.15+20221106-x86_64_v2-unknown-linux-gnu-install_only.tar.gz",
                sha256: "436c35bd809abdd028f386cc623ae020c77e6b544eaaca405098387c4daa444c",
            }
        }
        Platform::LinuxArm => {
            OnlineArchive {
                url: "https://github.com/ankitects/python-build-standalone/releases/download/anki-2022-02-18/cpython-3.9.10-aarch64-unknown-linux-gnu-install_only-20220218T1329.tar.gz",
                sha256: "39070f9b9492dce3085c8c98916940434bb65663e6665b2c87bef86025532c1a",
            }
        }
        Platform::MacX64 => {
            OnlineArchive {
                url: "https://github.com/indygreg/python-build-standalone/releases/download/20211012/cpython-3.9.7-x86_64-apple-darwin-install_only-20211011T1926.tar.gz",
                sha256: "43cb1a83919f49b1ce95e42f9c461d8a9fb00ff3957bebef9cffe61a5f362377",
            }
        }
        Platform::MacArm => {
            OnlineArchive {
                url: "https://github.com/indygreg/python-build-standalone/releases/download/20221106/cpython-3.9.15+20221106-aarch64-apple-darwin-install_only.tar.gz",
                sha256: "64dc7e1013481c9864152c3dd806c41144c79d5e9cd3140e185c6a5060bdc9ab",
            }
        }
        Platform::WindowsX64 => {
            OnlineArchive {
                url: "https://github.com/indygreg/python-build-standalone/releases/download/20211012/cpython-3.9.7-x86_64-pc-windows-msvc-shared-install_only-20211011T1926.tar.gz",
                sha256: "80370f232fd63d5cb3ff9418121acb87276228b0dafbeee3c57af143aca11f89",
            }
        }
    }
}

/// Returns the Python binary, which can be used to create venvs.
/// Downloads if missing.
pub fn setup_python(build: &mut Build) -> Result<BuildInput> {
    // if changing this, make sure you remove out/pyenv
    let python_binary = match env::var("PYTHON_BINARY") {
        Ok(path) => {
            assert!(
                Utf8Path::new(&path).is_absolute(),
                "PYTHON_BINARY must be absolute"
            );
            path.into()
        }
        Err(_) => {
            download_and_extract(
                build,
                "python",
                python_archive(build.host_platform),
                hashmap! { "bin" => [
                    if cfg!(windows) { "python.exe" } else { "bin/python3" }
                ] },
            )?;
            inputs![":extract:python:bin"]
        }
    };
    Ok(python_binary)
}

pub fn setup_venv(build: &mut Build, python_binary: &BuildInput) -> Result<()> {
    let requirements_txt = if cfg!(windows) {
        inputs![
            "python/requirements.dev.txt",
            "python/requirements.qt6.txt",
            "python/requirements.win.txt",
        ]
    } else if cfg!(all(target_os = "linux", target_arch = "aarch64")) {
        inputs!["python/requirements.dev.txt"]
    } else {
        inputs!["python/requirements.dev.txt", "python/requirements.qt6.txt"]
    };
    build.add(
        "pyenv",
        PythonEnvironment {
            folder: "pyenv",
            base_requirements_txt: inputs!["python/requirements.base.txt"],
            requirements_txt,
            python_binary,
            extra_binary_exports: &[
                "pip-compile",
                "pip-sync",
                "mypy",
                "black",
                "isort",
                "pylint",
                "pytest",
                "protoc-gen-mypy",
            ],
        },
    )?;

    // optional venvs for testing with Qt5
    let mut reqs_qt5 = inputs!["python/requirements.bundle.txt"];
    if cfg!(windows) {
        reqs_qt5 = inputs![reqs_qt5, "python/requirements.win.txt"];
    }

    build.add(
        "pyenv-qt5.15",
        PythonEnvironment {
            folder: "pyenv-qt5.15",
            base_requirements_txt: inputs!["python/requirements.base.txt"],
            requirements_txt: inputs![&reqs_qt5, "python/requirements.qt5_15.txt"],
            python_binary,
            extra_binary_exports: &[],
        },
    )?;
    build.add(
        "pyenv-qt5.14",
        PythonEnvironment {
            folder: "pyenv-qt5.14",
            base_requirements_txt: inputs!["python/requirements.base.txt"],
            requirements_txt: inputs![reqs_qt5, "python/requirements.qt5_14.txt"],
            python_binary,
            extra_binary_exports: &[],
        },
    )?;

    Ok(())
}

pub struct GenPythonProto {
    pub proto_files: BuildInput,
}

impl BuildAction for GenPythonProto {
    fn command(&self) -> &str {
        "$protoc $
        --plugin=protoc-gen-mypy=$protoc-gen-mypy $
        --python_out=$builddir/pylib $
        --mypy_out=$builddir/pylib $
        -Iproto $in"
    }

    fn files(&mut self, build: &mut impl ninja_gen::build::FilesHandle) {
        let proto_inputs = build.expand_inputs(&self.proto_files);
        let python_outputs: Vec<_> = proto_inputs
            .iter()
            .flat_map(|path| {
                let path = path
                    .replace('\\', "/")
                    .replace("proto/", "pylib/")
                    .replace(".proto", "_pb2");
                [format!("{path}.py"), format!("{path}.pyi")]
            })
            .collect();
        build.add_inputs("in", &self.proto_files);
        build.add_inputs("protoc", inputs![":extract:protoc:bin"]);
        build.add_inputs("protoc-gen-mypy", inputs![":pyenv:protoc-gen-mypy"]);
        build.add_outputs("", python_outputs);
    }
}
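The output mapping in `GenPythonProto::files()` derives two Python artifacts per .proto file by string substitution. A standalone check of that mapping (the `python_outputs` helper is an extracted copy for illustration, not the build crate's API):

```rust
// proto/anki/foo.proto -> pylib/anki/foo_pb2.py and pylib/anki/foo_pb2.pyi,
// normalising Windows path separators first.
fn python_outputs(path: &str) -> [String; 2] {
    let stem = path
        .replace('\\', "/")
        .replace("proto/", "pylib/")
        .replace(".proto", "_pb2");
    [format!("{stem}.py"), format!("{stem}.pyi")]
}

fn main() {
    let outs = python_outputs("proto/anki/tags.proto");
    assert_eq!(outs[0], "pylib/anki/tags_pb2.py");
    assert_eq!(outs[1], "pylib/anki/tags_pb2.pyi");
    println!("ok");
}
```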

pub struct BuildWheel {
    pub name: &'static str,
    pub version: &'static str,
    pub src_folder: &'static str,
    pub gen_folder: &'static str,
    pub platform: Option<Platform>,
    pub deps: BuildInput,
}

impl BuildAction for BuildWheel {
    fn command(&self) -> &str {
        "$pyenv_bin $script $src $gen $out"
    }

    fn files(&mut self, build: &mut impl FilesHandle) {
        build.add_inputs("pyenv_bin", inputs![":pyenv:bin"]);
        build.add_inputs("script", inputs!["python/write_wheel.py"]);
        build.add_inputs("", &self.deps);
        build.add_variable("src", self.src_folder);
        build.add_variable("gen", self.gen_folder);

        let tag = if let Some(platform) = self.platform {
            let platform = match platform {
                Platform::LinuxX64 => "manylinux_2_28_x86_64",
                Platform::LinuxArm => "manylinux_2_31_aarch64",
                Platform::MacX64 => "macosx_10_13_x86_64",
                Platform::MacArm => "macosx_11_0_arm64",
                Platform::WindowsX64 => "win_amd64",
            };
            format!("cp39-abi3-{platform}")
        } else {
            "py3-none-any".into()
        };
        let name = self.name;
        let version = self.version;
        let wheel_path = format!("wheels/{name}-{version}-{tag}.whl");
        build.add_outputs("out", vec![wheel_path]);
    }
}
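The wheel filename combines the package name, version, and a compatibility tag: `cp39-abi3-<platform>` for platform wheels, or `py3-none-any` for pure-Python ones. A minimal sketch of that computation (the `wheel_path` helper and the version string are illustrative):

```rust
// Sketch of the wheel filename logic from BuildWheel::files(). The platform
// string would be one of the literals from the match in that function.
fn wheel_path(name: &str, version: &str, platform: Option<&str>) -> String {
    let tag = match platform {
        Some(p) => format!("cp39-abi3-{p}"),
        None => "py3-none-any".to_string(),
    };
    format!("wheels/{name}-{version}-{tag}.whl")
}

fn main() {
    assert_eq!(
        wheel_path("anki", "2.1.55", Some("manylinux_2_28_x86_64")),
        "wheels/anki-2.1.55-cp39-abi3-manylinux_2_28_x86_64.whl"
    );
    assert_eq!(
        wheel_path("aqt", "2.1.55", None),
        "wheels/aqt-2.1.55-py3-none-any.whl"
    );
    println!("ok");
}
```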

pub fn check_python(build: &mut Build) -> Result<()> {
    python_format(build, "ftl", inputs![glob!("ftl/**/*.py")])?;
    python_format(build, "tools", inputs![glob!("tools/**/*.py")])?;

    build.add(
        "check:mypy",
        PythonTypecheck {
            folders: &[
                "pylib",
                "ts/lib",
                "qt/aqt",
                "qt/tools",
                "out/pylib/anki",
                "out/qt/_aqt",
                "ftl",
                "python",
                "tools",
            ],
            deps: inputs![glob!["{pylib,ftl,qt}/**/*.{py,pyi}"], ":pylib/anki"],
        },
    )?;

    add_pylint(build)?;

    Ok(())
}

fn add_pylint(build: &mut Build) -> Result<()> {
    // pylint does not support PEP420 implicit namespaces split across import paths,
    // so we need to merge our pylib sources and generated files before invoking it,
    // and add a top-level __init__.py
    build.add(
        "pylint/anki",
        RsyncFiles {
            inputs: inputs![":pylib/anki"],
            target_folder: "pylint/anki",
            strip_prefix: "$builddir/pylib/anki",
            // avoid copying our large rsbridge binary
            extra_args: "--links",
        },
    )?;
    build.add(
        "pylint/anki",
        RsyncFiles {
            inputs: inputs![glob!["pylib/anki/**"]],
            target_folder: "pylint/anki",
            strip_prefix: "pylib/anki",
            extra_args: "",
        },
    )?;
    build.add(
        "pylint/anki",
        RunCommand {
            command: ":pyenv:bin",
            args: "$script $out",
            inputs: hashmap! { "script" => inputs!["python/mkempty.py"] },
            outputs: hashmap! { "out" => vec!["pylint/anki/__init__.py"] },
        },
    )?;
    build.add(
        "check:pylint",
        PythonLint {
            folders: &[
                "$builddir/pylint/anki",
                "qt/aqt",
                "ftl",
                "pylib/tools",
                "tools",
                "python",
            ],
            pylint_ini: inputs![".pylintrc"],
            deps: inputs![
                ":pylint/anki",
                glob!("{pylib/tools,ftl,qt,python,tools}/**/*.py")
            ],
        },
    )?;

    Ok(())
}

pub fn check_copyright(build: &mut Build) -> Result<()> {
    let script = inputs!["tools/copyright_headers.py"];
    let files = inputs![glob![
        "{build,rslib,pylib,qt,ftl,python,sass,tools,ts}/**/*.{py,rs,ts,svelte,mjs}",
        "qt/bundle/PyOxidizer/**"
    ]];
    build.add(
        "check:copyright",
        RunCommand {
            command: "$runner",
            args: "run --stamp=$out $pyenv_bin $script check",
            inputs: hashmap! {
                "pyenv_bin" => inputs![":pyenv:bin"],
                "script" => script.clone(),
                "" => files.clone(),
            },
            outputs: hashmap! {
                "out" => vec!["tests/copyright.check.marker"]
            },
        },
    )?;
    build.add(
        "fix:copyright",
        RunCommand {
            command: "$runner",
            args: "run --stamp=$out $pyenv_bin $script fix",
            inputs: hashmap! {
                "pyenv_bin" => inputs![":pyenv:bin"],
                "script" => script,
                "" => files,
            },
            outputs: hashmap! {
                "out" => vec!["tests/copyright.fix.marker"]
            },
        },
    )?;
    Ok(())
}
build/configure/src/rust.rs (new file, +134 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use ninja_gen::{
    cargo::{CargoBuild, CargoClippy, CargoFormat, CargoRun, CargoTest, RustOutput},
    git::SyncSubmodule,
    glob, inputs, Build, Result,
};

use crate::{platform::overriden_rust_target_triple, proto::download_protoc};

pub fn build_rust(build: &mut Build) -> Result<()> {
    prepare_translations(build)?;
    download_protoc(build)?;
    build_rsbridge(build)
}

fn prepare_translations(build: &mut Build) -> Result<()> {
    // ensure repos are checked out
    build.add(
        "ftl:repo:core",
        SyncSubmodule {
            path: "ftl/core-repo",
        },
    )?;
    build.add(
        "ftl:repo:qt",
        SyncSubmodule {
            path: "ftl/qt-repo",
        },
    )?;
    // build anki_i18n and spit out strings.json
    build.add(
        "rslib/i18n",
        CargoBuild {
            inputs: inputs![
                glob!["rslib/i18n/**"],
                glob!["ftl/{core,core-repo,qt,qt-repo}/**"],
                ":ftl:repo",
            ],
            outputs: &[RustOutput::Data(
                "strings.json",
                "$builddir/rslib/i18n/strings.json",
            )],
            target: None,
            extra_args: "-p anki_i18n",
            release_override: None,
        },
    )?;

    build.add(
        "ftl:sync",
        CargoRun {
            binary_name: "ftl-sync",
            cargo_args: "-p ftl",
            bin_args: "",
            deps: inputs![":ftl:repo", glob!["ftl/{core,core-repo,qt,qt-repo}/**"]],
        },
    )?;

    Ok(())
}

fn build_rsbridge(build: &mut Build) -> Result<()> {
    let features = if cfg!(target_os = "linux") {
        "rustls"
    } else {
        "native-tls"
    };
    build.add(
        "pylib/rsbridge",
        CargoBuild {
            inputs: inputs![
                glob!["{pylib/rsbridge/**,rslib/**,proto/**}"],
                ":extract:protoc:bin",
                // declare a dependency on i18n so it gets built first, allowing
                // things depending on strings.json to build faster, and ensuring
                // changes to the ftl files trigger a rebuild
                ":rslib/i18n",
                // when env vars change the build hash gets updated
                "$builddir/build.ninja",
                // building on Windows requires python3.lib
                if cfg!(windows) {
                    inputs![":extract:python"]
                } else {
                    inputs![]
                }
            ],
            outputs: &[RustOutput::DynamicLib("rsbridge")],
            target: overriden_rust_target_triple(),
            extra_args: &format!("-p rsbridge --features {features}"),
            release_override: None,
        },
    )
}

pub fn check_rust(build: &mut Build) -> Result<()> {
    let inputs = inputs![
        glob!("{rslib/**,pylib/rsbridge/**,build/**,tools/workspace-hack/**}"),
        "Cargo.lock",
        "Cargo.toml",
        "rust-toolchain.toml",
    ];
    build.add(
        "check:format:rust",
        CargoFormat {
            inputs: inputs.clone(),
            check_only: true,
        },
    )?;
    build.add(
        "format:rust",
        CargoFormat {
            inputs: inputs.clone(),
            check_only: false,
        },
    )?;

    let inputs = inputs![
        inputs,
        // defer tests until build has completed; ensure re-run on changes
        ":pylib/rsbridge"
    ];

    build.add(
        "check:clippy",
        CargoClippy {
            inputs: inputs.clone(),
        },
    )?;
    build.add("check:rust_test", CargoTest { inputs })?;

    Ok(())
}
build/configure/src/web.rs (new file, +575 lines; excerpt below)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

// use super::*;
use ninja_gen::{
    action::BuildAction,
    command::RunCommand,
    glob, hashmap,
    input::BuildInput,
    inputs,
    node::{
        node_archive, CompileSass, DPrint, EsbuildScript, Eslint, GenTypescriptProto, JestTest,
        SqlFormat, SvelteCheck, TypescriptCheck,
    },
    rsync::RsyncFiles,
    Build, Result,
};

pub fn build_and_check_web(build: &mut Build) -> Result<()> {
    setup_node(build)?;
    build_sass(build)?;
    build_and_check_tslib(build)?;
    declare_and_check_other_libraries(build)?;
    build_and_check_pages(build)?;
    build_and_check_editor(build)?;
    build_and_check_reviewer(build)?;
    build_and_check_mathjax(build)?;
    check_web(build)?;

    Ok(())
}

fn setup_node(build: &mut Build) -> Result<()> {
    ninja_gen::node::setup_node(
        build,
        node_archive(build.host_platform),
        &[
            "dprint",
            "svelte-check",
            "eslint",
            "sass",
            "tsc",
            "tsx",
            "pbjs",
            "pbts",
            "jest",
        ],
        hashmap! {
            "jquery" => vec![
                "jquery/dist/jquery.min.js".into()
            ],
            "jquery-ui" => vec![
                "jquery-ui-dist/jquery-ui.min.js".into()
            ],
            "css-browser-selector" => vec![
                "css-browser-selector/css_browser_selector.min.js".into(),
            ],
            "bootstrap-dist" => vec![
                "bootstrap/dist/js/bootstrap.bundle.min.js".into(),
            ],
            "mathjax" => MATHJAX_FILES.iter().map(|&v| v.into()).collect(),
            "mdi_unthemed" => [
                // saved searches
                "heart-outline.svg",
                // today
                "clock-outline.svg",
                // state
                "circle.svg",
                "circle-outline.svg",
                // flags
                "flag-variant.svg",
                "flag-variant-outline.svg",
                "flag-variant-off-outline.svg",
                // decks
                "book-outline.svg",
                "book-clock-outline.svg",
                "book-cog-outline.svg",
                // notetypes
                "newspaper.svg",
                // cardtype
                "application-braces-outline.svg",
                // fields
                "form-textbox.svg",
                // tags
                "tag-outline.svg",
                "tag-off-outline.svg",
            ].iter().map(|file| format!("@mdi/svg/svg/{file}").into()).collect(),
            "mdi_themed" => [
                // sidebar tools
                "magnify.svg",
                "selection-drag.svg",
                // QComboBox arrows
                "chevron-up.svg",
                "chevron-down.svg",
                // QHeaderView arrows
                "menu-up.svg",
                "menu-down.svg",
                // drag handle
                "drag-vertical.svg",
                "drag-horizontal.svg",
                // checkbox
                "check.svg",
                "minus-thick.svg",
                // QRadioButton
                "circle-medium.svg",
            ].iter().map(|file| format!("@mdi/svg/svg/{file}").into()).collect(),
        },
    )?;
    Ok(())
}

fn build_and_check_tslib(build: &mut Build) -> Result<()> {
    build.add(
        "ts:lib:i18n",
        RunCommand {
            command: ":pyenv:bin",
            args: "$script $strings $out",
            inputs: hashmap! {
                "script" => inputs!["ts/lib/genfluent.py"],
                "strings" => inputs![":rslib/i18n:strings.json"],
                "" => inputs!["pylib/anki/_vendor/stringcase.py"]
            },
            outputs: hashmap! {
                "out" => vec![
                    "ts/lib/ftl.js",
                    "ts/lib/ftl.d.ts",
                    "ts/lib/i18n/modules.js",
                    "ts/lib/i18n/modules.d.ts"
                ]
            },
        },
    )?;
    build.add(
        "ts:lib:backend_proto.d.ts",
        GenTypescriptProto {
            protos: inputs![glob!["proto/anki/*.proto"]],
            output_stem: "ts/lib/backend_proto",
        },
    )?;

    let src_files = inputs![glob!["ts/lib/**"]];
    eslint(build, "lib", "ts/lib", inputs![":ts:lib", &src_files])?;

    build.add(
        "check:jest:lib",
        jest_test("ts/lib", inputs![":ts:lib", &src_files], true),
    )?;

    build.add_inputs_to_group("ts:lib", src_files);

    Ok(())
}

fn jest_test(folder: &str, deps: BuildInput, jsdom: bool) -> impl BuildAction + '_ {
    JestTest {
        folder,
        deps,
        jest_rc: "ts/jest.config.js".into(),
        jsdom,
    }
}

fn declare_and_check_other_libraries(build: &mut Build) -> Result<()> {
    for (library, inputs) in [
        ("sveltelib", inputs![":ts:lib", glob!("ts/sveltelib/**")]),
        ("domlib", inputs![":ts:lib", glob!("ts/domlib/**")]),
        (
            "components",
            inputs![":ts:lib", ":ts:sveltelib", glob!("ts/components/**")],
        ),
        ("html-filter", inputs![glob!("ts/html-filter/**")]),
    ] {
        let library_with_ts = format!("ts:{library}");
        let folder = library_with_ts.replace(':', "/");
        build.add_inputs_to_group(&library_with_ts, inputs.clone());
        eslint(build, library, &folder, inputs.clone())?;

        if matches!(library, "domlib" | "html-filter") {
            build.add(
                &format!("check:jest:{library}"),
                jest_test(&folder, inputs, true),
            )?;
        }
    }

    eslint(
        build,
        "sql_format",
        "ts/sql_format",
        inputs![glob!("ts/sql_format/**")],
    )?;

    Ok(())
}

pub fn eslint(build: &mut Build, name: &str, folder: &str, deps: BuildInput) -> Result<()> {
    let eslint_rc = inputs![".eslintrc.js"];
    build.add(
        format!("check:eslint:{name}"),
        Eslint {
            folder,
            inputs: deps.clone(),
            eslint_rc: eslint_rc.clone(),
            fix: false,
        },
    )?;
    build.add(
        format!("fix:eslint:{name}"),
        Eslint {
            folder,
            inputs: deps,
            eslint_rc,
            fix: true,
        },
    )?;
    Ok(())
}

fn build_and_check_pages(build: &mut Build) -> Result<()> {
    build.add_inputs_to_group("ts:tag-editor", inputs![glob!["ts/tag-editor/**"]]);

    let mut build_page = |name: &str, html: bool, deps: BuildInput| -> Result<()> {
        let group = format!("ts:pages:{name}");
        let deps = inputs![deps, glob!(format!("ts/{name}/**"))];
        let extra_exts = if html { &["css", "html"][..] } else { &["css"] };
        build.add(
            &group,
            EsbuildScript {
                script: inputs!["ts/bundle_svelte.mjs"],
                entrypoint: inputs![format!("ts/{name}/index.ts")],
                output_stem: &format!("ts/{name}/{name}"),
                deps: deps.clone(),
                extra_exts,
            },
        )?;
        build.add(
            format!("check:svelte:{name}"),
            SvelteCheck {
                tsconfig: inputs![format!("ts/{name}/tsconfig.json")],
                inputs: deps.clone(),
            },
        )?;
        let folder = format!("ts/{name}");
        eslint(build, name, &folder, deps.clone())?;
        if matches!(name, "deck-options" | "change-notetype") {
            build.add(
                &format!("check:jest:{name}"),
                jest_test(&folder, deps, false),
            )?;
        }

        Ok(())
    };
    build_page(
        "congrats",
        true,
        inputs![
            //
            ":ts:lib",
            ":ts:components",
            ":sass",
        ],
    )?;
    build_page(
        "deck-options",
        true,
        inputs![
            //
            ":ts:lib",
            ":ts:components",
            ":ts:sveltelib",
            ":sass",
        ],
    )?;
    build_page(
        "graphs",
        true,
        inputs![
            //
            ":ts:lib",
            ":ts:components",
            ":sass",
        ],
    )?;
    build_page(
        "card-info",
        true,
        inputs![
            //
            ":ts:lib",
            ":ts:components",
            ":sass",
        ],
    )?;
    build_page(
        "change-notetype",
        true,
        inputs![
            //
            ":ts:lib",
            ":ts:components",
            ":ts:sveltelib",
            ":sass",
        ],
    )?;
    build_page(
        "import-csv",
        true,
        inputs![
            //
            ":ts:lib",
            ":ts:components",
            ":ts:sveltelib",
            ":ts:tag-editor",
            ":sass"
        ],
    )?;
    // we use the generated .css file separately
    build_page(
        "editable",
        false,
        inputs![
            //
            ":ts:lib",
            ":ts:components",
            ":ts:domlib",
            ":ts:sveltelib",
            ":sass"
        ],
    )?;

    Ok(())
}

fn build_and_check_editor(build: &mut Build) -> Result<()> {
    let editor_deps = inputs![
        //
        ":ts:lib",
        ":ts:components",
        ":ts:domlib",
        ":ts:sveltelib",
        ":ts:tag-editor",
        ":ts:html-filter",
        ":sass",
        glob!("ts/{editable,editor}/**")
    ];

    let mut build_editor_page = |name: &str, entrypoint: &str| -> Result<()> {
        let stem = format!("ts/editor/{name}");
        build.add(
            "ts:editor",
            EsbuildScript {
                script: inputs!["ts/bundle_svelte.mjs"],
                entrypoint: inputs![format!("ts/editor/{entrypoint}.ts")],
                output_stem: &stem,
                deps: editor_deps.clone(),
                extra_exts: &["css"],
            },
        )
    };

    build_editor_page("browser_editor", "index_browser")?;
    build_editor_page("reviewer_editor", "index_reviewer")?;
    build_editor_page("note_creator", "index_creator")?;

    let group = "ts/editor";
    build.add(
        "check:svelte:editor",
        SvelteCheck {
            tsconfig: inputs![format!("{group}/tsconfig.json")],
            inputs: editor_deps.clone(),
        },
    )?;
    eslint(build, "editor", group, editor_deps)?;
    Ok(())
}

fn build_and_check_reviewer(build: &mut Build) -> Result<()> {
    let reviewer_deps = inputs![":ts:lib", glob!("ts/reviewer/**")];
    build.add(
        "ts:reviewer:reviewer.js",
        EsbuildScript {
            script: inputs!["ts/bundle_ts.mjs"],
            entrypoint: "ts/reviewer/index_wrapper.ts".into(),
            output_stem: "ts/reviewer/reviewer",
            deps: reviewer_deps.clone(),
            extra_exts: &[],
        },
    )?;
    build.add(
        "ts:reviewer:reviewer.css",
        CompileSass {
            input: inputs!["ts/reviewer/reviewer.scss"],
            output: "ts/reviewer/reviewer.css",
            deps: ":sass".into(),
            load_paths: vec!["."],
        },
    )?;
    build.add(
        "ts:reviewer:reviewer_extras_bundle.js",
        EsbuildScript {
            script: inputs!["ts/bundle_ts.mjs"],
            entrypoint: "ts/reviewer/reviewer_extras.ts".into(),
            output_stem: "ts/reviewer/reviewer_extras_bundle",
            deps: reviewer_deps.clone(),
            extra_exts: &[],
        },
    )?;

    build.add(
        "check:typescript:reviewer",
        TypescriptCheck {
            tsconfig: inputs!["ts/reviewer/tsconfig.json"],
            inputs: reviewer_deps.clone(),
        },
    )?;
    eslint(build, "reviewer", "ts/reviewer", reviewer_deps)
}

fn check_web(build: &mut Build) -> Result<()> {
    let dprint_files = inputs![glob!["**/*.{ts,mjs,js,md,json,toml}", "target/**"]];
    build.add(
        "check:format:dprint",
        DPrint {
            inputs: dprint_files.clone(),
            check_only: true,
        },
    )?;
    build.add(
        "format:dprint",
        DPrint {
            inputs: dprint_files,
            check_only: false,
        },
    )?;

    Ok(())
}

pub fn check_sql(build: &mut Build) -> Result<()> {
    build.add(
        "check:format:sql",
        SqlFormat {
            inputs: inputs![glob!["**/*.sql"]],
            check_only: true,
        },
    )?;
    build.add(
        "format:sql",
        SqlFormat {
            inputs: inputs![glob!["**/*.sql"]],
            check_only: false,
        },
    )?;
    Ok(())
}

fn build_and_check_mathjax(build: &mut Build) -> Result<()> {
    let files = inputs![glob!["ts/mathjax/*"]];
    build.add(
        "ts:mathjax",
        EsbuildScript {
            script: "ts/transform_ts.mjs".into(),
            entrypoint: "ts/mathjax/index.ts".into(),
            deps: files.clone(),
            output_stem: "ts/mathjax/mathjax",
            extra_exts: &[],
        },
    )?;
    eslint(build, "mathjax", "ts/mathjax", files.clone())?;
    build.add(
        "check:typescript:mathjax",
        TypescriptCheck {
            tsconfig: "ts/mathjax/tsconfig.json".into(),
|
||||
inputs: files,
|
||||
},
|
||||
)
|
||||
}
|
||||
|
||||
pub const MATHJAX_FILES: &[&str] = &[
|
||||
"mathjax/es5/a11y/assistive-mml.js",
|
||||
"mathjax/es5/a11y/complexity.js",
|
||||
"mathjax/es5/a11y/explorer.js",
|
||||
"mathjax/es5/a11y/semantic-enrich.js",
|
||||
"mathjax/es5/input/tex/extensions/action.js",
|
||||
"mathjax/es5/input/tex/extensions/all-packages.js",
|
||||
"mathjax/es5/input/tex/extensions/ams.js",
|
||||
"mathjax/es5/input/tex/extensions/amscd.js",
|
||||
"mathjax/es5/input/tex/extensions/autoload.js",
|
||||
"mathjax/es5/input/tex/extensions/bbox.js",
|
||||
"mathjax/es5/input/tex/extensions/boldsymbol.js",
|
||||
"mathjax/es5/input/tex/extensions/braket.js",
|
||||
"mathjax/es5/input/tex/extensions/bussproofs.js",
|
||||
"mathjax/es5/input/tex/extensions/cancel.js",
|
||||
"mathjax/es5/input/tex/extensions/centernot.js",
|
||||
"mathjax/es5/input/tex/extensions/color.js",
|
||||
"mathjax/es5/input/tex/extensions/colortbl.js",
|
||||
"mathjax/es5/input/tex/extensions/colorv2.js",
|
||||
"mathjax/es5/input/tex/extensions/configmacros.js",
|
||||
"mathjax/es5/input/tex/extensions/enclose.js",
|
||||
"mathjax/es5/input/tex/extensions/extpfeil.js",
|
||||
"mathjax/es5/input/tex/extensions/gensymb.js",
|
||||
"mathjax/es5/input/tex/extensions/html.js",
|
||||
"mathjax/es5/input/tex/extensions/mathtools.js",
|
||||
"mathjax/es5/input/tex/extensions/mhchem.js",
|
||||
"mathjax/es5/input/tex/extensions/newcommand.js",
|
||||
"mathjax/es5/input/tex/extensions/noerrors.js",
|
||||
"mathjax/es5/input/tex/extensions/noundefined.js",
|
||||
"mathjax/es5/input/tex/extensions/physics.js",
|
||||
"mathjax/es5/input/tex/extensions/require.js",
|
||||
"mathjax/es5/input/tex/extensions/setoptions.js",
|
||||
"mathjax/es5/input/tex/extensions/tagformat.js",
|
||||
"mathjax/es5/input/tex/extensions/textcomp.js",
|
||||
"mathjax/es5/input/tex/extensions/textmacros.js",
|
||||
"mathjax/es5/input/tex/extensions/unicode.js",
|
||||
"mathjax/es5/input/tex/extensions/upgreek.js",
|
||||
"mathjax/es5/input/tex/extensions/verb.js",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_AMS-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Calligraphic-Bold.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Calligraphic-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Fraktur-Bold.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Fraktur-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Main-Bold.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Main-Italic.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Main-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Math-BoldItalic.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Math-Italic.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Math-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_SansSerif-Bold.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_SansSerif-Italic.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_SansSerif-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Script-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Size1-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Size2-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Size3-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Size4-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Typewriter-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Vector-Bold.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Vector-Regular.woff",
|
||||
"mathjax/es5/output/chtml/fonts/woff-v2/MathJax_Zero.woff",
|
||||
"mathjax/es5/tex-chtml.js",
|
||||
// these change to '.json' in the latest mathjax
|
||||
"mathjax/es5/sre/mathmaps/de.js",
|
||||
"mathjax/es5/sre/mathmaps/en.js",
|
||||
"mathjax/es5/sre/mathmaps/es.js",
|
||||
"mathjax/es5/sre/mathmaps/fr.js",
|
||||
"mathjax/es5/sre/mathmaps/hi.js",
|
||||
"mathjax/es5/sre/mathmaps/it.js",
|
||||
"mathjax/es5/sre/mathmaps/nemeth.js",
|
||||
];
|
||||
|
||||
pub fn copy_mathjax() -> impl BuildAction {
|
||||
RsyncFiles {
|
||||
inputs: inputs![":node_modules:mathjax"],
|
||||
target_folder: "qt/_aqt/data/web/js/vendor/mathjax",
|
||||
strip_prefix: "$builddir/node_modules/mathjax/es5",
|
||||
extra_args: "",
|
||||
}
|
||||
}
|
||||
|
||||
fn build_sass(build: &mut Build) -> Result<()> {
|
||||
build.add_inputs_to_group("sass", inputs![glob!("sass/**")]);
|
||||
|
||||
build.add(
|
||||
"css:_root-vars",
|
||||
CompileSass {
|
||||
input: inputs!["sass/_root-vars.scss"],
|
||||
output: "sass/_root-vars.css",
|
||||
deps: inputs![glob!["sass/*"]],
|
||||
load_paths: vec![],
|
||||
},
|
||||
)?;
|
||||
|
||||
Ok(())
|
||||
}
|
build/ninja_gen/Cargo.toml (new file)
@@ -0,0 +1,19 @@
[package]
name = "ninja_gen"

version.workspace = true
authors.workspace = true
license.workspace = true
edition.workspace = true
rust-version.workspace = true

[dependencies]
camino = "1.1.1"
globset = "0.4.9"
itertools = "0.10.5"
lazy_static = "1.4.0"
maplit = "1.0.2"
num_cpus = "1.14.0"
walkdir = "2.3.2"
which = "4.3.0"
workspace-hack = { version = "0.1", path = "../../tools/workspace-hack" }
build/ninja_gen/src/action.rs (new file)
@@ -0,0 +1,42 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use crate::{build::FilesHandle, Build, Result};

pub trait BuildAction {
    /// Command line to invoke for each build statement.
    fn command(&self) -> &str;

    /// Declare the input files and variables, and output files.
    fn files(&mut self, build: &mut impl FilesHandle);

    /// If true, this action will not trigger a rebuild of dependent targets if the output
    /// files are unchanged. This corresponds to Ninja's "restat" argument.
    fn check_output_timestamps(&self) -> bool {
        false
    }

    /// True if this rule generates build.ninja
    fn generator(&self) -> bool {
        false
    }

    /// Called on first action invocation; can be used to inject other build actions
    /// to perform initial setup.
    #[allow(unused_variables)]
    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        Ok(())
    }

    fn concurrency_pool(&self) -> Option<&'static str> {
        None
    }

    fn bypass_runner(&self) -> bool {
        false
    }

    fn name(&self) -> &'static str {
        std::any::type_name::<Self>().split("::").last().unwrap()
    }
}
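The default `name()` above derives the ninja rule name from the implementing Rust type, by taking the last segment of `std::any::type_name`. A minimal standalone sketch of that logic, using a hypothetical `CopyFiles` action type (not one from the real crate):

```rust
// Mirrors BuildAction::name(): the ninja rule name is the last `::`-separated
// segment of the fully qualified type name.
struct CopyFiles;

fn rule_name<T>() -> &'static str {
    std::any::type_name::<T>().split("::").last().unwrap()
}

fn main() {
    let name = rule_name::<CopyFiles>();
    assert_eq!(name, "CopyFiles");
    println!("{name}");
}
```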
build/ninja_gen/src/archives.rs (new file)
@@ -0,0 +1,200 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{borrow::Cow, collections::HashMap};

use camino::{Utf8Path, Utf8PathBuf};

use crate::{
    action::BuildAction,
    cargo::{CargoBuild, RustOutput},
    glob,
    input::BuildInput,
    inputs, Build, Result,
};

#[derive(Clone, Copy, Debug)]
pub struct OnlineArchive {
    pub url: &'static str,
    pub sha256: &'static str,
}

#[derive(Debug, PartialEq, Eq, Clone, Copy)]
pub enum Platform {
    LinuxX64,
    LinuxArm,
    MacX64,
    MacArm,
    WindowsX64,
}

impl Platform {
    pub fn current() -> Self {
        if cfg!(windows) {
            Self::WindowsX64
        } else {
            let os = std::env::consts::OS;
            let arch = std::env::consts::ARCH;
            match (os, arch) {
                ("linux", "x86_64") => Self::LinuxX64,
                ("linux", "aarch64") => Self::LinuxArm,
                ("macos", "x86_64") => Self::MacX64,
                ("macos", "aarch64") => Self::MacArm,
                _ => panic!("unsupported os/arch {os} {arch} - PR welcome!"),
            }
        }
    }

    pub fn tls_feature() -> &'static str {
        match Self::current() {
            // On Linux, wheels are not allowed to link to OpenSSL, and linking setup
            // caused pain for AnkiDroid in the past. On other platforms, we stick to
            // native libraries, for smaller binaries.
            Platform::LinuxX64 | Platform::LinuxArm => "rustls",
            _ => "native-tls",
        }
    }

    pub fn as_rust_triple(&self) -> &'static str {
        match self {
            Platform::MacX64 => "x86_64-apple-darwin",
            _ => unimplemented!(),
        }
    }
}

/// Append .exe to path if on Windows.
pub fn with_exe(path: &str) -> Cow<str> {
    if cfg!(windows) {
        format!("{path}.exe").into()
    } else {
        path.into()
    }
}

struct DownloadArchive {
    pub archive: OnlineArchive,
}

impl BuildAction for DownloadArchive {
    fn command(&self) -> &str {
        "$archives_bin download $url $checksum $out"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        let (_, filename) = self.archive.url.rsplit_once('/').unwrap();
        let output_path = Utf8Path::new("download").join(filename);

        build.add_inputs("archives_bin", inputs![":build:archives"]);
        build.add_variable("url", self.archive.url);
        build.add_variable("checksum", self.archive.sha256);
        build.add_outputs("out", &[output_path.into_string()])
    }

    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        build_archive_tool(build)
    }
}

struct ExtractArchive<'a, I> {
    pub archive_path: BuildInput,
    /// The folder that the archive should be extracted into, relative to $builddir/extracted.
    /// If the archive contains a single top-level folder, its contents will be extracted into the
    /// provided folder, so that output like tool-1.2/ can be extracted into tool/.
    pub extraction_folder_name: &'a str,
    /// Files contained inside the archive, relative to the archive root, and excluding the top-level
    /// folder if it is the sole top-level entry. Any files you wish to use as part of subsequent rules
    /// must be declared here.
    pub file_manifest: HashMap<&'static str, I>,
}

impl<I> ExtractArchive<'_, I> {
    fn extraction_folder(&self) -> Utf8PathBuf {
        Utf8Path::new("$builddir")
            .join("extracted")
            .join(self.extraction_folder_name)
    }
}

impl<I> BuildAction for ExtractArchive<'_, I>
where
    I: IntoIterator,
    I::Item: AsRef<str>,
{
    fn command(&self) -> &str {
        "$archive_tool extract $in $extraction_folder"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        build.add_inputs("archive_tool", inputs![":build:archives"]);
        build.add_inputs("in", inputs![self.archive_path.clone()]);

        let folder = self.extraction_folder();
        build.add_variable("extraction_folder", folder.to_string());
        for (subgroup, files) in self.file_manifest.drain() {
            build.add_outputs_ext(
                subgroup,
                files
                    .into_iter()
                    .map(|f| folder.join(f.as_ref()).to_string()),
                !subgroup.is_empty(),
            );
        }
        build.add_output_stamp(folder.with_extension("marker"));
    }

    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        build_archive_tool(build)
    }

    fn name(&self) -> &'static str {
        "extract"
    }
}

fn build_archive_tool(build: &mut Build) -> Result<()> {
    build.once_only("build_archive_tool", |build| {
        let features = Platform::tls_feature();
        build.add(
            "build:archives",
            CargoBuild {
                inputs: inputs![glob!("build/archives/**/*")],
                outputs: &[RustOutput::Binary("archives")],
                target: None,
                extra_args: &format!("-p archives --features {features}"),
                release_override: Some(false),
            },
        )?;
        Ok(())
    })
}

/// See [DownloadArchive] and [ExtractArchive].
pub fn download_and_extract<I>(
    build: &mut Build,
    group_name: &str,
    archive: OnlineArchive,
    file_manifest: HashMap<&'static str, I>,
) -> Result<()>
where
    I: IntoIterator,
    I::Item: AsRef<str>,
{
    let download_group = format!("download:{group_name}");
    build.add(&download_group, DownloadArchive { archive })?;

    let extract_group = format!("extract:{group_name}");
    build.add(
        extract_group,
        ExtractArchive {
            archive_path: inputs![format!(":{download_group}")],
            extraction_folder_name: group_name,
            file_manifest,
        },
    )?;
    Ok(())
}

pub fn empty_manifest() -> HashMap<&'static str, &'static [&'static str]> {
    Default::default()
}
475
build/ninja_gen/src/build.rs
Normal file
475
build/ninja_gen/src/build.rs
Normal file
@ -0,0 +1,475 @@
|
||||
// Copyright: Ankitects Pty Ltd and contributors
|
||||
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html
|
||||
|
||||
use std::{
|
||||
collections::{HashMap, HashSet},
|
||||
fmt::Write,
|
||||
};
|
||||
|
||||
use camino::Utf8PathBuf;
|
||||
|
||||
use crate::{
|
||||
action::BuildAction,
|
||||
archives::Platform,
|
||||
configure::ConfigureBuild,
|
||||
input::{space_separated, BuildInput},
|
||||
Result,
|
||||
};
|
||||
|
||||
#[derive(Debug)]
|
||||
pub struct Build {
|
||||
pub variables: HashMap<&'static str, String>,
|
||||
pub buildroot: Utf8PathBuf,
|
||||
pub release: bool,
|
||||
pub pools: Vec<(&'static str, usize)>,
|
||||
pub trailing_text: String,
|
||||
pub host_platform: Platform,
|
||||
|
||||
pub(crate) output_text: String,
|
||||
action_names: HashSet<&'static str>,
|
||||
pub(crate) groups: HashMap<String, Vec<String>>,
|
||||
}
|
||||
|
||||
impl Build {
|
||||
pub fn new() -> Result<Self> {
|
||||
let buildroot = if cfg!(windows) {
|
||||
Utf8PathBuf::from("out")
|
||||
} else {
|
||||
// on Unix systems we allow out to be a symlink to an external location
|
||||
Utf8PathBuf::from("out").canonicalize_utf8()?
|
||||
};
|
||||
|
||||
let mut build = Build {
|
||||
buildroot,
|
||||
release: std::env::var("RELEASE").is_ok(),
|
||||
host_platform: Platform::current(),
|
||||
variables: Default::default(),
|
||||
pools: Default::default(),
|
||||
trailing_text: Default::default(),
|
||||
output_text: Default::default(),
|
||||
action_names: Default::default(),
|
||||
groups: Default::default(),
|
||||
};
|
||||
|
||||
build.add("build:run_configure", ConfigureBuild {})?;
|
||||
|
||||
Ok(build)
|
||||
}
|
||||
|
||||
pub fn variable(&mut self, name: &'static str, value: impl Into<String>) {
|
||||
self.variables.insert(name, value.into());
|
||||
}
|
||||
|
||||
pub fn pool(&mut self, name: &'static str, size: usize) {
|
||||
self.pools.push((name, size));
|
||||
}
|
||||
|
||||
/// Evaluate the provided closure only once, using `key` to determine uniqueness.
|
||||
/// This key should not match any build action name.
|
||||
pub fn once_only(
|
||||
&mut self,
|
||||
key: &'static str,
|
||||
block: impl FnOnce(&mut Build) -> Result<()>,
|
||||
) -> Result<()> {
|
||||
if self.action_names.insert(key) {
|
||||
block(self)
|
||||
} else {
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
pub fn add(&mut self, group: impl AsRef<str>, action: impl BuildAction) -> Result<()> {
|
||||
let group = group.as_ref();
|
||||
let groups = split_groups(group);
|
||||
let group = groups[0];
|
||||
let command = action.command();
|
||||
|
||||
let action_name = action.name();
|
||||
// first invocation?
|
||||
let mut first_invocation = false;
|
||||
self.once_only(action_name, |build| {
|
||||
action.on_first_instance(build)?;
|
||||
first_invocation = true;
|
||||
Ok(())
|
||||
})?;
|
||||
|
||||
let action_name = action_name.to_string();
|
||||
|
||||
// ensure separator is delivered to runner, not shell
|
||||
let command = if cfg!(windows) || action.bypass_runner() {
|
||||
command.into()
|
||||
} else {
|
||||
command.replace("&&", "\"&&\"")
|
||||
};
|
||||
|
||||
let mut statement =
|
||||
BuildStatement::from_build_action(group, action, &self.groups, self.release);
|
||||
|
||||
if first_invocation {
|
||||
let command = statement.prepare_command(command);
|
||||
writeln!(
|
||||
&mut self.output_text,
|
||||
"\
|
||||
rule {action_name}
|
||||
command = {command}",
|
||||
)
|
||||
.unwrap();
|
||||
for (k, v) in &statement.rule_variables {
|
||||
writeln!(&mut self.output_text, " {k} = {v}").unwrap();
|
||||
}
|
||||
self.output_text.push('\n');
|
||||
}
|
||||
|
||||
let (all_outputs, subgroups) = statement.render_into(&mut self.output_text);
|
||||
for group in groups {
|
||||
self.add_resolved_files_to_group(group, &all_outputs);
|
||||
}
|
||||
for (subgroup, outputs) in subgroups {
|
||||
let group_with_subgroup = format!("{group}:{subgroup}");
|
||||
self.add_resolved_files_to_group(&group_with_subgroup, &outputs);
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Add one or more resolved files to a group.
|
||||
pub fn add_resolved_files_to_group<'a>(
|
||||
&mut self,
|
||||
group: &str,
|
||||
files: impl IntoIterator<Item = &'a String>,
|
||||
) {
|
||||
let buf = self.groups.entry(group.to_owned()).or_default();
|
||||
buf.extend(files.into_iter().map(ToString::to_string));
|
||||
}
|
||||
|
||||
pub fn add_inputs_to_group(&mut self, group: &str, inputs: BuildInput) {
|
||||
self.add_resolved_files_to_group(group, &self.expand_inputs(inputs));
|
||||
}
|
||||
|
||||
/// Group names should not have a leading `:`.
|
||||
pub fn add_group_to_group(&mut self, target_group: &str, additional_group: &str) {
|
||||
let additional_files = self
|
||||
.groups
|
||||
.get(additional_group)
|
||||
.unwrap_or_else(|| panic!("{additional_group} had no files"));
|
||||
self.add_resolved_files_to_group(target_group, &additional_files.clone())
|
||||
}
|
||||
|
||||
/// Outputs from a given build statement group. An error if no files have been registered yet.
|
||||
pub fn group_outputs(&self, group_name: &'static str) -> &[String] {
|
||||
self.groups
|
||||
.get(group_name)
|
||||
.unwrap_or_else(|| panic!("expected files in {group_name}"))
|
||||
}
|
||||
|
||||
/// Single output from a given build statement group. An error if no files have been registered yet,
|
||||
/// or more than one file has been registered.
|
||||
pub fn group_output(&self, group_name: &'static str) -> String {
|
||||
let outputs = self.group_outputs(group_name);
|
||||
assert_eq!(outputs.len(), 1);
|
||||
outputs.first().unwrap().into()
|
||||
}
|
||||
|
||||
pub fn expand_inputs(&self, inputs: impl AsRef<BuildInput>) -> Vec<String> {
|
||||
expand_inputs(inputs, &self.groups)
|
||||
}
|
||||
|
||||
/// Expand inputs, the return a filtered subset.
|
||||
pub fn filter_inputs<F>(&self, inputs: impl AsRef<BuildInput>, func: F) -> Vec<String>
|
||||
where
|
||||
F: FnMut(&String) -> bool,
|
||||
{
|
||||
self.expand_inputs(inputs)
|
||||
.into_iter()
|
||||
.filter(func)
|
||||
.collect()
|
||||
}
|
||||
|
||||
pub fn inputs_with_suffix(&self, inputs: impl AsRef<BuildInput>, ext: &str) -> Vec<String> {
|
||||
self.filter_inputs(inputs, |f| f.ends_with(ext))
|
||||
}
|
||||
}
|
||||
|
||||
fn split_groups(group: &str) -> Vec<&str> {
|
||||
let mut rest = group;
|
||||
let mut groups = vec![group];
|
||||
while let Some((head, _tail)) = rest.rsplit_once(':') {
|
||||
groups.push(head);
|
||||
rest = head;
|
||||
}
|
||||
groups
|
||||
}
|
||||
|
||||
struct BuildStatement<'a> {
|
||||
/// Cache of outputs by already-evaluated build rules, allowing later rules to more easily consume
|
||||
/// the outputs of previous rules.
|
||||
existing_outputs: &'a HashMap<String, Vec<String>>,
|
||||
rule_name: &'static str,
|
||||
// implicit refers to files that are not automatically assigned to $in and $out by Ninja,
|
||||
implicit_inputs: Vec<String>,
|
||||
implicit_outputs: Vec<String>,
|
||||
explicit_inputs: Vec<String>,
|
||||
explicit_outputs: Vec<String>,
|
||||
output_subsets: Vec<(String, Vec<String>)>,
|
||||
variables: Vec<(String, String)>,
|
||||
rule_variables: Vec<(String, String)>,
|
||||
output_stamp: bool,
|
||||
env_vars: Vec<String>,
|
||||
release: bool,
|
||||
bypass_runner: bool,
|
||||
}
|
||||
|
||||
impl BuildStatement<'_> {
|
||||
fn from_build_action<'a>(
|
||||
group: &str,
|
||||
mut action: impl BuildAction,
|
||||
existing_outputs: &'a HashMap<String, Vec<String>>,
|
||||
release: bool,
|
||||
) -> BuildStatement<'a> {
|
||||
let mut stmt = BuildStatement {
|
||||
existing_outputs,
|
||||
rule_name: action.name(),
|
||||
implicit_inputs: Default::default(),
|
||||
implicit_outputs: Default::default(),
|
||||
explicit_inputs: Default::default(),
|
||||
explicit_outputs: Default::default(),
|
||||
variables: Default::default(),
|
||||
rule_variables: Default::default(),
|
||||
output_subsets: Default::default(),
|
||||
output_stamp: false,
|
||||
env_vars: Default::default(),
|
||||
release,
|
||||
bypass_runner: action.bypass_runner(),
|
||||
};
|
||||
action.files(&mut stmt);
|
||||
|
||||
if stmt.explicit_outputs.is_empty() && stmt.implicit_outputs.is_empty() {
|
||||
panic!("{} must generate at least one output", action.name());
|
||||
}
|
||||
stmt.variables.push(("description".into(), group.into()));
|
||||
if action.check_output_timestamps() {
|
||||
stmt.rule_variables.push(("restat".into(), "1".into()));
|
||||
}
|
||||
if action.generator() {
|
||||
stmt.rule_variables.push(("generator".into(), "1".into()));
|
||||
}
|
||||
if let Some(pool) = action.concurrency_pool() {
|
||||
stmt.rule_variables.push(("pool".into(), pool.into()));
|
||||
}
|
||||
|
||||
stmt
|
||||
}
|
||||
|
||||
/// Returns a list of all output files, which `Build` will add to `existing_outputs`,
|
||||
/// and any subgroups.
|
||||
fn render_into(mut self, buf: &mut String) -> (Vec<String>, Vec<(String, Vec<String>)>) {
|
||||
let action_name = self.rule_name;
|
||||
let inputs_str = to_ninja_target_string(&self.explicit_inputs, &self.implicit_inputs);
|
||||
let outputs_str = to_ninja_target_string(&self.explicit_outputs, &self.implicit_outputs);
|
||||
|
||||
writeln!(buf, "build {outputs_str}: {action_name} {inputs_str}").unwrap();
|
||||
for (key, value) in self.variables {
|
||||
writeln!(buf, " {key} = {}", value).unwrap();
|
||||
}
|
||||
writeln!(buf).unwrap();
|
||||
|
||||
let outputs_vec = {
|
||||
self.implicit_outputs.extend(self.explicit_outputs);
|
||||
self.implicit_outputs
|
||||
};
|
||||
(outputs_vec, self.output_subsets)
|
||||
}
|
||||
|
||||
fn prepare_command(&mut self, command: String) -> String {
|
||||
if self.bypass_runner {
|
||||
return command;
|
||||
}
|
||||
if command.starts_with("$runner") {
|
||||
self.implicit_inputs.push("$runner".into());
|
||||
return command;
|
||||
}
|
||||
let mut buf = String::from("$runner run ");
|
||||
if self.output_stamp {
|
||||
write!(&mut buf, "--stamp=$stamp ").unwrap();
|
||||
}
|
||||
if !self.env_vars.is_empty() {
|
||||
for var in &self.env_vars {
|
||||
write!(&mut buf, "--env={var} ").unwrap();
|
||||
}
|
||||
}
|
||||
buf.push_str(&command);
|
||||
buf
|
||||
}
|
||||
}
|
||||
|
||||
fn expand_inputs(
|
||||
input: impl AsRef<BuildInput>,
|
||||
existing_outputs: &HashMap<String, Vec<String>>,
|
||||
) -> Vec<String> {
|
||||
let mut vec = vec![];
|
||||
input.as_ref().add_to_vec(&mut vec, existing_outputs);
|
||||
vec
|
||||
}
|
||||
|
||||
pub trait FilesHandle {
|
||||
/// Add inputs to the build statement. Can be called multiple times with
|
||||
/// different variables. This is a shortcut for calling .expand_inputs()
|
||||
/// and then .add_inputs_vec()
|
||||
/// - If the variable name is non-empty, a variable of the same name will be
|
||||
/// created so the file list can be accessed in the command. By convention,
|
||||
/// this is often `in`.
|
||||
fn add_inputs(&mut self, variable: &'static str, inputs: impl AsRef<BuildInput>);
|
||||
fn add_inputs_vec(&mut self, variable: &'static str, inputs: Vec<String>);
|
||||
|
||||
/// Add a variable that can be referenced in the command.
|
||||
fn add_variable(&mut self, name: impl Into<String>, value: impl Into<String>);
|
||||
|
||||
fn expand_input(&self, input: &BuildInput) -> String;
|
||||
fn expand_inputs(&self, inputs: impl AsRef<BuildInput>) -> Vec<String>;
|
||||
|
||||
/// Like [FilesHandle::add_outputs_ext], without adding a subgroup.
|
||||
fn add_outputs(
|
||||
&mut self,
|
||||
variable: &'static str,
|
||||
outputs: impl IntoIterator<Item = impl AsRef<str>>,
|
||||
) {
|
||||
self.add_outputs_ext(variable, outputs, false);
|
||||
}
|
||||
|
||||
/// Add outputs to the build statement. Can be called multiple times with
|
||||
/// different variables.
|
||||
/// - Each output automatically has $builddir/ prefixed to it if it does not
|
||||
/// already start with it.
|
||||
/// - If the variable name is non-empty, a variable of the same name will be
|
||||
/// created so the file list can be accessed in the command. By convention,
|
||||
/// this is often `out`.
|
||||
/// - If subgroup is true, the files are also placed in a subgroup. Eg
|
||||
/// if a rule `foo` exists and subgroup `bar` is provided, the files are accessible
|
||||
/// via `:foo:bar`. The variable name must not be empty, or called `out`.
|
||||
fn add_outputs_ext(
|
||||
&mut self,
|
||||
variable: impl Into<String>,
|
||||
outputs: impl IntoIterator<Item = impl AsRef<str>>,
|
||||
subgroup: bool,
|
||||
);
|
||||
|
||||
/// Save an output stamp if the command completes successfully.
|
||||
fn add_output_stamp(&mut self, path: impl Into<String>);
|
||||
/// Set an env var for the duration of the provided command(s).
|
||||
/// Note this is defined once for the rule, so if the value should change
|
||||
/// for each command, `constant_value` should reference a `$variable` you have
|
||||
/// defined.
|
||||
fn add_env_var(&mut self, key: &str, constant_value: &str);
|
||||
|
||||
fn release_build(&self) -> bool;
|
||||
}
|
||||
|
||||
impl FilesHandle for BuildStatement<'_> {
|
||||
fn add_inputs(&mut self, variable: &'static str, inputs: impl AsRef<BuildInput>) {
|
||||
self.add_inputs_vec(variable, FilesHandle::expand_inputs(self, inputs));
|
||||
}
|
||||
|
||||
fn add_inputs_vec(&mut self, variable: &'static str, inputs: Vec<String>) {
|
||||
match variable {
|
||||
"in" => self.explicit_inputs.extend(inputs),
|
||||
other_key => {
|
||||
if !other_key.is_empty() {
|
||||
self.add_variable(other_key, space_separated(&inputs));
|
||||
}
|
||||
self.implicit_inputs.extend(inputs);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn add_variable(&mut self, key: impl Into<String>, value: impl Into<String>) {
|
||||
self.variables.push((key.into(), value.into()));
|
||||
}
|
||||
|
||||
fn expand_input(&self, input: &BuildInput) -> String {
|
||||
let mut vec = Vec::with_capacity(1);
|
||||
input.add_to_vec(&mut vec, self.existing_outputs);
|
||||
if vec.len() != 1 {
|
||||
panic!("expected {input:?} to resolve to a single file; got ${vec:?}");
|
||||
}
|
||||
vec.pop().unwrap()
|
||||
}
|
||||
|
||||
fn add_outputs_ext(
|
||||
&mut self,
|
||||
variable: impl Into<String>,
|
||||
outputs: impl IntoIterator<Item = impl AsRef<str>>,
|
||||
subgroup: bool,
|
||||
) {
|
||||
        let outputs = outputs.into_iter().map(|v| {
            let v = v.as_ref();
            let v = if !v.starts_with("$builddir/") && !v.starts_with("$builddir\\") {
                format!("$builddir/{}", v)
            } else {
                v.to_owned()
            };
            if cfg!(windows) {
                v.replace('/', "\\")
            } else {
                v
            }
        });
        let variable = variable.into();
        match variable.as_str() {
            "out" => self.explicit_outputs.extend(outputs),
            other_key => {
                let outputs: Vec<_> = outputs.collect();
                if !other_key.is_empty() {
                    self.add_variable(other_key, space_separated(&outputs));
                }
                if subgroup {
                    assert!(!other_key.is_empty());
                    self.output_subsets
                        .push((other_key.to_owned(), outputs.to_owned()));
                }

                self.implicit_outputs.extend(outputs);
            }
        }
    }

    fn expand_inputs(&self, inputs: impl AsRef<BuildInput>) -> Vec<String> {
        expand_inputs(inputs, self.existing_outputs)
    }

    fn release_build(&self) -> bool {
        self.release
    }

    fn add_output_stamp(&mut self, path: impl Into<String>) {
        self.output_stamp = true;
        self.add_outputs("stamp", vec![path.into()]);
    }

    fn add_env_var(&mut self, key: &str, constant_value: &str) {
        self.env_vars.push(format!("{key}={constant_value}"));
    }
}

fn to_ninja_target_string(explicit: &[String], implicit: &[String]) -> String {
    let mut joined = space_separated(explicit);
    if !implicit.is_empty() {
        joined.push_str(" | ");
        joined.push_str(&space_separated(implicit));
    }
    joined
}

#[cfg(test)]
mod test {
    use super::*;

    #[test]
    fn test_split_groups() {
        assert_eq!(&split_groups("foo"), &["foo"]);
        assert_eq!(&split_groups("foo:bar"), &["foo:bar", "foo"]);
        assert_eq!(
            &split_groups("foo:bar:baz"),
            &["foo:bar:baz", "foo:bar", "foo"]
        );
    }
}
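The `to_ninja_target_string` helper above produces ninja's `explicit | implicit` output syntax. A minimal std-only sketch of the same joining logic (`join_targets` is a hypothetical stand-in, not part of the build code):

```rust
/// Hypothetical stand-in for to_ninja_target_string above: ninja separates
/// implicit outputs from explicit ones with " | ".
fn join_targets(explicit: &[&str], implicit: &[&str]) -> String {
    let mut joined = explicit.join(" ");
    if !implicit.is_empty() {
        // implicit outputs follow the " | " separator in a build statement
        joined.push_str(" | ");
        joined.push_str(&implicit.join(" "));
    }
    joined
}
```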
build/ninja_gen/src/cargo.rs (new file, 236 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use camino::{Utf8Path, Utf8PathBuf};

use crate::{
    action::BuildAction, archives::with_exe, build::FilesHandle, input::BuildInput, inputs, Build,
    Result,
};

#[derive(Debug, PartialEq, Eq)]
pub enum RustOutput<'a> {
    Binary(&'a str),
    StaticLib(&'a str),
    DynamicLib(&'a str),
    /// (group_name, fully qualified path)
    Data(&'a str, &'a str),
}

impl RustOutput<'_> {
    pub fn name(&self) -> &str {
        match self {
            RustOutput::Binary(pkg) => pkg,
            RustOutput::StaticLib(pkg) => pkg,
            RustOutput::DynamicLib(pkg) => pkg,
            RustOutput::Data(name, _) => name,
        }
    }

    pub fn path(&self, rust_base: &Utf8Path, target: Option<&str>, release: bool) -> String {
        let filename = match *self {
            RustOutput::Binary(package) => {
                if cfg!(windows) {
                    format!("{package}.exe")
                } else {
                    package.into()
                }
            }
            RustOutput::StaticLib(package) => format!("lib{package}.a"),
            RustOutput::DynamicLib(package) => {
                if cfg!(windows) {
                    format!("{package}.dll")
                } else if cfg!(target_os = "macos") {
                    format!("lib{package}.dylib")
                } else {
                    format!("lib{package}.so")
                }
            }
            RustOutput::Data(_, path) => return path.to_string(),
        };
        let mut path: Utf8PathBuf = rust_base.into();
        if let Some(target) = target {
            path = path.join(target);
        }
        path = path
            .join(if release { "release" } else { "debug" })
            .join(filename);
        path.to_string()
    }
}

#[derive(Debug, Default)]
pub struct CargoBuild<'a> {
    pub inputs: BuildInput,
    pub outputs: &'a [RustOutput<'a>],
    pub target: Option<&'static str>,
    pub extra_args: &'a str,
    pub release_override: Option<bool>,
}

impl BuildAction for CargoBuild<'_> {
    fn command(&self) -> &str {
        "cargo build $release_arg $target_arg $cargo_flags $extra_args"
    }

    fn files(&mut self, build: &mut impl FilesHandle) {
        let release_build = self
            .release_override
            .unwrap_or_else(|| build.release_build());
        let release_arg = if release_build { "--release" } else { "" };
        let target_arg = if let Some(target) = self.target {
            format!("--target {target}")
        } else {
            "".into()
        };

        build.add_inputs("", &self.inputs);
        build.add_inputs(
            "",
            inputs![".cargo/config.toml", "rust-toolchain.toml", "Cargo.lock"],
        );
        build.add_variable("release_arg", release_arg);
        build.add_variable("target_arg", target_arg);
        build.add_variable("extra_args", self.extra_args);

        let output_root = Utf8Path::new("$builddir/rust");
        for output in self.outputs {
            let name = output.name();
            let path = output.path(output_root, self.target, release_build);
            build.add_outputs_ext(name, vec![path], true);
        }
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }

    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        setup_flags(build)
    }
}

fn setup_flags(build: &mut Build) -> Result<()> {
    build.once_only("cargo_flags_and_pool", |build| {
        build.variable("cargo_flags", "--locked");
        Ok(())
    })
}

pub struct CargoTest {
    pub inputs: BuildInput,
}

impl BuildAction for CargoTest {
    fn command(&self) -> &str {
        "cargo nextest run --color=always --failure-output=final --status-level=none $cargo_flags"
    }

    fn files(&mut self, build: &mut impl FilesHandle) {
        build.add_inputs("", &self.inputs);
        build.add_inputs("", inputs![":cargo-nextest"]);
        build.add_output_stamp("tests/cargo_test");
    }

    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        build.add(
            "cargo-nextest",
            CargoInstall {
                binary_name: "cargo-nextest",
                args: "cargo-nextest --version 0.9.43",
            },
        )?;
        setup_flags(build)
    }
}

pub struct CargoClippy {
    pub inputs: BuildInput,
}

impl BuildAction for CargoClippy {
    fn command(&self) -> &str {
        "cargo clippy $cargo_flags --tests -- -Dclippy::dbg_macro -Dwarnings"
    }

    fn files(&mut self, build: &mut impl FilesHandle) {
        build.add_inputs(
            "",
            inputs![&self.inputs, "Cargo.lock", "rust-toolchain.toml"],
        );
        build.add_output_stamp("tests/cargo_clippy");
    }

    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        setup_flags(build)
    }
}

pub struct CargoFormat {
    pub inputs: BuildInput,
    pub check_only: bool,
}

impl BuildAction for CargoFormat {
    fn command(&self) -> &str {
        // the empty config file prevents warnings about nightly features
        "cargo fmt $mode -- --config-path=.rustfmt-empty.toml --color always"
    }

    fn files(&mut self, build: &mut impl FilesHandle) {
        build.add_inputs("", &self.inputs);
        build.add_variable("mode", if self.check_only { "--check" } else { "" });
        build.add_output_stamp(format!(
            "tests/cargo_format.{}",
            if self.check_only { "check" } else { "fmt" }
        ));
    }

    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        setup_flags(build)
    }
}

/// Use Cargo to download and build a Rust binary. If `binary_name` is `foo`, a `$foo` variable
/// will be defined with the path to the binary.
pub struct CargoInstall {
    pub binary_name: &'static str,
    /// eg 'foo --version 1.3' or '--git git://...'
    pub args: &'static str,
}

impl BuildAction for CargoInstall {
    fn command(&self) -> &str {
        "cargo install --color always $args --root $builddir"
    }

    fn files(&mut self, build: &mut impl FilesHandle) {
        build.add_variable("args", self.args);
        build.add_outputs("", vec![with_exe(&format!("bin/{}", self.binary_name))])
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}

pub struct CargoRun {
    pub binary_name: &'static str,
    pub cargo_args: &'static str,
    pub bin_args: &'static str,
    pub deps: BuildInput,
}

impl BuildAction for CargoRun {
    fn command(&self) -> &str {
        "cargo run --bin $binary $cargo_args -- $bin_args"
    }

    fn files(&mut self, build: &mut impl FilesHandle) {
        build.add_inputs("", &self.deps);
        build.add_variable("binary", self.binary_name);
        build.add_variable("cargo_args", self.cargo_args);
        build.add_variable("bin_args", self.bin_args);
        build.add_outputs("", vec!["phony"]);
    }
}
build/ninja_gen/src/command.rs (new file, 56 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::collections::HashMap;

use crate::{
    action::BuildAction,
    input::{space_separated, BuildInput},
    inputs,
};

pub struct RunCommand<'a> {
    // Will be automatically included as a dependency
    pub command: &'static str,
    // Arguments to the script, eg `$in $out` or `$in > $out`.
    pub args: &'a str,
    pub inputs: HashMap<&'static str, BuildInput>,
    pub outputs: HashMap<&'static str, Vec<&'a str>>,
}

impl BuildAction for RunCommand<'_> {
    fn command(&self) -> &str {
        "$cmd $args"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        // Because we've defined a generic rule instead of making one for a specific use case,
        // we need to manually interpolate variables in the user-provided args.
        let mut args = self.args.to_string();
        for (key, inputs) in &self.inputs {
            let files = build.expand_inputs(inputs);
            build.add_inputs("", inputs);
            if !key.is_empty() {
                args = args.replace(&format!("${key}"), &space_separated(files));
            }
        }
        for (key, outputs) in &self.outputs {
            if !key.is_empty() {
                let outputs = outputs.iter().map(|o| {
                    if !o.starts_with("$builddir/") {
                        format!("$builddir/{o}")
                    } else {
                        (*o).into()
                    }
                });
                args = args.replace(&format!("${key}"), &space_separated(outputs));
            }
        }

        build.add_inputs("cmd", inputs![self.command]);
        build.add_variable("args", args);
        for outputs in self.outputs.values() {
            build.add_outputs("", outputs);
        }
    }
}
build/ninja_gen/src/configure.rs (new file, 43 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use crate::{
    action::BuildAction,
    build::FilesHandle,
    cargo::{CargoBuild, RustOutput},
    glob, inputs, Build, Result,
};

pub struct ConfigureBuild {}

impl BuildAction for ConfigureBuild {
    fn command(&self) -> &str {
        "$cmd && ninja -f $builddir/build.ninja -t cleandead"
    }

    fn files(&mut self, build: &mut impl FilesHandle) {
        build.add_inputs("cmd", inputs![":build:configure"]);
        // reconfigure when env changes
        build.add_inputs("", inputs!["$builddir/env"]);
        build.add_outputs("", ["build.ninja"])
    }

    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        build.add(
            "build:configure",
            CargoBuild {
                inputs: inputs![glob!["build/**/*"]],
                outputs: &[RustOutput::Binary("configure")],
                target: None,
                // we ensure runner is up to date, but don't declare it as output,
                // as ninja will try to clean up stale outputs, and that fails on
                // Windows. The ninja wrapper script should ensure the runner is up to
                // date anyway, but advanced users can invoke ninja directly to save
                // the ~80+ms it takes cargo to check that the runner is up to date.
                extra_args: "-p configure -p runner",
                release_override: Some(false),
            },
        )?;
        Ok(())
    }
}
build/ninja_gen/src/copy.rs (new file, 78 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use camino::Utf8Path;

use crate::{action::BuildAction, input::BuildInput};

/// Copy the provided files into the specified destination folder.
/// Directory structure is not preserved - eg foo/bar.js is copied
/// into out/$output_folder/bar.js.
pub struct CopyFiles<'a> {
    pub inputs: BuildInput,
    /// The folder (relative to the build folder) that files should be copied into.
    pub output_folder: &'a str,
}

impl BuildAction for CopyFiles<'_> {
    fn command(&self) -> &str {
        // The -f is because we may need to overwrite read-only files copied from Bazel.
        "cp -fr $in $builddir/$folder"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        let inputs = build.expand_inputs(&self.inputs);
        let output_folder = Utf8Path::new(self.output_folder);
        let outputs: Vec<_> = inputs
            .iter()
            .map(|f| output_folder.join(Utf8Path::new(f).file_name().unwrap()))
            .collect();
        build.add_inputs("in", &self.inputs);
        build.add_outputs("", outputs);
        build.add_variable("folder", self.output_folder);
    }
}

/// Copy a single file to the provided output path, which should be relative to
/// the output folder. This can be used to create a copy with a different name.
pub struct CopyFile<'a> {
    pub input: BuildInput,
    pub output: &'a str,
}

impl BuildAction for CopyFile<'_> {
    fn command(&self) -> &str {
        "cp $in $out"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        build.add_inputs("in", &self.input);
        build.add_outputs("out", vec![self.output]);
    }
}

/// Create a symbolic link to the provided output path, which should be relative to
/// the output folder. This can be used to create a copy with a different name.
pub struct LinkFile<'a> {
    pub input: BuildInput,
    pub output: &'a str,
}

impl BuildAction for LinkFile<'_> {
    fn command(&self) -> &str {
        if cfg!(windows) {
            "cmd /c copy $in $out"
        } else {
            "ln -sf $in $out"
        }
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        build.add_inputs("in", &self.input);
        build.add_outputs("out", vec![self.output]);
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}
build/ninja_gen/src/git.rs (new file, 71 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use itertools::Itertools;

use super::*;
use crate::{action::BuildAction, input::BuildInput};

pub struct SyncSubmodule {
    pub path: &'static str,
}

impl BuildAction for SyncSubmodule {
    fn command(&self) -> &str {
        "git submodule update --init $path"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        if let Some(head) = locate_git_head() {
            build.add_inputs("", head);
        } else {
            println!("Warning, .git/HEAD not found; submodules may be stale");
        }
        build.add_variable("path", self.path);
        build.add_output_stamp(format!("git/{}", self.path));
    }

    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        build.pool("git", 1);
        Ok(())
    }

    fn concurrency_pool(&self) -> Option<&'static str> {
        Some("git")
    }
}

/// We check the mtime of .git/HEAD to detect when we should sync submodules.
/// If this repo is a submodule of another project, .git/HEAD will not exist,
/// and we fall back on .git/modules/*/HEAD in a parent folder instead.
fn locate_git_head() -> Option<BuildInput> {
    let standard_path = Utf8Path::new(".git/HEAD");
    if standard_path.exists() {
        return Some(inputs![standard_path.to_string()]);
    }

    let mut folder = Utf8Path::new(".").canonicalize_utf8().unwrap();
    loop {
        let path = folder.join(".git").join("modules");
        if path.exists() {
            let heads = path
                .read_dir_utf8()
                .unwrap()
                .filter_map(|p| {
                    let head = p.unwrap().path().join("HEAD");
                    if head.exists() {
                        Some(head.to_string())
                    } else {
                        None
                    }
                })
                .collect_vec();
            return Some(inputs![heads]);
        }
        if let Some(parent) = folder.parent() {
            folder = parent.to_owned();
        } else {
            return None;
        }
    }
}
build/ninja_gen/src/hash.rs (new file, 13 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{
    collections::hash_map::DefaultHasher,
    hash::{Hash, Hasher},
};

pub fn simple_hash(hashable: impl Hash) -> u64 {
    let mut hasher = DefaultHasher::new();
    hashable.hash(&mut hasher);
    hasher.finish()
}
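`simple_hash` is used by the check/lint actions to derive short stamp-file names from inputs like a folder name or tsconfig. A self-contained sketch of the same logic (copied here for illustration; the example path is hypothetical):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Same shape as simple_hash above: hash any Hash-able value with the
/// std DefaultHasher. The result is deterministic within one Rust
/// release, which suffices for naming per-rule stamp files.
fn simple_hash(hashable: impl Hash) -> u64 {
    let mut hasher = DefaultHasher::new();
    hashable.hash(&mut hasher);
    hasher.finish()
}
```

A rule might then stamp e.g. `format!("tests/eslint.check.{}", simple_hash("ts/editor"))`, so each linted folder gets its own stamp file.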
build/ninja_gen/src/input.rs (new file, 186 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{collections::HashMap, fmt::Display};

use camino::Utf8PathBuf;

#[derive(Debug, Clone, Hash)]
pub enum BuildInput {
    Single(String),
    Multiple(Vec<String>),
    Glob(Glob),
    Inputs(Vec<BuildInput>),
    Empty,
}

impl AsRef<BuildInput> for BuildInput {
    fn as_ref(&self) -> &BuildInput {
        self
    }
}

impl Default for BuildInput {
    fn default() -> Self {
        BuildInput::Empty
    }
}

impl From<String> for BuildInput {
    fn from(v: String) -> Self {
        BuildInput::Single(v)
    }
}

impl From<&str> for BuildInput {
    fn from(v: &str) -> Self {
        BuildInput::Single(v.to_owned())
    }
}

impl From<Vec<String>> for BuildInput {
    fn from(v: Vec<String>) -> Self {
        BuildInput::Multiple(v)
    }
}

impl From<Glob> for BuildInput {
    fn from(v: Glob) -> Self {
        BuildInput::Glob(v)
    }
}

impl From<&BuildInput> for BuildInput {
    fn from(v: &BuildInput) -> Self {
        BuildInput::Inputs(vec![v.clone()])
    }
}

impl From<&[BuildInput]> for BuildInput {
    fn from(v: &[BuildInput]) -> Self {
        BuildInput::Inputs(v.to_vec())
    }
}

impl From<Vec<BuildInput>> for BuildInput {
    fn from(v: Vec<BuildInput>) -> Self {
        BuildInput::Inputs(v)
    }
}

impl From<Utf8PathBuf> for BuildInput {
    fn from(v: Utf8PathBuf) -> Self {
        BuildInput::Single(v.into_string())
    }
}

impl BuildInput {
    pub fn add_to_vec(
        &self,
        vec: &mut Vec<String>,
        existing_outputs: &HashMap<String, Vec<String>>,
    ) {
        let mut resolve_and_add = |value: &str| {
            if let Some(stripped) = value.strip_prefix(':') {
                let files = existing_outputs.get(stripped).unwrap_or_else(|| {
                    println!("{:?}", &existing_outputs);
                    panic!("input referenced {value}, but rule missing/not processed");
                });
                for file in files {
                    vec.push(file.into())
                }
            } else {
                vec.push(value.into());
            }
        };

        match self {
            BuildInput::Single(s) => resolve_and_add(s),
            BuildInput::Multiple(v) => {
                for item in v {
                    resolve_and_add(item);
                }
            }
            BuildInput::Glob(glob) => {
                for path in glob.resolve() {
                    vec.push(path.into_string());
                }
            }
            BuildInput::Inputs(inputs) => {
                for input in inputs {
                    input.add_to_vec(vec, existing_outputs)
                }
            }
            BuildInput::Empty => {}
        }
    }
}

#[derive(Debug, Clone, Hash)]
pub struct Glob {
    pub include: String,
    pub exclude: Option<String>,
}

lazy_static::lazy_static! {
    static ref CACHED_FILES: Vec<Utf8PathBuf> = cache_files();
}

/// Walking the source tree once instead of for each glob yields ~4x speed improvements.
fn cache_files() -> Vec<Utf8PathBuf> {
    walkdir::WalkDir::new(".")
        // ensure the output order is predictable
        .sort_by_file_name()
        .into_iter()
        .filter_entry(move |e| {
            // don't walk into symlinks, or the top-level out/, or .git
            !(e.path_is_symlink()
                || (e.depth() == 1 && (e.file_name() == "out" || e.file_name() == ".git")))
        })
        .filter_map(move |e| {
            let path = e.as_ref().unwrap().path().strip_prefix("./").unwrap();
            if !path.is_dir() {
                Some(Utf8PathBuf::from_path_buf(path.to_owned()).unwrap())
            } else {
                None
            }
        })
        .collect()
}

impl Glob {
    pub fn resolve(&self) -> impl Iterator<Item = Utf8PathBuf> {
        let include = globset::GlobBuilder::new(&self.include)
            .literal_separator(true)
            .build()
            .unwrap()
            .compile_matcher();
        let exclude = self.exclude.as_ref().map(|glob| {
            globset::GlobBuilder::new(glob)
                .literal_separator(true)
                .build()
                .unwrap()
                .compile_matcher()
        });
        CACHED_FILES.iter().filter_map(move |path| {
            if include.is_match(path) {
                let excluded = exclude
                    .as_ref()
                    .map(|exclude| exclude.is_match(path))
                    .unwrap_or_default();
                if !excluded {
                    return Some(path.to_owned());
                }
            }
            None
        })
    }
}

pub fn space_separated<I>(iter: I) -> String
where
    I: IntoIterator,
    I::Item: Display,
{
    itertools::join(iter, " ")
}
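The `:name` prefix handled in `add_to_vec` is what lets one rule consume another rule's outputs. A simplified, self-contained sketch of that resolution step (`resolve` is a hypothetical stand-in for the `resolve_and_add` closure above):

```rust
use std::collections::HashMap;

/// Simplified version of resolve_and_add above: an input starting with
/// ':' names another rule, and expands to that rule's recorded outputs;
/// anything else is taken as a literal path.
fn resolve(value: &str, existing_outputs: &HashMap<String, Vec<String>>) -> Vec<String> {
    if let Some(stripped) = value.strip_prefix(':') {
        existing_outputs
            .get(stripped)
            .expect("rule missing/not processed")
            .clone()
    } else {
        vec![value.to_string()]
    }
}
```

This is also why rule ordering matters: a `:name` reference panics if the named rule has not been processed yet.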
build/ninja_gen/src/lib.rs (new file, 52 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

pub mod action;
pub mod archives;
pub mod build;
pub mod cargo;
pub mod command;
pub mod configure;
pub mod copy;
pub mod git;
pub mod hash;
pub mod input;
pub mod node;
pub mod protobuf;
pub mod python;
pub mod render;
pub mod rsync;
pub mod sass;

pub use build::Build;
pub use camino::{Utf8Path, Utf8PathBuf};
pub use maplit::hashmap;
pub use which::which;

pub type Result<T> = std::result::Result<T, Box<dyn std::error::Error>>;

#[macro_export]
macro_rules! inputs {
    ($($param:expr),+ $(,)?) => {
        $crate::input::BuildInput::from(vec![$($crate::input::BuildInput::from($param)),+])
    };
    () => {
        $crate::input::BuildInput::Empty
    };
}

#[macro_export]
macro_rules! glob {
    ($include:expr) => {
        $crate::input::Glob {
            include: $include.into(),
            exclude: None,
        }
    };
    ($include:expr, $exclude:expr) => {
        $crate::input::Glob {
            include: $include.into(),
            exclude: Some($exclude.into()),
        }
    };
}
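To show the expansion shape of the `inputs!` macro above, here is a trimmed-down, self-contained analogue: `BuildInput` is reduced to only the variants the demo needs, so this is a sketch rather than the crate's real type.

```rust
// Trimmed-down stand-in for the crate's BuildInput, for illustration only.
#[derive(Debug, Clone, PartialEq)]
enum BuildInput {
    Single(String),
    Inputs(Vec<BuildInput>),
    Empty,
}

impl From<&str> for BuildInput {
    fn from(v: &str) -> Self {
        BuildInput::Single(v.to_owned())
    }
}

impl From<Vec<BuildInput>> for BuildInput {
    fn from(v: Vec<BuildInput>) -> Self {
        BuildInput::Inputs(v)
    }
}

// Same expansion as the inputs! macro above: each argument is converted
// into a BuildInput, and the whole list is wrapped in BuildInput::Inputs.
macro_rules! inputs {
    ($($param:expr),+ $(,)?) => {
        BuildInput::from(vec![$(BuildInput::from($param)),+])
    };
    () => {
        BuildInput::Empty
    };
}
```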
build/ninja_gen/src/node.rs (new file, 337 lines)
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{borrow::Cow, collections::HashMap};

use super::*;
use crate::{
    action::BuildAction,
    archives::{download_and_extract, OnlineArchive, Platform},
    hash::simple_hash,
    input::{space_separated, BuildInput},
};

pub fn node_archive(platform: Platform) -> OnlineArchive {
    match platform {
        Platform::LinuxX64 => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-linux-x64.tar.xz",
            sha256: "4481a34bf32ddb9a9ff9540338539401320e8c3628af39929b4211ea3552a19e",
        },
        Platform::LinuxArm => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-linux-arm64.tar.xz",
            sha256: "3904869935b7ecc51130b4b86486d2356539a174d11c9181180cab649f32cd2a",
        },
        Platform::MacX64 => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-darwin-x64.tar.xz",
            sha256: "6c88d462550a024661e74e9377371d7e023321a652eafb3d14d58a866e6ac002",
        },
        Platform::MacArm => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-darwin-arm64.tar.xz",
            sha256: "17f2e25d207d36d6b0964845062160d9ed16207c08d09af33b9a2fd046c5896f",
        },
        Platform::WindowsX64 => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-win-x64.zip",
            sha256: "5478a5a2dce2803ae22327a9f8ae8494c1dec4a4beca5bbf897027380aecf4c7",
        },
    }
}

pub struct YarnSetup {}

impl BuildAction for YarnSetup {
    fn command(&self) -> &str {
        if cfg!(windows) {
            "corepack.cmd enable yarn"
        } else {
            "corepack enable yarn"
        }
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("", inputs![":extract:node"]);
        build.add_outputs_ext(
            "bin",
            vec![if cfg!(windows) {
                "extracted/node/yarn.cmd"
            } else {
                "extracted/node/bin/yarn"
            }],
            true,
        );
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}

pub struct YarnInstall<'a> {
    pub package_json_and_lock: BuildInput,
    pub exports: HashMap<&'a str, Vec<Cow<'a, str>>>,
}

impl BuildAction for YarnInstall<'_> {
    fn command(&self) -> &str {
        "$runner yarn $yarn $out"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("", &self.package_json_and_lock);
        build.add_inputs("yarn", inputs![":yarn:bin"]);
        build.add_outputs("out", vec!["node_modules/.marker"]);
        for (key, value) in &self.exports {
            let outputs: Vec<_> = value.iter().map(|o| format!("node_modules/{o}")).collect();
            build.add_outputs_ext(*key, outputs, true);
        }
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}

fn with_cmd_ext(bin: &str) -> Cow<str> {
    if cfg!(windows) {
        format!("{bin}.cmd").into()
    } else {
        bin.into()
    }
}

pub fn setup_node(
    build: &mut Build,
    archive: OnlineArchive,
    binary_exports: &[&'static str],
    mut data_exports: HashMap<&str, Vec<Cow<str>>>,
) -> Result<()> {
    download_and_extract(
        build,
        "node",
        archive,
        hashmap! {
            "bin" => vec![if cfg!(windows) { "node.exe" } else { "bin/node" }],
            "npm" => vec![if cfg!(windows) { "npm.cmd" } else { "bin/npm" }]
        },
    )?;
    build.add("yarn", YarnSetup {})?;

    for binary in binary_exports {
        data_exports.insert(
            *binary,
            vec![format!(".bin/{}", with_cmd_ext(binary)).into()],
        );
    }
    build.add(
        "node_modules",
        YarnInstall {
            package_json_and_lock: inputs!["yarn.lock", "package.json"],
            exports: data_exports,
        },
    )?;
    Ok(())
}

pub struct EsbuildScript<'a> {
    pub script: BuildInput,
    pub entrypoint: BuildInput,
    pub deps: BuildInput,
    /// .js will be appended, and any extra extensions
    pub output_stem: &'a str,
    /// eg ['css', 'html']
    pub extra_exts: &'a [&'a str],
}

impl BuildAction for EsbuildScript<'_> {
    fn command(&self) -> &str {
        "$node_bin $script $entrypoint $out"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("node_bin", inputs![":extract:node:bin"]);
        build.add_inputs("script", &self.script);
        build.add_inputs("entrypoint", &self.entrypoint);
        build.add_inputs("", inputs!["yarn.lock", ":node_modules", &self.deps]);
        let stem = self.output_stem;
        let mut outs = vec![format!("{stem}.js")];
        outs.extend(self.extra_exts.iter().map(|ext| format!("{stem}.{ext}")));
        build.add_outputs("out", outs);
    }
}

pub struct DPrint {
    pub inputs: BuildInput,
    pub check_only: bool,
}

impl BuildAction for DPrint {
    fn command(&self) -> &str {
        "$dprint $mode"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("dprint", inputs![":node_modules:dprint"]);
        build.add_inputs("", &self.inputs);
        let mode = if self.check_only { "check" } else { "fmt" };
        build.add_variable("mode", mode);
        build.add_output_stamp(format!("tests/dprint.{mode}"));
    }
}

pub struct SvelteCheck {
    pub tsconfig: BuildInput,
    pub inputs: BuildInput,
}

impl BuildAction for SvelteCheck {
    fn command(&self) -> &str {
        "$svelte-check --tsconfig $tsconfig $
        --fail-on-warnings --threshold warning --use-new-transformation"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("svelte-check", inputs![":node_modules:svelte-check"]);
        build.add_inputs("tsconfig", &self.tsconfig);
        build.add_inputs("", &self.inputs);
        build.add_inputs("", inputs!["yarn.lock"]);
        let hash = simple_hash(&self.tsconfig);
        build.add_output_stamp(format!("tests/svelte-check.{hash}"));
    }
}

pub struct TypescriptCheck {
    pub tsconfig: BuildInput,
    pub inputs: BuildInput,
}

impl BuildAction for TypescriptCheck {
    fn command(&self) -> &str {
        "$tsc --noEmit -p $tsconfig"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("tsc", inputs![":node_modules:tsc"]);
        build.add_inputs("tsconfig", &self.tsconfig);
        build.add_inputs("", &self.inputs);
        build.add_inputs("", inputs!["yarn.lock"]);
        let hash = simple_hash(&self.tsconfig);
        build.add_output_stamp(format!("tests/typescript.{hash}"));
    }
}

pub struct Eslint<'a> {
    pub folder: &'a str,
    pub inputs: BuildInput,
    pub eslint_rc: BuildInput,
    pub fix: bool,
}

impl BuildAction for Eslint<'_> {
    fn command(&self) -> &str {
        "$eslint --max-warnings=0 -c $eslint_rc $fix $folder"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("eslint", inputs![":node_modules:eslint"]);
        build.add_inputs("eslint_rc", &self.eslint_rc);
        build.add_inputs("in", &self.inputs);
        build.add_inputs("", inputs!["yarn.lock"]);
        build.add_variable("fix", if self.fix { "--fix" } else { "" });
        build.add_variable("folder", self.folder);
        let hash = simple_hash(self.folder);
        let kind = if self.fix { "fix" } else { "check" };
        build.add_output_stamp(format!("tests/eslint.{kind}.{hash}"));
    }
}

pub struct JestTest<'a> {
    pub folder: &'a str,
    pub deps: BuildInput,
    pub jest_rc: BuildInput,
    pub jsdom: bool,
}

impl BuildAction for JestTest<'_> {
    fn command(&self) -> &str {
        "$jest --config $config $env $folder"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("jest", inputs![":node_modules:jest"]);
        build.add_inputs("", &self.deps);
        build.add_inputs("config", &self.jest_rc);
        build.add_variable("env", if self.jsdom { "--env=jsdom" } else { "" });
        build.add_variable("folder", self.folder);
        let hash = simple_hash(self.folder);
        build.add_output_stamp(format!("tests/jest.{hash}"));
    }
}

pub struct SqlFormat {
    pub inputs: BuildInput,
    pub check_only: bool,
}

impl BuildAction for SqlFormat {
    fn command(&self) -> &str {
        "$tsx $sql_format $mode $in"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("tsx", inputs![":node_modules:tsx"]);
        build.add_inputs("sql_format", inputs!["ts/sql_format/sql_format.ts"]);
        build.add_inputs("in", &self.inputs);
        let mode = if self.check_only { "check" } else { "fix" };
        build.add_variable("mode", mode);
        build.add_output_stamp(format!("tests/sql_format.{mode}"));
    }
}

pub struct GenTypescriptProto {
    pub protos: BuildInput,
    /// .js and .d.ts will be added to it
    pub output_stem: &'static str,
}

impl BuildAction for GenTypescriptProto {
    fn command(&self) -> &str {
        "$pbjs --target=static-module --wrap=default --force-number --force-message --out=$static $in && $
        $pbjs --target=json-module --wrap=default --force-number --force-message --out=$js $in && $
        $pbts --out=$dts $static && $
        rm $static"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("pbjs", inputs![":node_modules:pbjs"]);
        build.add_inputs("pbts", inputs![":node_modules:pbts"]);
        build.add_inputs("in", &self.protos);
        build.add_inputs("", inputs!["yarn.lock"]);

        let stem = self.output_stem;
        build.add_variable("static", format!("$builddir/{stem}_static.js"));
        build.add_outputs("js", vec![format!("{stem}.js")]);
        build.add_outputs("dts", vec![format!("{stem}.d.ts")]);
|
||||
}
|
||||
}
|
||||
|
||||
pub struct CompileSass<'a> {
|
||||
pub input: BuildInput,
|
||||
pub output: &'a str,
|
||||
pub deps: BuildInput,
|
||||
pub load_paths: Vec<&'a str>,
|
||||
}
|
||||
|
||||
impl BuildAction for CompileSass<'_> {
|
||||
fn command(&self) -> &str {
|
||||
"$sass -s compressed $args $in -- $out"
|
||||
}
|
||||
|
||||
fn files(&mut self, build: &mut impl build::FilesHandle) {
|
||||
build.add_inputs("sass", inputs![":node_modules:sass"]);
|
||||
build.add_inputs("in", &self.input);
|
||||
build.add_inputs("", &self.deps);
|
||||
|
||||
let args = space_separated(self.load_paths.iter().map(|path| format!("-I {path}")));
|
||||
build.add_variable("args", args);
|
||||
|
||||
build.add_outputs("out", vec![self.output]);
|
||||
}
|
||||
}
|
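Several of the actions above produce no build artifact at all, only a stamp file under `tests/` whose name embeds a hash of the distinguishing inputs, so each (action, input) pair gets its own up-to-date marker. The crate's actual `hash::simple_hash` is not shown in this diff; the sketch below uses a hypothetical FNV-1a implementation purely to illustrate the idea of deriving a stable stamp path:

```rust
// Hypothetical stand-in for hash::simple_hash (the real implementation is
// not part of this diff): a deterministic, dependency-free FNV-1a hash,
// so the same input always maps to the same stamp file across runs.
fn simple_hash(text: &str) -> u64 {
    let mut hash: u64 = 0xcbf29ce484222325;
    for byte in text.bytes() {
        hash ^= byte as u64;
        hash = hash.wrapping_mul(0x100000001b3);
    }
    hash
}

// Mirrors the Eslint action's stamp naming: tests/eslint.{kind}.{hash}.
fn stamp_path(kind: &str, folder: &str) -> String {
    format!("tests/eslint.{kind}.{:x}", simple_hash(folder))
}

fn main() {
    // Same inputs -> same stamp path; different folders -> different stamps.
    assert_eq!(stamp_path("check", "ts/editor"), stamp_path("check", "ts/editor"));
    assert_ne!(stamp_path("check", "ts/editor"), stamp_path("check", "ts/reviewer"));
    println!("{}", stamp_path("check", "ts/editor"));
}
```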
build/ninja_gen/src/protobuf.rs (new file)
@@ -0,0 +1,104 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use maplit::hashmap;

use crate::{
    action::BuildAction,
    archives::{download_and_extract, with_exe, OnlineArchive, Platform},
    hash::simple_hash,
    input::BuildInput,
    inputs,
};

pub fn protoc_archive(platform: Platform) -> OnlineArchive {
    match platform {
        Platform::LinuxX64 => {
            OnlineArchive {
                url: "https://github.com/protocolbuffers/protobuf/releases/download/v21.8/protoc-21.8-linux-x86_64.zip",
                sha256: "f90d0dd59065fef94374745627336d622702b67f0319f96cee894d41a974d47a",
            }
        }
        Platform::LinuxArm => {
            OnlineArchive {
                url: "https://github.com/protocolbuffers/protobuf/releases/download/v21.8/protoc-21.8-linux-aarch_64.zip",
                sha256: "f3d8eb5839d6186392d8c7b54fbeabbb6fcdd90618a500b77cb2e24faa245cad",
            }
        }
        Platform::MacX64 | Platform::MacArm => {
            OnlineArchive {
                url: "https://github.com/protocolbuffers/protobuf/releases/download/v21.8/protoc-21.8-osx-universal_binary.zip",
                sha256: "e3324d3bc2e9bc967a0bec2472e0ec73b26f952c7c87f2403197414f780c3c6c",
            }
        }
        Platform::WindowsX64 => {
            OnlineArchive {
                url: "https://github.com/protocolbuffers/protobuf/releases/download/v21.8/protoc-21.8-win64.zip",
                sha256: "3657053024faa439ff5f8c1dd2ee06bac0f9b9a3d660e99944f015a7451e87ec",
            }
        }
    }
}

fn clang_format_archive(platform: Platform) -> OnlineArchive {
    match platform {
        Platform::LinuxX64 => {
            OnlineArchive {
                url: "https://github.com/ankitects/clang-format-binaries/releases/download/anki-2021-01-09/clang-format_linux_x86_64.zip",
                sha256: "64060bc4dbca30d0d96aab9344e2783008b16e1cae019a2532f1126ca5ec5449",
            }
        }
        Platform::LinuxArm => {
            // todo: replace with arm64 binary
            OnlineArchive {
                url: "https://github.com/ankitects/clang-format-binaries/releases/download/anki-2021-01-09/clang-format_linux_x86_64.zip",
                sha256: "64060bc4dbca30d0d96aab9344e2783008b16e1cae019a2532f1126ca5ec5449",
            }
        }
        Platform::MacX64 | Platform::MacArm => {
            OnlineArchive {
                url: "https://github.com/ankitects/clang-format-binaries/releases/download/anki-2021-01-09/clang-format_macos_x86_64.zip",
                sha256: "238be68d9478163a945754f06a213483473044f5a004c4125d3d9d8d3556466e",
            }
        }
        Platform::WindowsX64 => {
            OnlineArchive {
                url: "https://github.com/ankitects/clang-format-binaries/releases/download/anki-2021-01-09/clang-format_windows_x86_64.zip",
                sha256: "7d9f6915e3f0fb72407830f0fc37141308d2e6915daba72987a52f309fbeaccc",
            }
        }
    }
}

pub struct ClangFormat {
    pub inputs: BuildInput,
    pub check_only: bool,
}

impl BuildAction for ClangFormat {
    fn command(&self) -> &str {
        "$clang-format --style=google $args $in"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        build.add_inputs("clang-format", inputs![":extract:clang-format:bin"]);
        build.add_inputs("in", &self.inputs);
        let (args, mode) = if self.check_only {
            ("--dry-run -ferror-limit=1 -Werror", "check")
        } else {
            ("-i", "fix")
        };
        build.add_variable("args", args);
        let hash = simple_hash(&self.inputs);
        build.add_output_stamp(format!("tests/clang-format.{mode}.{hash}"));
    }

    fn on_first_instance(&self, build: &mut crate::Build) -> crate::Result<()> {
        let binary = with_exe("clang-format");
        download_and_extract(
            build,
            "clang-format",
            clang_format_archive(build.host_platform),
            hashmap! {
                "bin" => [binary]
            },
        )
    }
}
build/ninja_gen/src/python.rs (new file)
@@ -0,0 +1,174 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use crate::{action::BuildAction, hash::simple_hash, input::BuildInput, inputs, Build, Result};

pub struct PythonEnvironment<'a> {
    pub folder: &'static str,
    pub base_requirements_txt: BuildInput,
    pub requirements_txt: BuildInput,
    pub python_binary: &'a BuildInput,
    pub extra_binary_exports: &'static [&'static str],
}

impl BuildAction for PythonEnvironment<'_> {
    fn command(&self) -> &str {
        "$runner pyenv $python_binary $builddir/$pyenv_folder $system_pkgs $base_requirements $requirements"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        let bin_path = |binary: &str| -> Vec<String> {
            let folder = self.folder;
            let path = if cfg!(windows) {
                format!("{folder}/scripts/{binary}.exe")
            } else {
                format!("{folder}/bin/{binary}")
            };
            vec![path]
        };

        build.add_inputs("python_binary", self.python_binary);
        build.add_inputs("base_requirements", &self.base_requirements_txt);
        build.add_inputs("requirements", &self.requirements_txt);
        build.add_variable("pyenv_folder", self.folder);
        build.add_outputs_ext("bin", bin_path("python"), true);
        build.add_outputs_ext("pip", bin_path("pip"), true);
        for binary in self.extra_binary_exports {
            build.add_outputs_ext(*binary, bin_path(binary), true);
        }
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}

pub struct PythonTypecheck {
    pub folders: &'static [&'static str],
    pub deps: BuildInput,
}

impl BuildAction for PythonTypecheck {
    fn command(&self) -> &str {
        "$mypy $folders"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        build.add_inputs("", &self.deps);
        build.add_inputs("mypy", inputs![":pyenv:mypy"]);
        build.add_inputs("", inputs![".mypy.ini"]);
        build.add_variable("folders", self.folders.join(" "));

        let hash = simple_hash(self.folders);
        build.add_output_stamp(format!("tests/python_typecheck.{hash}"));
    }
}

struct PythonFormat<'a> {
    pub inputs: &'a BuildInput,
    pub check_only: bool,
    pub isort_ini: &'a BuildInput,
}

impl BuildAction for PythonFormat<'_> {
    fn command(&self) -> &str {
        "$black -t py39 -q $check --color $in && $
        $isort --color --settings-path $isort_ini $check $in"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        build.add_inputs("in", self.inputs);
        build.add_inputs("black", inputs![":pyenv:black"]);
        build.add_inputs("isort", inputs![":pyenv:isort"]);

        let hash = simple_hash(self.inputs);
        // format!() so the hash is actually interpolated into the cache path
        build.add_env_var("BLACK_CACHE_DIR", &format!("out/python/black.cache.{hash}"));
        build.add_inputs("isort_ini", self.isort_ini);
        build.add_variable(
            "check",
            if self.check_only {
                "--diff --check"
            } else {
                ""
            },
        );

        build.add_output_stamp(format!(
            "tests/python_format.{}.{hash}",
            if self.check_only { "check" } else { "fix" }
        ));
    }
}

pub fn python_format(build: &mut Build, group: &str, inputs: BuildInput) -> Result<()> {
    let isort_ini = &inputs![".isort.cfg"];
    build.add(
        &format!("check:format:python:{group}"),
        PythonFormat {
            inputs: &inputs,
            check_only: true,
            isort_ini,
        },
    )?;

    build.add(
        &format!("format:python:{group}"),
        PythonFormat {
            inputs: &inputs,
            check_only: false,
            isort_ini,
        },
    )?;
    Ok(())
}

pub struct PythonLint {
    pub folders: &'static [&'static str],
    pub pylint_ini: BuildInput,
    pub deps: BuildInput,
}

impl BuildAction for PythonLint {
    fn command(&self) -> &str {
        "$pylint --rcfile $pylint_ini -sn -j $cpus $folders"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        build.add_inputs("", &self.deps);
        build.add_inputs("pylint", inputs![":pyenv:pylint"]);
        build.add_inputs("pylint_ini", &self.pylint_ini);
        build.add_variable("folders", self.folders.join(" "));
        // On a 16 core system, values above 10 do not improve wall clock time,
        // but waste extra cores that could be working on other tests.
        build.add_variable("cpus", num_cpus::get().min(10).to_string());

        let hash = simple_hash(&self.deps);
        build.add_output_stamp(format!("tests/python_lint.{hash}"));
    }
}

pub struct PythonTest {
    pub folder: &'static str,
    pub python_path: &'static [&'static str],
    pub deps: BuildInput,
}

impl BuildAction for PythonTest {
    fn command(&self) -> &str {
        "$pytest -p no:cacheprovider $folder"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        build.add_inputs("", &self.deps);
        build.add_inputs("pytest", inputs![":pyenv:pytest"]);
        build.add_variable("folder", self.folder);
        build.add_variable(
            "pythonpath",
            &self.python_path.join(if cfg!(windows) { ";" } else { ":" }),
        );
        build.add_env_var("PYTHONPATH", "$pythonpath");
        build.add_env_var("ANKI_TEST_MODE", "1");
        let hash = simple_hash(self.folder);
        build.add_output_stamp(format!("tests/python_pytest.{hash}"));
    }
}
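PythonTest joins its `python_path` entries with the platform's PYTHONPATH separator: `;` on Windows, `:` elsewhere. A self-contained illustration of that one line (the paths here are made-up examples, not the real build graph values):

```rust
// Join PYTHONPATH entries using the platform-appropriate separator,
// mirroring the cfg!(windows) check in PythonTest::files.
fn python_path_var(paths: &[&str]) -> String {
    let sep = if cfg!(windows) { ";" } else { ":" };
    paths.join(sep)
}

fn main() {
    let joined = python_path_var(&["pylib", "out/pylib"]);
    // On Unix this produces "pylib:out/pylib"; on Windows, "pylib;out/pylib".
    println!("{joined}");
}
```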
build/ninja_gen/src/render.rs (new file)
@@ -0,0 +1,67 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{fmt::Write, fs::read_to_string};

use crate::{archives::with_exe, input::space_separated, Build};

impl Build {
    pub fn render(&self) -> String {
        let mut buf = String::new();

        writeln!(
            &mut buf,
            "# This file is automatically generated by configure.rs. Any edits will be lost.\n"
        )
        .unwrap();

        writeln!(&mut buf, "builddir = {}", self.buildroot.as_str()).unwrap();
        writeln!(
            &mut buf,
            "runner = $builddir/rust/debug/{}",
            with_exe("runner")
        )
        .unwrap();

        for (key, value) in &self.variables {
            writeln!(&mut buf, "{} = {}", key, value).unwrap();
        }
        buf.push('\n');

        for (key, value) in &self.pools {
            writeln!(&mut buf, "pool {}\n  depth = {}", key, value).unwrap();
        }
        buf.push('\n');

        buf.push_str(&self.output_text);

        for (group, targets) in &self.groups {
            let group = group.replace(':', "_");
            writeln!(
                &mut buf,
                "build {group}: phony {}",
                space_separated(targets)
            )
            .unwrap();
            buf.push('\n');
        }

        buf.push_str(&self.trailing_text);

        buf
    }

    pub fn write_build_file(&self) {
        let existing_contents = read_to_string("build.ninja").unwrap_or_default();
        let new_contents = self.render();
        if existing_contents != new_contents {
            let folder = &self.buildroot;
            if !folder.exists() {
                std::fs::create_dir_all(folder).expect("create build dir");
            }
            std::fs::write(folder.join("build.ninja"), new_contents).expect("write build.ninja");
        }
    }
}
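`write_build_file` only rewrites `build.ninja` when the rendered contents actually differ, so an unchanged file keeps its mtime and Ninja does not see a spurious change. A minimal, self-contained sketch of that pattern (the file name here is illustrative, not the real build root):

```rust
use std::fs;
use std::path::Path;

// Write `new_contents` to `path` only if it differs from what is already
// there; returns whether a write happened. Skipping identical writes
// preserves the file's mtime.
fn write_if_changed(path: &Path, new_contents: &str) -> bool {
    let existing = fs::read_to_string(path).unwrap_or_default();
    if existing == new_contents {
        return false; // untouched; mtime preserved
    }
    fs::write(path, new_contents).expect("write file");
    true
}

fn main() {
    let path = std::env::temp_dir().join("demo.build.ninja");
    let _ = fs::remove_file(&path);
    assert!(write_if_changed(&path, "rule demo\n")); // first write happens
    assert!(!write_if_changed(&path, "rule demo\n")); // identical: skipped
    assert!(write_if_changed(&path, "rule demo2\n")); // changed: rewritten
    let _ = fs::remove_file(&path);
}
```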
build/ninja_gen/src/rsync.rs (new file)
@@ -0,0 +1,70 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use camino::Utf8Path;

use crate::{
    action::BuildAction,
    build::FilesHandle,
    input::{space_separated, BuildInput},
};

/// Rsync the provided inputs into `target_folder`, preserving directory structure,
/// e.g. foo/bar.js -> out/$target_folder/foo/bar.js. `strip_prefix` can be used to
/// remove a portion of the path when copying. If the input files are from previous
/// build outputs, the prefix should begin with `$builddir/`.
pub struct RsyncFiles<'a> {
    pub inputs: BuildInput,
    pub target_folder: &'a str,
    pub strip_prefix: &'static str,
    pub extra_args: &'a str,
}

impl BuildAction for RsyncFiles<'_> {
    fn command(&self) -> &str {
        "$runner rsync $extra_args --prefix $stripped_prefix --inputs $inputs_without_prefix --output-dir $builddir/$output_folder"
    }

    fn files(&mut self, build: &mut impl FilesHandle) {
        let inputs = build.expand_inputs(&self.inputs);
        build.add_inputs_vec("", inputs.clone());
        let output_folder = Utf8Path::new(self.target_folder);
        let (prefix, inputs_without_prefix) = if self.strip_prefix.is_empty() {
            (".", inputs)
        } else {
            let stripped_inputs = inputs
                .iter()
                .map(|p| {
                    Utf8Path::new(p)
                        .strip_prefix(self.strip_prefix)
                        .unwrap_or_else(|_| {
                            panic!("expected {} to start with {}", p, self.strip_prefix)
                        })
                        .to_string()
                })
                .collect();
            (self.strip_prefix, stripped_inputs)
        };
        build.add_variable(
            "inputs_without_prefix",
            space_separated(&inputs_without_prefix),
        );
        build.add_variable("stripped_prefix", prefix);
        build.add_variable("output_folder", self.target_folder);
        if !self.extra_args.is_empty() {
            build.add_variable(
                "extra_args",
                format!("--extra-args {}", self.extra_args.replace(' ', ",")),
            );
        }

        let outputs = inputs_without_prefix
            .iter()
            .map(|p| output_folder.join(p).to_string());
        build.add_outputs("", outputs);
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}
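The prefix-stripping in RsyncFiles can be shown with std paths instead of camino: an input under `strip_prefix` is relocated beneath `target_folder` with the rest of its directory structure intact. The concrete paths below are made up for the example:

```rust
use std::path::Path;

// Compute the destination an input file would be copied to, mirroring
// RsyncFiles: strip the prefix, then rebase under the target folder.
// Panics (like the original) if the input doesn't start with the prefix.
fn output_path(input: &str, strip_prefix: &str, target_folder: &str) -> String {
    let relative = Path::new(input)
        .strip_prefix(strip_prefix)
        .unwrap_or_else(|_| panic!("expected {input} to start with {strip_prefix}"));
    format!("{target_folder}/{}", relative.display())
}

fn main() {
    let out = output_path("ts/editor/index.js", "ts", "qt/_aqt/data/web");
    // Directory structure below the prefix is preserved.
    println!("{out}");
}
```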
build/ninja_gen/src/sass.rs (new file)
@@ -0,0 +1,43 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use crate::{
    action::BuildAction,
    cargo::CargoInstall,
    input::{space_separated, BuildInput},
    inputs, Build, Result,
};

pub struct CompileSassWithGrass {
    pub input: BuildInput,
    pub output: &'static str,
    pub deps: BuildInput,
    pub load_paths: Vec<&'static str>,
}

impl BuildAction for CompileSassWithGrass {
    fn command(&self) -> &str {
        "$grass $args -s compressed $in -- $out"
    }

    fn files(&mut self, build: &mut impl crate::build::FilesHandle) {
        let args = space_separated(self.load_paths.iter().map(|path| format!("-I {path}")));

        build.add_inputs("grass", inputs![":grass"]);
        build.add_inputs("in", &self.input);
        build.add_inputs("", &self.deps);
        build.add_variable("args", args);
        build.add_outputs("out", vec![self.output]);
    }

    fn on_first_instance(&self, build: &mut Build) -> Result<()> {
        build.add(
            "grass",
            CargoInstall {
                binary_name: "grass",
                args: "grass --version 0.11.2",
            },
        )?;
        Ok(())
    }
}
build/runner/Cargo.toml (new file)
@@ -0,0 +1,15 @@
[package]
name = "runner"

version.workspace = true
authors.workspace = true
license.workspace = true
edition.workspace = true
rust-version.workspace = true

[dependencies]
camino = "1.1.1"
clap = { version = "4.0.25", features = ["derive"] }
junction = "0.2.0"
termcolor = "1.1.3"
workspace-hack = { version = "0.1", path = "../../tools/workspace-hack" }
build/runner/build.rs (new file)
@@ -0,0 +1,13 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

fn main() {
    println!(
        "cargo:rustc-env=TARGET={}",
        if std::env::var("MAC_X86").is_ok() {
            "x86_64-apple-darwin".into()
        } else {
            std::env::var("TARGET").unwrap()
        }
    );
}
build/runner/src/build.rs (new file)
@@ -0,0 +1,173 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{env, fs, io::Write, process::Command};

use camino::Utf8Path;
use clap::Args;
use termcolor::{Color, ColorChoice, ColorSpec, StandardStream, WriteColor};

#[derive(Args)]
pub struct BuildArgs {
    #[arg(trailing_var_arg = true)]
    args: Vec<String>,
}

pub fn run_build(args: BuildArgs) {
    let build_root = setup_build_root();

    let path = if cfg!(windows) {
        format!(
            "out\\bin;out\\extracted\\node;{};\\msys64\\usr\\bin",
            env::var("PATH").unwrap()
        )
    } else {
        format!(
            "out/bin:out/extracted/node/bin:{}",
            env::var("PATH").unwrap()
        )
    };

    maybe_update_env_file(build_root);
    maybe_update_buildhash(build_root, &path);

    // Ensure build file is up to date
    let build_file = build_root.join("build.ninja");
    if !build_file.exists() {
        bootstrap_build();
    } else {
        maybe_reconfigure_build(&build_file, &path);
    }

    // automatically convert foo:bar references to foo_bar, as Ninja cannot
    // represent the former
    let ninja_args = args.args.into_iter().map(|a| a.replace(':', "_"));

    let mut command = Command::new("ninja");
    command
        .arg("-f")
        .arg(&build_file)
        .args(ninja_args)
        .env("NINJA_STATUS", "[%f/%t; %r active; %es] ")
        .env("PATH", path)
        .env(
            "MYPY_CACHE_DIR",
            build_root.join("tests").join("mypy").into_string(),
        )
        .env("PYTHONPYCACHEPREFIX", build_root.join("pycache"))
        // commands will not show colors by default, as we do not provide a tty
        .env("FORCE_COLOR", "1")
        .env("MYPY_FORCE_COLOR", "1")
        .env("TERM", "1")
        // Prevents 'Warn: You must provide the URL of lib/mappings.wasm'.
        // Updating svelte-check or its deps will likely remove the need for it.
        .env("NODE_OPTIONS", "--no-experimental-fetch");

    // run build
    let status = command.status().expect("ninja not installed");
    let mut stdout = StandardStream::stdout(ColorChoice::Always);
    if status.success() {
        stdout
            .set_color(ColorSpec::new().set_fg(Some(Color::Green)).set_bold(true))
            .unwrap();
        writeln!(&mut stdout, "\nBuild succeeded.").unwrap();
        stdout.reset().unwrap();
    } else {
        stdout
            .set_color(ColorSpec::new().set_fg(Some(Color::Red)).set_bold(true))
            .unwrap();
        writeln!(&mut stdout, "\nBuild failed.").unwrap();
        stdout.reset().unwrap();

        // One cause of build failures is when a source file that was included in a glob is
        // removed. Automatically reconfigure on next run so this situation resolves itself.
        fs::remove_file(build_file).expect("build file removal");

        std::process::exit(1);
    }
}

fn setup_build_root() -> &'static Utf8Path {
    let build_root = Utf8Path::new("out");

    #[cfg(unix)]
    if let Ok(new_target) = env::var("BUILD_ROOT").map(camino::Utf8PathBuf::from) {
        let create = if let Ok(existing_target) = build_root.read_link_utf8() {
            if existing_target != new_target {
                fs::remove_file(build_root).unwrap();
                true
            } else {
                false
            }
        } else {
            true
        };
        if create {
            println!("Switching build root to {}", new_target);
            std::os::unix::fs::symlink(new_target, build_root).unwrap();
        }
    }

    fs::create_dir_all(build_root).unwrap();

    build_root
}

fn maybe_reconfigure_build(build_file: &Utf8Path, path: &str) {
    let output = Command::new("ninja")
        .arg("-f")
        .arg(build_file)
        .arg("build_run_configure")
        .env("PATH", path)
        .output()
        .expect("ninja installed");
    if !output.status.success() {
        // The existing build.ninja may be invalid if files have been renamed/removed;
        // resort to a slower cargo invocation instead to regenerate it.
        bootstrap_build();
    }
}

fn bootstrap_build() {
    let status = Command::new("cargo")
        .args(["run", "-p", "configure"])
        .status();
    assert!(status.expect("cargo").success());
}

fn maybe_update_buildhash(build_root: &Utf8Path, path_env: &str) {
    // only updated on release builds
    let path = build_root.join("buildhash");
    if env::var("RELEASE").is_ok() || !path.exists() {
        write_if_changed(&path, &get_buildhash(path_env))
    }
}

fn get_buildhash(path: &str) -> String {
    let output = Command::new("git")
        .args(["rev-parse", "--short=8", "HEAD"])
        .env("PATH", path)
        .output()
        .expect("git");
    assert!(output.status.success(), "git failed");
    String::from_utf8(output.stdout).unwrap().trim().into()
}

fn write_if_changed(path: &Utf8Path, contents: &str) {
    if let Ok(old_contents) = fs::read_to_string(path) {
        if old_contents == contents {
            return;
        }
    }
    fs::write(path, contents).unwrap();
}

/// Trigger reconfigure when our env vars change
fn maybe_update_env_file(build_root: &Utf8Path) {
    let env_file = build_root.join("env");
    let build_root_env = env::var("BUILD_ROOT").unwrap_or_default();
    let release = env::var("RELEASE").unwrap_or_default();
    let other_watched_env = env::var("RECONFIGURE_KEY").unwrap_or_default();
    let current_env = format!("{build_root_env};{release};{other_watched_env}");

    write_if_changed(&env_file, &current_env);
}
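Because Ninja target names cannot contain `:`, the runner rewrites colon-separated group names before invoking Ninja, matching the `group.replace(':', "_")` on the render side. The mapping is a one-liner:

```rust
// Map a user-facing target like "check:format:python" to the phony
// target name Ninja actually knows about: "check_format_python".
fn ninja_target(name: &str) -> String {
    name.replace(':', "_")
}

fn main() {
    println!("{}", ninja_target("check:jest"));
    println!("{}", ninja_target("format:python:qt"));
}
```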
build/runner/src/bundle/artifacts.rs (new file)
@@ -0,0 +1,55 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{env, fs, process::Command};

use camino::Utf8PathBuf;
use clap::Args;

use crate::run::run_silent;

#[derive(Args, Debug)]
pub struct BuildArtifactsArgs {
    bundle_root: Utf8PathBuf,
    pyoxidizer_bin: String,
}

pub fn build_artifacts(args: BuildArtifactsArgs) {
    // build.rs doesn't declare inputs from venv, so we need to force a rebuild to ensure
    // changes to our libs/the venv get included
    let artifacts = args.bundle_root.join("artifacts");
    if artifacts.exists() {
        fs::remove_dir_all(&artifacts).unwrap();
    }
    let bundle_root = args.bundle_root.canonicalize_utf8().unwrap();

    run_silent(
        Command::new(&args.pyoxidizer_bin)
            .args([
                "--system-rust",
                "run-build-script",
                "qt/bundle/build.rs",
                "--var",
                "venv",
                "out/bundle/pyenv",
                "--var",
                "build",
                bundle_root.join("build").as_str(),
            ])
            .env("CARGO_MANIFEST_DIR", "qt/bundle")
            .env("CARGO_TARGET_DIR", "out/bundle/rust")
            .env("PROFILE", "release")
            .env("OUT_DIR", &artifacts)
            .env("TARGET", env!("TARGET"))
            .env("MACOSX_DEPLOYMENT_TARGET", macos_deployment_target())
            .env("CARGO_BUILD_TARGET", env!("TARGET")),
    );
}

pub fn macos_deployment_target() -> &'static str {
    if env!("TARGET") == "x86_64-apple-darwin" {
        "10.13.4"
    } else {
        "11"
    }
}
build/runner/src/bundle/binary.rs (new file)
@@ -0,0 +1,41 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::process::Command;

use camino::Utf8Path;

use super::artifacts::macos_deployment_target;
use crate::run::run_silent;

pub fn build_bundle_binary() {
    let mut features = String::from("build-mode-prebuilt-artifacts");
    if cfg!(target_os = "linux") || (cfg!(target_os = "macos") && cfg!(target_arch = "aarch64")) {
        features.push_str(",global-allocator-jemalloc,allocator-jemalloc");
    }

    let mut command = Command::new("cargo");
    command
        .args([
            "build",
            "--manifest-path=qt/bundle/Cargo.toml",
            "--target-dir=out/bundle/rust",
            "--release",
            "--no-default-features",
        ])
        .arg(format!("--features={features}"))
        .env(
            "DEFAULT_PYTHON_CONFIG_RS",
            // included in main.rs, so relative to qt/bundle/src
            "../../../out/bundle/artifacts/",
        )
        .env(
            "PYO3_CONFIG_FILE",
            Utf8Path::new("out/bundle/artifacts/pyo3-build-config-file.txt")
                .canonicalize_utf8()
                .unwrap(),
        )
        .env("MACOSX_DEPLOYMENT_TARGET", macos_deployment_target())
        .env("CARGO_BUILD_TARGET", env!("TARGET"));
    run_silent(&mut command);
}
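The feature list assembled in `build_bundle_binary` is conditional: the jemalloc allocator features are appended only on Linux and Apple Silicon macOS. A sketch of that logic with the platform checks lifted into parameters so it can be exercised on any host (the flags are arguments here, whereas the original uses `cfg!` at compile time):

```rust
// Build the --features value for the bundle binary. `linux`, `macos` and
// `aarch64` stand in for the cfg!(...) checks in the original.
fn bundle_features(linux: bool, macos: bool, aarch64: bool) -> String {
    let mut features = String::from("build-mode-prebuilt-artifacts");
    if linux || (macos && aarch64) {
        features.push_str(",global-allocator-jemalloc,allocator-jemalloc");
    }
    features
}

fn main() {
    // Intel macOS: no jemalloc features appended.
    println!("{}", bundle_features(false, true, false));
    // Linux: jemalloc features included.
    println!("{}", bundle_features(true, false, false));
}
```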
build/runner/src/bundle/folder.rs (new file)
@@ -0,0 +1,143 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{env, fs, process::Command};

use camino::{Utf8Path, Utf8PathBuf};
use clap::{Args, ValueEnum};

use crate::{
    paths::{absolute_msys_path, unix_path},
    run::run_silent,
};

#[derive(Clone, Copy, ValueEnum, Debug)]
enum DistKind {
    Standard,
    Alternate,
}

#[derive(Args, Debug)]
pub struct BuildDistFolderArgs {
    kind: DistKind,
    folder_root: Utf8PathBuf,
}

pub fn build_dist_folder(args: BuildDistFolderArgs) {
    let BuildDistFolderArgs { kind, folder_root } = args;
    fs::create_dir_all(&folder_root).unwrap();
    // Start with Qt, as it's the largest, and we use --delete to ensure there are no
    // stale files in lib/. Skipped on macOS as Qt is handled later.
    if !cfg!(target_os = "macos") {
        copy_qt_from_venv(kind, &folder_root);
    }
    clean_top_level_files(&folder_root);
    copy_binary_and_pylibs(&folder_root);
    if cfg!(target_os = "linux") {
        copy_linux_extras(kind, &folder_root);
    } else if cfg!(windows) {
        copy_windows_extras(&folder_root);
    }
    fs::write(folder_root.with_extension("stamp"), b"").unwrap();
}

fn copy_qt_from_venv(kind: DistKind, folder_root: &Utf8Path) {
    let python39 = if cfg!(windows) { "" } else { "python3.9/" };
    let qt_root = match kind {
        DistKind::Standard => {
            folder_root.join(format!("../pyenv/lib/{python39}site-packages/PyQt6"))
        }
        DistKind::Alternate => {
            folder_root.join(format!("../pyenv-qt5/lib/{python39}site-packages/PyQt5"))
        }
    };
    let src_path = absolute_msys_path(&qt_root);
    let lib_path = folder_root.join("lib");
    fs::create_dir_all(&lib_path).unwrap();
    let dst_path = with_slash(absolute_msys_path(&lib_path));
    run_silent(Command::new("rsync").args([
        "-a",
        "--delete",
        "--exclude-from",
        "qt/bundle/qt.exclude",
        &src_path,
        &dst_path,
    ]));
}

fn copy_linux_extras(kind: DistKind, folder_root: &Utf8Path) {
    // add README, installer, etc
    run_silent(Command::new("rsync").args(["-a", "qt/bundle/lin/", &with_slash(folder_root)]));

    // add extra IME plugins from download
    let lib_path = folder_root.join("lib");
    let src_path = folder_root
        .join("../../extracted/linux_qt_plugins")
        .join(match kind {
            DistKind::Standard => "qt6",
            DistKind::Alternate => "qt5",
        });
    let dst_path = lib_path.join(match kind {
        DistKind::Standard => "PyQt6/Qt6/plugins",
        DistKind::Alternate => "PyQt5/Qt5/plugins",
    });
    run_silent(Command::new("rsync").args(["-a", &with_slash(src_path), &with_slash(dst_path)]));
}

fn copy_windows_extras(folder_root: &Utf8Path) {
    run_silent(Command::new("rsync").args([
        "-a",
        "out/extracted/win_amd64_audio/",
        &with_slash(folder_root),
    ]));
}

fn clean_top_level_files(folder_root: &Utf8Path) {
    let mut to_remove = vec![];
    for entry in std::fs::read_dir(folder_root).unwrap() {
        let entry = entry.unwrap();
        if entry.file_name() == "lib" {
            continue;
        } else {
            to_remove.push(entry.path());
        }
    }
    for path in to_remove {
        if path.is_dir() {
            std::fs::remove_dir_all(path).unwrap()
        } else {
            std::fs::remove_file(path).unwrap()
        }
    }
}

fn with_slash<P>(path: P) -> String
where
    P: AsRef<str>,
{
    format!("{}/", path.as_ref())
}

fn copy_binary_and_pylibs(folder_root: &Utf8Path) {
    let binary = folder_root
        .join("../rust")
        .join(env!("TARGET"))
        .join("release")
        .join(if cfg!(windows) { "anki.exe" } else { "anki" });
    let extra_files = folder_root
        .join("../build")
        .join(env!("TARGET"))
        .join("release/resources/extra_files");
    run_silent(Command::new("rsync").args([
        "-a",
        "--exclude",
        "PyQt6",
        // misleading, as it misses the GPL PyQt, and our Rust/JS
        // dependencies
        "--exclude",
        "COPYING.txt",
        &unix_path(&binary),
        &with_slash(unix_path(&extra_files)),
        &with_slash(unix_path(folder_root)),
|
||||
]));
|
||||
}
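The trailing slash added by `with_slash` is significant to rsync: a source of `dir/` copies the directory's contents into the destination, while `dir` copies the directory itself. A standalone copy of the helper (the path used is illustrative, not one the build actually passes):

```rust
// Standalone copy of the with_slash helper above, so it can run on its own.
fn with_slash<P: AsRef<str>>(path: P) -> String {
    format!("{}/", path.as_ref())
}

fn main() {
    // With the trailing slash, rsync copies the contents of lib/ into the
    // destination rather than creating a nested lib/ directory there.
    assert_eq!(with_slash("out/bundle/std/lib"), "out/bundle/std/lib/");
}
```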
build/runner/src/bundle/mod.rs (new file)
@@ -0,0 +1,6 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

pub mod artifacts;
pub mod binary;
pub mod folder;
build/runner/src/main.rs (new file)
@@ -0,0 +1,62 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

//! A helper for invoking one or more commands in a cross-platform way,
//! silencing their output when they succeed. Most build actions implicitly use
//! the 'run' command; we define separate commands for more complicated actions.

mod build;
mod bundle;
mod paths;
mod pyenv;
mod rsync;
mod run;
mod yarn;

use std::error::Error;

use build::{run_build, BuildArgs};
use bundle::{
    artifacts::{build_artifacts, BuildArtifactsArgs},
    binary::build_bundle_binary,
    folder::{build_dist_folder, BuildDistFolderArgs},
};
use clap::{Parser, Subcommand};
use pyenv::{setup_pyenv, PyenvArgs};
use rsync::{rsync_files, RsyncArgs};
use run::{run_commands, RunArgs};
use yarn::{setup_yarn, YarnArgs};

pub type Result<T, E = Box<dyn Error>> = std::result::Result<T, E>;

#[derive(Parser)]
struct Cli {
    #[command(subcommand)]
    command: Command,
}

#[derive(Subcommand)]
enum Command {
    Pyenv(PyenvArgs),
    Yarn(YarnArgs),
    Rsync(RsyncArgs),
    Run(RunArgs),
    Build(BuildArgs),
    BuildArtifacts(BuildArtifactsArgs),
    BuildBundleBinary,
    BuildDistFolder(BuildDistFolderArgs),
}

fn main() {
    match Cli::parse().command {
        Command::Pyenv(args) => setup_pyenv(args),
        Command::Run(args) => run_commands(args),
        Command::Rsync(args) => rsync_files(args),
        Command::Yarn(args) => setup_yarn(args),
        Command::Build(args) => run_build(args),
        Command::BuildArtifacts(args) => build_artifacts(args),
        Command::BuildBundleBinary => build_bundle_binary(),
        Command::BuildDistFolder(args) => build_dist_folder(args),
    };
}
build/runner/src/paths.rs (new file)
@@ -0,0 +1,23 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use camino::Utf8Path;

/// On Unix, just a normal path. On Windows, c:\foo\bar.txt becomes /c/foo/bar.txt,
/// which msys rsync expects.
pub fn absolute_msys_path(path: &Utf8Path) -> String {
    let path = path.canonicalize_utf8().unwrap().into_string();
    if !cfg!(windows) {
        return path;
    }

    // strip off \\? verbatim prefix, which things like rsync/ninja choke on
    let drive = &path.chars().nth(4).unwrap();
    // and \ -> /
    format!("/{drive}/{}", path[7..].replace('\\', "/"))
}

/// Converts backslashes to forward slashes
pub fn unix_path(path: &Utf8Path) -> String {
    path.as_str().replace('\\', "/")
}
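The Windows branch of `absolute_msys_path` can be illustrated with a simplified standalone sketch. `msys_path` is a hypothetical name for this illustration; the real function first canonicalizes via camino, which is what produces the `\\?\` verbatim prefix being stripped here:

```rust
/// Simplified sketch of the \\?\C:\foo -> /C/foo rewrite done above on
/// Windows (canonicalization via camino omitted).
fn msys_path(verbatim: &str) -> String {
    // A canonicalized Windows path looks like \\?\C:\foo\bar.txt.
    // Char index 4 is the drive letter; byte 7 onwards is the remainder.
    let drive = verbatim.chars().nth(4).unwrap();
    format!("/{drive}/{}", verbatim[7..].replace('\\', "/"))
}

fn main() {
    assert_eq!(msys_path(r"\\?\C:\foo\bar.txt"), "/C/foo/bar.txt");
}
```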
build/runner/src/pyenv.rs (new file)
@@ -0,0 +1,56 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::process::Command;

use camino::Utf8Path;
use clap::Args;

use crate::run::run_silent;

#[derive(Args)]
pub struct PyenvArgs {
    python_bin: String,
    pyenv_folder: String,
    initial_reqs: String,
    reqs: Vec<String>,
    #[arg(long, allow_hyphen_values(true))]
    venv_args: Vec<String>,
}

/// Set up a venv if one doesn't already exist, then sync packages with the
/// provided requirements files.
pub fn setup_pyenv(args: PyenvArgs) {
    let pyenv_folder = Utf8Path::new(&args.pyenv_folder);

    let pyenv_bin_folder = pyenv_folder.join(if cfg!(windows) { "scripts" } else { "bin" });
    let pyenv_python = pyenv_bin_folder.join("python");
    let pip_sync = pyenv_bin_folder.join("pip-sync");

    if !pyenv_python.exists() {
        run_silent(
            Command::new(&args.python_bin)
                .args(["-m", "venv"])
                .args(args.venv_args)
                .arg(pyenv_folder),
        );

        if cfg!(windows) {
            // On Windows, upgrading pip throws an error the first time, so we
            // install twice and swallow the first error.
            let _output = Command::new(&pyenv_python)
                .args(["-m", "pip", "install", "-r", &args.initial_reqs])
                .output()
                .unwrap();
        }

        run_silent(Command::new(pyenv_python).args([
            "-m",
            "pip",
            "install",
            "-r",
            &args.initial_reqs,
        ]));
    }

    run_silent(Command::new(pip_sync).args(&args.reqs));
}
build/runner/src/rsync.rs (new file)
@@ -0,0 +1,39 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::process::Command;

use camino::Utf8Path;
use clap::Args;

use crate::{paths::absolute_msys_path, run::run_silent};

#[derive(Args)]
pub struct RsyncArgs {
    #[arg(long, value_delimiter(','), allow_hyphen_values(true))]
    extra_args: Vec<String>,
    #[arg(long)]
    prefix: String,
    #[arg(long, required(true), num_args(..))]
    inputs: Vec<String>,
    #[arg(long)]
    output_dir: String,
}

pub fn rsync_files(args: RsyncArgs) {
    let output_dir = absolute_msys_path(Utf8Path::new(&args.output_dir));
    run_silent(
        Command::new("rsync")
            .current_dir(&args.prefix)
            .arg("--relative")
            .args(args.extra_args)
            .args(args.inputs.iter().map(|i| {
                if cfg!(windows) {
                    i.replace('\\', "/")
                } else {
                    i.clone()
                }
            }))
            .arg(output_dir),
    );
}
build/runner/src/run.rs (new file)
@@ -0,0 +1,90 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{
    io::ErrorKind,
    process::{Command, Output},
};

use clap::Args;

#[derive(Args)]
pub struct RunArgs {
    #[arg(long)]
    stamp: Option<String>,
    #[arg(long, value_parser = split_env)]
    env: Vec<(String, String)>,
    #[arg(trailing_var_arg = true)]
    args: Vec<String>,
}

/// Run one or more commands separated by `&&`, optionally stamping or setting
/// extra env vars.
pub fn run_commands(args: RunArgs) {
    let commands = split_args(args.args);
    for command in commands {
        run_silent(&mut build_command(command, &args.env));
    }
    if let Some(stamp_file) = args.stamp {
        std::fs::write(stamp_file, b"").expect("unable to write stamp file");
    }
}

fn split_env(s: &str) -> Result<(String, String), std::io::Error> {
    if let Some((k, v)) = s.split_once('=') {
        Ok((k.into(), v.into()))
    } else {
        Err(std::io::Error::new(ErrorKind::Other, "invalid env var"))
    }
}

fn build_command(command_and_args: Vec<String>, env: &[(String, String)]) -> Command {
    let mut command = Command::new(&command_and_args[0]);
    command.args(&command_and_args[1..]);
    for (k, v) in env {
        command.env(k, v);
    }
    command
}

/// If multiple commands have been provided separated by &&, split them up.
fn split_args(args: Vec<String>) -> Vec<Vec<String>> {
    let mut commands = vec![];
    let mut current_command = vec![];
    for arg in args.into_iter() {
        if arg == "&&" {
            commands.push(current_command);
            current_command = vec![];
        } else {
            current_command.push(arg)
        }
    }
    if !current_command.is_empty() {
        commands.push(current_command)
    }
    commands
}

/// Log stdout/stderr and exit if command failed; return output on success.
/// If OUTPUT_SUCCESS=1 is defined, output will be shown on success.
pub fn run_silent(command: &mut Command) -> Output {
    let output = command
        .output()
        .unwrap_or_else(|e| panic!("failed to run command: {:?}: {e}", command));
    if !output.status.success() {
        println!(
            "Command failed: \n{}\n{}",
            String::from_utf8_lossy(&output.stdout),
            String::from_utf8_lossy(&output.stderr),
        );
        std::process::exit(output.status.code().unwrap_or(1));
    }
    if std::env::var("OUTPUT_SUCCESS").is_ok() {
        println!(
            "{}{}",
            String::from_utf8_lossy(&output.stdout),
            String::from_utf8_lossy(&output.stderr)
        );
    }
    output
}
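The `&&` handling in `split_args` can be checked in isolation; the sketch below reproduces the same logic so it runs on its own:

```rust
// Standalone copy of the runner's split_args logic: split one flat argument
// list into separate commands at each literal "&&" token.
fn split_args(args: Vec<String>) -> Vec<Vec<String>> {
    let mut commands = vec![];
    let mut current = vec![];
    for arg in args {
        if arg == "&&" {
            commands.push(std::mem::take(&mut current));
        } else {
            current.push(arg);
        }
    }
    if !current.is_empty() {
        commands.push(current);
    }
    commands
}

fn main() {
    let args: Vec<String> = ["echo", "a", "&&", "echo", "b"].map(String::from).into();
    // Two separate commands come out: ["echo", "a"] and ["echo", "b"].
    assert_eq!(split_args(args), vec![vec!["echo", "a"], vec!["echo", "b"]]);
}
```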
build/runner/src/yarn.rs (new file)
@@ -0,0 +1,50 @@
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::{path::Path, process::Command};

use clap::Args;

use crate::run::run_silent;

#[derive(Args)]
pub struct YarnArgs {
    yarn_bin: String,
    stamp: String,
}

pub fn setup_yarn(args: YarnArgs) {
    link_node_modules();

    run_silent(Command::new(&args.yarn_bin).arg("install"));

    std::fs::write(args.stamp, b"").unwrap();
}

/// Unfortunately a lot of the node ecosystem expects the output folder to reside
/// in the repo root, so we need to link in our output folder.
#[cfg(not(windows))]
fn link_node_modules() {
    let target = Path::new("node_modules");
    if target.exists() {
        if !target.is_symlink() {
            panic!("please remove the node_modules folder from the repo root");
        }
    } else {
        std::os::unix::fs::symlink("out/node_modules", target).unwrap();
    }
}

/// Things are more complicated on Windows - having $root/node_modules point to
/// $root/out/node_modules breaks our globs for some reason, so we create the
/// junction in the opposite direction instead. Ninja will have already created
/// some empty folders based on our declared outputs, so we move the created
/// folder into the root.
#[cfg(windows)]
fn link_node_modules() {
    let target = Path::new("out/node_modules");
    let source = Path::new("node_modules");
    if !source.exists() {
        std::fs::rename(target, source).unwrap();
        junction::create(source, target).unwrap()
    }
}
@@ -1,635 +0,0 @@
(deleted: a cargo-raze generated Bazel file of aliases for the project's third-party Rust crates, from ammonia through zstd, plus Stardoc and license exports)
@@ -1,149 +0,0 @@
(deleted: a cargo-raze generated Bazel build file for the reqwest crate, native-tls variant)
@@ -1,153 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice",  # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets

# Unsupported target "blocking" with type "example" omitted

# Unsupported target "form" with type "example" omitted

# Unsupported target "json_dynamic" with type "example" omitted

# Unsupported target "json_typed" with type "example" omitted

# Unsupported target "simple" with type "example" omitted

# Unsupported target "tor_socks" with type "example" omitted

rust_library(
    name = "reqwest",
    srcs = glob(["**/*.rs"]),
    aliases = {
    },
    crate_features = [
        "__rustls",
        "__tls",
        "hyper-rustls",
        "json",
        "mime_guess",
        "multipart",
        "rustls",
        "rustls-native-certs",
        "rustls-tls",
        "rustls-tls-native-roots",
        "rustls-tls-webpki-roots",
        "serde_json",
        "socks",
        "stream",
        "tokio-rustls",
        "tokio-socks",
        "webpki-roots",
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "manual",
    ],
    version = "0.11.3",
    # buildifier: leave-alone
    deps = [
        "@raze__bytes__1_2_1//:bytes",
        "@raze__http__0_2_8//:http",
        "@raze__hyper_timeout__0_4_1//:hyper_timeout",
        "@raze__mime_guess__2_0_4//:mime_guess",
        "@raze__serde__1_0_145//:serde",
        "@raze__serde_json__1_0_85//:serde_json",
        "@raze__serde_urlencoded__0_7_1//:serde_urlencoded",
        "@raze__url__2_3_1//:url",
    ] + selects.with_or({
        # cfg(not(target_arch = "wasm32"))
        (
            "@rules_rust//rust/platform:aarch64-apple-ios",
            "@rules_rust//rust/platform:aarch64-apple-ios-sim",
            "@rules_rust//rust/platform:aarch64-unknown-linux-gnu",
            "@rules_rust//rust/platform:x86_64-apple-darwin",
            "@rules_rust//rust/platform:x86_64-apple-ios",
            "@rules_rust//rust/platform:x86_64-pc-windows-msvc",
            "@rules_rust//rust/platform:x86_64-unknown-linux-gnu",
        ): [
            "@raze__base64__0_13_0//:base64",
            "@raze__encoding_rs__0_8_31//:encoding_rs",
            "@raze__futures_core__0_3_24//:futures_core",
            "@raze__futures_util__0_3_24//:futures_util",
            "@raze__http_body__0_4_5//:http_body",
            "@raze__hyper__0_14_20//:hyper",
            "@raze__hyper_rustls__0_22_1//:hyper_rustls",
            "@raze__ipnet__2_5_0//:ipnet",
            "@raze__lazy_static__1_4_0//:lazy_static",
            "@raze__log__0_4_17//:log",
            "@raze__mime__0_3_16//:mime",
            "@raze__percent_encoding__2_2_0//:percent_encoding",
            "@raze__pin_project_lite__0_2_9//:pin_project_lite",
            "@raze__rustls__0_19_1//:rustls",
            "@raze__rustls_native_certs__0_5_0//:rustls_native_certs",
            "@raze__tokio__1_21_1//:tokio",
            "@raze__tokio_rustls__0_22_0//:tokio_rustls",
            "@raze__tokio_socks__0_5_1//:tokio_socks",
            "@raze__webpki_roots__0_21_1//:webpki_roots",
        ],
        "//conditions:default": [],
    }) + selects.with_or({
        # cfg(windows)
        (
            "@rules_rust//rust/platform:x86_64-pc-windows-msvc",
        ): [
            "@raze__winreg__0_7_0//:winreg",
        ],
        "//conditions:default": [],
    }),
)

# Unsupported target "badssl" with type "test" omitted

# Unsupported target "blocking" with type "test" omitted

# Unsupported target "brotli" with type "test" omitted

# Unsupported target "client" with type "test" omitted

# Unsupported target "cookie" with type "test" omitted

# Unsupported target "gzip" with type "test" omitted

# Unsupported target "multipart" with type "test" omitted

# Unsupported target "proxy" with type "test" omitted

# Unsupported target "redirect" with type "test" omitted

# Unsupported target "timeouts" with type "test" omitted

# Unsupported target "wasm_simple" with type "test" omitted
@@ -1,71 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice",  # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets

rust_library(
    name = "term",
    srcs = glob(["**/*.rs"]),
    aliases = {
    },
    crate_features = [
        "default",
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=term",
        "manual",
    ],
    version = "0.7.0",
    proc_macro_deps = [
        "@raze__rustversion__1_0_9//:rustversion",
    ],
    # buildifier: leave-alone
    deps = [
        "@raze__dirs_next__2_0_0//:dirs_next",
    ] + selects.with_or({
        # cfg(windows)
        (
            "@rules_rust//rust/platform:x86_64-pc-windows-msvc",
        ): [
            "@raze__winapi__0_3_9//:winapi",
        ],
        "//conditions:default": [],
    }),
)

# Unsupported target "terminfo" with type "test" omitted
@@ -1,62 +1,3 @@
This folder integrates Rust crates.io fetching into Bazel.

To add or update dependencies, ensure a local Rust environment is available
(eg `source tools/cargo-env`), then install cargo-raze from git, commit
ef58cddd28b64a0c321356acd16201e52cccfb3f

Also install cargo-license:

```
cargo install cargo-license
```

After adding/updating dependencies in ../rslib/Cargo.toml, change to this
folder and run:

    $ python update.py

or

    $ REPIN=1 python update.py

The former will apply added crates and adjusted version numbers, while leaving
most crate versions alone. The latter will also update pinned dependencies to their
latest compatible versions.

Note: cargo-raze does not currently work when run from Windows, and nobody
has investigated why yet. For now, you'll need a Mac or Linux machine, or
will need to run update.py from within WSL.

A couple of crates need extra work to build with Bazel, and are listed
in ../Cargo.toml. For example:

```toml
[package.metadata.raze.crates.pyo3.'*']
compile_data_attr = "glob([\"**\"])"
```

With minor version updates, you should not normally need to modify
the entries in that file.

Because update.py modifies a lot of files in remote/, it makes it difficult to
review in a PR, and the changes can sometimes break platforms like Windows. For
this reason, please don't submit PRs that do minor version bumps - those will
typically be done after stable releases. If you need a new crate for a feature
you're working on, please raise it in an issue first.

## Reqwest

Things are complicated with reqwest at the moment, because:

- we're using a fork to implement better timeouts for syncing
- we want to build it with different features on Linux (where we can't build a
  wheel that links to OpenSSL), and on other platforms.

For minor version bumps, update.py should take care of updating the versions of
reqwest dependencies.

After making a big update to reqwest via an updated fork, the vendored
BUILD.reqwest.\* files may need updating. To do that, comment native-tls from
the features in rslib/Cargo.toml and run update.py, and copy the file in remote/
over the old vendored file. Then comment the other two deps out, add native-tls
back, and repeat the process.
This folder used to contain Bazel-specific Rust integration, but now only
contains our license-checking script, which should be run when using
cargo update.
3462
cargo/crates.bzl
File diff suppressed because it is too large
@ -17,6 +17,15 @@
  "license_file": null,
  "description": "A simple clean-room implementation of the Adler-32 checksum"
},
{
  "name": "aes",
  "version": "0.7.5",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/block-ciphers",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Pure Rust implementation of the Advanced Encryption Standard (a.k.a. Rijndael) including support for AES in counter mode (a.k.a. AES-CTR)"
},
{
  "name": "ahash",
  "version": "0.7.6",
@@ -56,7 +65,7 @@
{
  "name": "anki",
  "version": "0.0.0",
  "authors": "Ankitects Pty Ltd and contributors",
  "authors": "Ankitects Pty Ltd and contributors <https://help.ankiweb.net>",
  "repository": null,
  "license": "AGPL-3.0-or-later",
  "license_file": null,
@@ -161,6 +170,15 @@
  "license_file": null,
  "description": "encodes and decodes base64 as bytes or utf8"
},
{
  "name": "base64ct",
  "version": "1.5.3",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/formats/tree/master/base64ct",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Pure Rust implementation of Base64 (RFC 4648) which avoids any usages of data-dependent branches/LUTs and thereby provides portable \"best effort\" constant-time operation and embedded-friendly no_std support"
},
{
  "name": "bitflags",
  "version": "1.3.2",
@@ -224,6 +242,24 @@
  "license_file": null,
  "description": "Types and traits for working with bytes"
},
{
  "name": "bzip2",
  "version": "0.4.3",
  "authors": "Alex Crichton <alex@alexcrichton.com>",
  "repository": "https://github.com/alexcrichton/bzip2-rs",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Bindings to libbzip2 for bzip2 compression and decompression exposed as Reader/Writer streams."
},
{
  "name": "bzip2-sys",
  "version": "0.1.11+1.0.8",
  "authors": "Alex Crichton <alex@alexcrichton.com>",
  "repository": "https://github.com/alexcrichton/bzip2-rs",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Bindings to libbzip2 for bzip2 compression and decompression exposed as Reader/Writer streams."
},
{
  "name": "cc",
  "version": "1.0.73",
@@ -251,6 +287,15 @@
  "license_file": null,
  "description": "Date and time library for Rust"
},
{
  "name": "cipher",
  "version": "0.3.0",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/traits",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Traits for describing block ciphers and stream ciphers"
},
{
  "name": "coarsetime",
  "version": "0.1.22",
@@ -296,6 +341,15 @@
  "license_file": null,
  "description": "Bindings to Core Foundation for macOS"
},
{
  "name": "cpufeatures",
  "version": "0.2.5",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/utils",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Lightweight runtime CPU feature detection for x86/x86_64 and aarch64 with no_std support and support for mobile targets including Android and iOS"
},
{
  "name": "crc32fast",
  "version": "1.3.2",
@@ -719,6 +773,15 @@
  "license_file": null,
  "description": "Encoding and decoding data into/from hexadecimal representation."
},
{
  "name": "hmac",
  "version": "0.12.1",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/MACs",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Generic implementation of Hash-based Message Authentication Code (HMAC)"
},
{
  "name": "html5ever",
  "version": "0.26.0",
@@ -1153,7 +1216,7 @@
},
{
  "name": "num_cpus",
  "version": "1.13.1",
  "version": "1.14.0",
  "authors": "Sean McArthur <sean@seanmonstar.com>",
  "repository": "https://github.com/seanmonstar/num_cpus",
  "license": "Apache-2.0 OR MIT",
@@ -1205,6 +1268,15 @@
  "license_file": null,
  "description": "Single assignment cells and lazy values."
},
{
  "name": "opaque-debug",
  "version": "0.3.0",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/utils",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Macro for opaque Debug trait implementation"
},
{
  "name": "openssl",
  "version": "0.10.41",
@@ -1259,6 +1331,24 @@
  "license_file": null,
  "description": "An advanced API for creating custom synchronization primitives."
},
{
  "name": "password-hash",
  "version": "0.4.2",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/traits/tree/master/password-hash",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Traits which describe the functionality of password hashing algorithms, as well as a `no_std`-friendly implementation of the PHC string format (a well-defined subset of the Modular Crypt Format a.k.a. MCF)"
},
{
  "name": "pbkdf2",
  "version": "0.11.0",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/password-hashes/tree/master/pbkdf2",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Generic implementation of PBKDF2"
},
{
  "name": "pct-str",
  "version": "1.1.0",
@@ -1439,15 +1529,6 @@
  "license_file": null,
  "description": "Procedural macros in expression position"
},
{
  "name": "proc-macro-nested",
  "version": "0.1.7",
  "authors": "David Tolnay <dtolnay@gmail.com>",
  "repository": "https://github.com/dtolnay/proc-macro-hack",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Support for nested proc-macro-hack invocations"
},
{
  "name": "proc-macro2",
  "version": "1.0.43",
@@ -1729,7 +1810,7 @@
},
{
  "name": "serde",
  "version": "1.0.145",
  "version": "1.0.147",
  "authors": "Erick Tryzelaar <erick.tryzelaar@gmail.com>|David Tolnay <dtolnay@gmail.com>",
  "repository": "https://github.com/serde-rs/serde",
  "license": "Apache-2.0 OR MIT",
@@ -1747,7 +1828,7 @@
},
{
  "name": "serde_derive",
  "version": "1.0.145",
  "version": "1.0.147",
  "authors": "Erick Tryzelaar <erick.tryzelaar@gmail.com>|David Tolnay <dtolnay@gmail.com>",
  "repository": "https://github.com/serde-rs/serde",
  "license": "Apache-2.0 OR MIT",
@@ -1756,7 +1837,7 @@
},
{
  "name": "serde_json",
  "version": "1.0.85",
  "version": "1.0.88",
  "authors": "Erick Tryzelaar <erick.tryzelaar@gmail.com>|David Tolnay <dtolnay@gmail.com>",
  "repository": "https://github.com/serde-rs/json",
  "license": "Apache-2.0 OR MIT",
@@ -1808,6 +1889,15 @@
  "license_file": null,
  "description": "Minimal dependency free implementation of SHA1 for Rust."
},
{
  "name": "sha1",
  "version": "0.10.5",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/hashes",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "SHA-1 hash function"
},
{
  "name": "sha1_smol",
  "version": "1.0.0",
@@ -1817,6 +1907,15 @@
  "license_file": null,
  "description": "Minimal dependency free implementation of SHA1 for Rust."
},
{
  "name": "sha2",
  "version": "0.10.6",
  "authors": "RustCrypto Developers",
  "repository": "https://github.com/RustCrypto/hashes",
  "license": "Apache-2.0 OR MIT",
  "license_file": null,
  "description": "Pure Rust implementation of the SHA-2 hash function family including SHA-224, SHA-256, SHA-384, and SHA-512."
},
{
  "name": "signal-hook-registry",
  "version": "1.4.0",
@@ -2134,7 +2233,7 @@
},
{
  "name": "tokio",
  "version": "1.21.1",
  "version": "1.21.2",
  "authors": "Tokio Contributors <team@tokio.rs>",
  "repository": "https://github.com/tokio-rs/tokio",
  "license": "MIT",
@@ -2672,9 +2771,18 @@
  "license_file": null,
  "description": "Rust bindings to MS Windows Registry API"
},
{
  "name": "workspace-hack",
  "version": "0.1.0",
  "authors": null,
  "repository": null,
  "license": null,
  "license_file": null,
  "description": "workspace-hack package, managed by hakari"
},
{
  "name": "zip",
  "version": "0.6.2",
  "version": "0.6.3",
  "authors": "Mathijs van de Nes <git@mathijs.vd-nes.nl>|Marli Frost <marli@frost.red>|Ryan Levick <ryan.levick@gmail.com>",
  "repository": "https://github.com/zip-rs/zip.git",
  "license": "MIT",
63
cargo/remote/BUILD.addr2line-0.17.0.bazel
vendored
@@ -1,63 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice",  # Apache-2.0 from expression "Apache-2.0 OR MIT"
])

# Generated Targets

# Unsupported target "addr2line" with type "example" omitted

rust_library(
    name = "addr2line",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2015",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=addr2line",
        "manual",
    ],
    version = "0.17.0",
    # buildifier: leave-alone
    deps = [
        "@raze__gimli__0_26_2//:gimli",
    ],
)

# Unsupported target "correctness" with type "test" omitted

# Unsupported target "output_equivalence" with type "test" omitted

# Unsupported target "parse" with type "test" omitted
56
cargo/remote/BUILD.adler-1.0.2.bazel
vendored
@@ -1,56 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice",  # MIT from expression "0BSD OR (MIT OR Apache-2.0)"
])

# Generated Targets

# Unsupported target "bench" with type "bench" omitted

rust_library(
    name = "adler",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2015",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=adler",
        "manual",
    ],
    version = "1.0.2",
    # buildifier: leave-alone
    deps = [
    ],
)
151
cargo/remote/BUILD.ahash-0.7.6.bazel
vendored
@@ -1,151 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice",  # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets
# buildifier: disable=out-of-order-load
# buildifier: disable=load-on-top
load(
    "@rules_rust//cargo:cargo_build_script.bzl",
    "cargo_build_script",
)

cargo_build_script(
    name = "ahash_build_script",
    srcs = glob(["**/*.rs"]),
    build_script_env = {
    },
    crate_features = [
    ],
    crate_root = "build.rs",
    data = glob(["**"]),
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "manual",
    ],
    version = "0.7.6",
    visibility = ["//visibility:private"],
    deps = [
        "@raze__version_check__0_9_4//:version_check",
    ] + selects.with_or({
        # cfg(any(target_os = "linux", target_os = "android", target_os = "windows", target_os = "macos", target_os = "ios", target_os = "freebsd", target_os = "openbsd", target_os = "netbsd", target_os = "dragonfly", target_os = "solaris", target_os = "illumos", target_os = "fuchsia", target_os = "redox", target_os = "cloudabi", target_os = "haiku", target_os = "vxworks", target_os = "emscripten", target_os = "wasi"))
        (
            "@rules_rust//rust/platform:x86_64-apple-darwin",
            "@rules_rust//rust/platform:x86_64-pc-windows-msvc",
            "@rules_rust//rust/platform:x86_64-unknown-linux-gnu",
            "@rules_rust//rust/platform:aarch64-apple-darwin",
            "@rules_rust//rust/platform:aarch64-apple-ios",
            "@rules_rust//rust/platform:aarch64-unknown-linux-gnu",
            "@rules_rust//rust/platform:x86_64-apple-ios",
        ): [
        ],
        "//conditions:default": [],
    }) + selects.with_or({
        # cfg(not(all(target_arch = "arm", target_os = "none")))
        (
            "@rules_rust//rust/platform:x86_64-apple-darwin",
            "@rules_rust//rust/platform:x86_64-pc-windows-msvc",
            "@rules_rust//rust/platform:x86_64-unknown-linux-gnu",
            "@rules_rust//rust/platform:aarch64-apple-darwin",
            "@rules_rust//rust/platform:aarch64-apple-ios",
            "@rules_rust//rust/platform:aarch64-unknown-linux-gnu",
            "@rules_rust//rust/platform:x86_64-apple-ios",
        ): [
        ],
        "//conditions:default": [],
    }),
)

# Unsupported target "ahash" with type "bench" omitted

# Unsupported target "map" with type "bench" omitted

rust_library(
    name = "ahash",
    srcs = glob(["**/*.rs"]),
    aliases = {
    },
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=ahash",
        "manual",
    ],
    version = "0.7.6",
    # buildifier: leave-alone
    deps = [
        ":ahash_build_script",
    ] + selects.with_or({
        # cfg(any(target_os = "linux", target_os = "android", target_os = "windows", target_os = "macos", target_os = "ios", target_os = "freebsd", target_os = "openbsd", target_os = "netbsd", target_os = "dragonfly", target_os = "solaris", target_os = "illumos", target_os = "fuchsia", target_os = "redox", target_os = "cloudabi", target_os = "haiku", target_os = "vxworks", target_os = "emscripten", target_os = "wasi"))
        (
            "@rules_rust//rust/platform:x86_64-apple-darwin",
            "@rules_rust//rust/platform:x86_64-pc-windows-msvc",
            "@rules_rust//rust/platform:x86_64-unknown-linux-gnu",
            "@rules_rust//rust/platform:aarch64-apple-darwin",
            "@rules_rust//rust/platform:aarch64-apple-ios",
            "@rules_rust//rust/platform:aarch64-unknown-linux-gnu",
            "@rules_rust//rust/platform:x86_64-apple-ios",
        ): [
            "@raze__getrandom__0_2_7//:getrandom",
        ],
        "//conditions:default": [],
    }) + selects.with_or({
        # cfg(not(all(target_arch = "arm", target_os = "none")))
        (
            "@rules_rust//rust/platform:x86_64-apple-darwin",
            "@rules_rust//rust/platform:x86_64-pc-windows-msvc",
            "@rules_rust//rust/platform:x86_64-unknown-linux-gnu",
            "@rules_rust//rust/platform:aarch64-apple-darwin",
            "@rules_rust//rust/platform:aarch64-apple-ios",
            "@rules_rust//rust/platform:aarch64-unknown-linux-gnu",
            "@rules_rust//rust/platform:x86_64-apple-ios",
        ): [
            "@raze__once_cell__1_15_0//:once_cell",
        ],
        "//conditions:default": [],
    }),
)

# Unsupported target "bench" with type "test" omitted

# Unsupported target "map_tests" with type "test" omitted

# Unsupported target "nopanic" with type "test" omitted
57
cargo/remote/BUILD.aho-corasick-0.7.19.bazel
vendored
@@ -1,57 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "unencumbered",  # Unlicense from expression "Unlicense OR MIT"
])

# Generated Targets

rust_library(
    name = "aho_corasick",
    srcs = glob(["**/*.rs"]),
    crate_features = [
        "default",
        "std",
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=aho_corasick",
        "manual",
    ],
    version = "0.7.19",
    # buildifier: leave-alone
    deps = [
        "@raze__memchr__2_5_0//:memchr",
    ],
)
63
cargo/remote/BUILD.ammonia-3.2.1.bazel
vendored
@@ -1,63 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice",  # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets

# Unsupported target "ammonia-cat" with type "example" omitted

rust_library(
    name = "ammonia",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=ammonia",
        "manual",
    ],
    version = "3.2.1",
    # buildifier: leave-alone
    deps = [
        "@raze__html5ever__0_26_0//:html5ever",
        "@raze__maplit__1_0_2//:maplit",
        "@raze__once_cell__1_15_0//:once_cell",
        "@raze__tendril__0_4_3//:tendril",
        "@raze__url__2_3_1//:url",
    ],
)

# Unsupported target "version-numbers" with type "test" omitted
@@ -1,57 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice",  # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets

# Unsupported target "time_zone" with type "example" omitted

rust_library(
    name = "android_system_properties",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=android_system_properties",
        "manual",
    ],
    version = "0.1.5",
    # buildifier: leave-alone
    deps = [
        "@raze__libc__0_2_133//:libc",
    ],
)
116
cargo/remote/BUILD.anyhow-1.0.65.bazel
vendored
@@ -1,116 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice",  # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets
# buildifier: disable=out-of-order-load
# buildifier: disable=load-on-top
load(
    "@rules_rust//cargo:cargo_build_script.bzl",
    "cargo_build_script",
)

cargo_build_script(
    name = "anyhow_build_script",
    srcs = glob(["**/*.rs"]),
    build_script_env = {
    },
    crate_features = [
        "default",
        "std",
    ],
    crate_root = "build.rs",
    data = glob(["**"]),
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "manual",
    ],
    version = "1.0.65",
    visibility = ["//visibility:private"],
    deps = [
    ],
)

rust_library(
    name = "anyhow",
    srcs = glob(["**/*.rs"]),
    crate_features = [
        "default",
        "std",
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=anyhow",
        "manual",
    ],
    version = "1.0.65",
    # buildifier: leave-alone
    deps = [
        ":anyhow_build_script",
    ],
)

# Unsupported target "compiletest" with type "test" omitted

# Unsupported target "test_autotrait" with type "test" omitted

# Unsupported target "test_backtrace" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_boxed" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_chain" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_context" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_convert" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_downcast" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_ensure" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_ffi" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_fmt" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_macros" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_repr" with type "test" omitted
|
||||
|
||||
# Unsupported target "test_source" with type "test" omitted
|
cargo/remote/BUILD.arc-swap-1.5.1.bazel (vendored, 64 lines)
@@ -1,64 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # Apache-2.0 from expression "Apache-2.0 OR MIT"
])

# Generated Targets

# Unsupported target "background" with type "bench" omitted

# Unsupported target "int-access" with type "bench" omitted

# Unsupported target "track" with type "bench" omitted

rust_library(
    name = "arc_swap",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=arc-swap",
        "manual",
    ],
    version = "1.5.1",
    # buildifier: leave-alone
    deps = [
    ],
)

# Unsupported target "random" with type "test" omitted

# Unsupported target "stress" with type "test" omitted
cargo/remote/BUILD.arrayref-0.3.6.bazel (vendored, 60 lines)
@@ -1,60 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "restricted", # BSD-2-Clause from expression "BSD-2-Clause"
])

# Generated Targets

# Unsupported target "array_refs" with type "example" omitted

# Unsupported target "array_refs_with_const" with type "example" omitted

# Unsupported target "simple-case" with type "example" omitted

rust_library(
    name = "arrayref",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2015",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=arrayref",
        "manual",
    ],
    version = "0.3.6",
    # buildifier: leave-alone
    deps = [
    ],
)
cargo/remote/BUILD.arrayvec-0.4.12.bazel (vendored, 97 lines)
@@ -1,97 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets
# buildifier: disable=out-of-order-load
# buildifier: disable=load-on-top
load(
    "@rules_rust//cargo:cargo_build_script.bzl",
    "cargo_build_script",
)

cargo_build_script(
    name = "arrayvec_build_script",
    srcs = glob(["**/*.rs"]),
    build_script_env = {
    },
    crate_features = [
        "default",
        "std",
    ],
    crate_root = "build.rs",
    data = glob(["**"]),
    edition = "2015",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "manual",
    ],
    version = "0.4.12",
    visibility = ["//visibility:private"],
    deps = [
    ],
)

# Unsupported target "arraystring" with type "bench" omitted

# Unsupported target "extend" with type "bench" omitted

rust_library(
    name = "arrayvec",
    srcs = glob(["**/*.rs"]),
    crate_features = [
        "default",
        "std",
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2015",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=arrayvec",
        "manual",
    ],
    version = "0.4.12",
    # buildifier: leave-alone
    deps = [
        ":arrayvec_build_script",
        "@raze__nodrop__0_1_14//:nodrop",
    ],
)

# Unsupported target "serde" with type "test" omitted

# Unsupported target "tests" with type "test" omitted
cargo/remote/BUILD.arrayvec-0.7.2.bazel (vendored, 62 lines)
@@ -1,62 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets

# Unsupported target "arraystring" with type "bench" omitted

# Unsupported target "extend" with type "bench" omitted

rust_library(
    name = "arrayvec",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=arrayvec",
        "manual",
    ],
    version = "0.7.2",
    # buildifier: leave-alone
    deps = [
    ],
)

# Unsupported target "serde" with type "test" omitted

# Unsupported target "tests" with type "test" omitted
cargo/remote/BUILD.async-trait-0.1.57.bazel (vendored, 91 lines)
@@ -1,91 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets
# buildifier: disable=out-of-order-load
# buildifier: disable=load-on-top
load(
    "@rules_rust//cargo:cargo_build_script.bzl",
    "cargo_build_script",
)

cargo_build_script(
    name = "async_trait_build_script",
    srcs = glob(["**/*.rs"]),
    build_script_env = {
    },
    crate_features = [
    ],
    crate_root = "build.rs",
    data = glob(["**"]),
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "manual",
    ],
    version = "0.1.57",
    visibility = ["//visibility:private"],
    deps = [
    ],
)

rust_proc_macro(
    name = "async_trait",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=async-trait",
        "manual",
    ],
    version = "0.1.57",
    # buildifier: leave-alone
    deps = [
        ":async_trait_build_script",
        "@raze__proc_macro2__1_0_43//:proc_macro2",
        "@raze__quote__1_0_21//:quote",
        "@raze__syn__1_0_100//:syn",
    ],
)

# Unsupported target "compiletest" with type "test" omitted

# Unsupported target "test" with type "test" omitted
cargo/remote/BUILD.atty-0.2.14.bazel (vendored, 79 lines)
@@ -1,79 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # MIT from expression "MIT"
])

# Generated Targets

# Unsupported target "atty" with type "example" omitted

rust_library(
    name = "atty",
    srcs = glob(["**/*.rs"]),
    aliases = {
    },
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2015",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=atty",
        "manual",
    ],
    version = "0.2.14",
    # buildifier: leave-alone
    deps = [
    ] + selects.with_or({
        # cfg(unix)
        (
            "@rules_rust//rust/platform:x86_64-apple-darwin",
            "@rules_rust//rust/platform:x86_64-unknown-linux-gnu",
            "@rules_rust//rust/platform:aarch64-apple-darwin",
            "@rules_rust//rust/platform:aarch64-apple-ios",
            "@rules_rust//rust/platform:aarch64-unknown-linux-gnu",
            "@rules_rust//rust/platform:x86_64-apple-ios",
        ): [
            "@raze__libc__0_2_133//:libc",
        ],
        "//conditions:default": [],
    }) + selects.with_or({
        # cfg(windows)
        (
            "@rules_rust//rust/platform:x86_64-pc-windows-msvc",
        ): [
            "@raze__winapi__0_3_9//:winapi",
        ],
        "//conditions:default": [],
    }),
)
cargo/remote/BUILD.autocfg-1.1.0.bazel (vendored, 64 lines)
@@ -1,64 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # Apache-2.0 from expression "Apache-2.0 OR MIT"
])

# Generated Targets

# Unsupported target "integers" with type "example" omitted

# Unsupported target "paths" with type "example" omitted

# Unsupported target "traits" with type "example" omitted

# Unsupported target "versions" with type "example" omitted

rust_library(
    name = "autocfg",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2015",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=autocfg",
        "manual",
    ],
    version = "1.1.0",
    # buildifier: leave-alone
    deps = [
    ],
)

# Unsupported target "rustflags" with type "test" omitted
cargo/remote/BUILD.backtrace-0.3.66.bazel (vendored, 111 lines)
@@ -1,111 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets
# buildifier: disable=out-of-order-load
# buildifier: disable=load-on-top
load(
    "@rules_rust//cargo:cargo_build_script.bzl",
    "cargo_build_script",
)

cargo_build_script(
    name = "backtrace_build_script",
    srcs = glob(["**/*.rs"]),
    build_script_env = {
    },
    crate_features = [
        "default",
        "std",
    ],
    crate_root = "build.rs",
    data = glob(["**"]),
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "manual",
    ],
    version = "0.3.66",
    visibility = ["//visibility:private"],
    deps = [
        "@raze__cc__1_0_73//:cc",
    ],
)

# Unsupported target "benchmarks" with type "bench" omitted

# Unsupported target "backtrace" with type "example" omitted

# Unsupported target "raw" with type "example" omitted

rust_library(
    name = "backtrace",
    srcs = glob(["**/*.rs"]),
    crate_features = [
        "default",
        "std",
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=backtrace",
        "manual",
    ],
    version = "0.3.66",
    # buildifier: leave-alone
    deps = [
        ":backtrace_build_script",
        "@raze__addr2line__0_17_0//:addr2line",
        "@raze__cfg_if__1_0_0//:cfg_if",
        "@raze__libc__0_2_133//:libc",
        "@raze__miniz_oxide__0_5_4//:miniz_oxide",
        "@raze__object__0_29_0//:object",
        "@raze__rustc_demangle__0_1_21//:rustc_demangle",
    ],
)

# Unsupported target "accuracy" with type "test" omitted

# Unsupported target "concurrent-panics" with type "test" omitted

# Unsupported target "long_fn_name" with type "test" omitted

# Unsupported target "skip_inner_frames" with type "test" omitted

# Unsupported target "smoke" with type "test" omitted
cargo/remote/BUILD.base64-0.13.0.bazel (vendored, 70 lines)
@@ -1,70 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets

# Unsupported target "benchmarks" with type "bench" omitted

# Unsupported target "base64" with type "example" omitted

# Unsupported target "make_tables" with type "example" omitted

rust_library(
    name = "base64",
    srcs = glob(["**/*.rs"]),
    crate_features = [
        "default",
        "std",
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=base64",
        "manual",
    ],
    version = "0.13.0",
    # buildifier: leave-alone
    deps = [
    ],
)

# Unsupported target "decode" with type "test" omitted

# Unsupported target "encode" with type "test" omitted

# Unsupported target "helpers" with type "test" omitted

# Unsupported target "tests" with type "test" omitted
cargo/remote/BUILD.bitflags-1.3.2.bazel (vendored, 59 lines)
@@ -1,59 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets

rust_library(
    name = "bitflags",
    srcs = glob(["**/*.rs"]),
    crate_features = [
        "default",
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=bitflags",
        "manual",
    ],
    version = "1.3.2",
    # buildifier: leave-alone
    deps = [
    ],
)

# Unsupported target "basic" with type "test" omitted

# Unsupported target "compile" with type "test" omitted
cargo/remote/BUILD.blake3-1.3.1.bazel (vendored, 98 lines)
@@ -1,98 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "unencumbered", # CC0-1.0 from expression "CC0-1.0 OR Apache-2.0"
])

# Generated Targets
# buildifier: disable=out-of-order-load
# buildifier: disable=load-on-top
load(
    "@rules_rust//cargo:cargo_build_script.bzl",
    "cargo_build_script",
)

cargo_build_script(
    name = "blake3_build_script",
    srcs = glob(["**/*.rs"]),
    build_script_env = {
    },
    crate_features = [
        "default",
        "digest",
        "std",
    ],
    crate_root = "build.rs",
    data = glob(["**"]),
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "manual",
    ],
    version = "1.3.1",
    visibility = ["//visibility:private"],
    deps = [
        "@raze__cc__1_0_73//:cc",
    ],
)

# Unsupported target "bench" with type "bench" omitted

rust_library(
    name = "blake3",
    srcs = glob(["**/*.rs"]),
    crate_features = [
        "default",
        "digest",
        "std",
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=blake3",
        "manual",
    ],
    version = "1.3.1",
    # buildifier: leave-alone
    deps = [
        ":blake3_build_script",
        "@raze__arrayref__0_3_6//:arrayref",
        "@raze__arrayvec__0_7_2//:arrayvec",
        "@raze__cfg_if__1_0_0//:cfg_if",
        "@raze__constant_time_eq__0_1_5//:constant_time_eq",
        "@raze__digest__0_10_5//:digest",
    ],
)
cargo/remote/BUILD.block-buffer-0.10.3.bazel (vendored, 57 lines)
@@ -1,57 +0,0 @@
"""
@generated
cargo-raze crate build file.

DO NOT EDIT! Replaced on runs of cargo-raze
"""

# buildifier: disable=load
load("@bazel_skylib//lib:selects.bzl", "selects")

# buildifier: disable=load
load(
    "@rules_rust//rust:defs.bzl",
    "rust_binary",
    "rust_library",
    "rust_proc_macro",
    "rust_test",
)

package(default_visibility = [
    # Public for visibility by "@raze__crate__version//" targets.
    #
    # Prefer access through "//cargo", which limits external
    # visibility to explicit Cargo.toml dependencies.
    "//visibility:public",
])

licenses([
    "notice", # MIT from expression "MIT OR Apache-2.0"
])

# Generated Targets

rust_library(
    name = "block_buffer",
    srcs = glob(["**/*.rs"]),
    crate_features = [
    ],
    crate_root = "src/lib.rs",
    data = [],
    edition = "2018",
    rustc_flags = [
        "--cap-lints=allow",
    ],
    tags = [
        "cargo-raze",
        "crate-name=block-buffer",
        "manual",
    ],
    version = "0.10.3",
    # buildifier: leave-alone
    deps = [
        "@raze__generic_array__0_14_6//:generic_array",
    ],
)

# Unsupported target "mod" with type "test" omitted
Some files were not shown because too many files have changed in this diff.