Building
Currently we support building against standalone USD, Houdini, Maya and Nuke on Linux and Windows. If you don't want to compile it yourself, you can also download pre-compiled builds from our release page. To load the resolver, you must specify a few environment variables; see our Resolvers > Environment Variables section for more details.
Setting up our build environment
After installing the requirements, we first need to set a couple of environment variables that our CMake file depends on.
Using our convenience setup script
On Linux we provide a bash script that you can source to set up our development environment. It sets the environment variables needed to build the resolver as well as those your USD-capable application needs to load it. This can be done by running the following from the source directory:
source setup.sh
In the setup.sh file you can define which resolver to compile by setting the AR_RESOLVER_NAME variable to one of the resolvers listed in our Resolvers section, in camelCase syntax (for example fileResolver or pythonResolver). Here you'll also have to define which application version to compile against.
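For reference, the relevant part of setup.sh might look something like the following sketch (this assumes setup.sh reuses the AR_DCC_NAME and HFS variables from the manual setup below; the actual layout of the file may differ):
export AR_RESOLVER_NAME=fileResolver   # which resolver to compile (camelCase)
export AR_DCC_NAME=houdini             # which application to build against
export HFS=/opt/hfs<HoudiniVersion>    # which application version/install to use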
The script will then automatically set the PATH, PYTHONPATH, PXR_PLUGINPATH_NAME and LD_LIBRARY_PATH environment variables to the correct paths, so that after you run the compile the resolver will be loaded correctly (e.g. if you launch Houdini via houdinifx, it will load everything correctly). The build process also logs this information again.
By default it also sets the TF_DEBUG env var to AR_RESOLVER_INIT, so that you'll get logs of which resolver is loaded by USD's plugin system, which you can use to verify that everything is working correctly.
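As a quick sanity check after sourcing the script and running the build, you can query the active resolver from a USD-capable Python interpreter (a sketch; it assumes such an interpreter, for example the one from your standalone USD build or the DCC's bundled one, is on your PATH):
export TF_DEBUG=AR_RESOLVER_INIT
python -c "from pxr import Ar; print(Ar.GetResolver())"
# The AR_RESOLVER_INIT debug output should mention the resolver plugin that was picked up.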
Manually setting up the environment
If you don't want to use our convenience script, you can also set up the environment manually.
# Linux
## Standalone
export AR_DCC_NAME=standalone
export USD_STANDALONE_ROOT="/path/to/usd/standalone/root"
## Houdini
export AR_DCC_NAME=houdini
export HFS="/path/to/houdini/root" # For example "/opt/hfs<HoudiniVersion>"
## Maya
export AR_DCC_NAME=maya
export MAYA_USD_SDK_ROOT="/path/to/maya/usd/sdk/root/.../mayausd/USD"
export MAYA_USD_SDK_DEVKIT_ROOT="/path/to/maya/usd/sdk/root/.../content/of/devkit.zip"
export PYTHON_ROOT="/path/to/python/root"
## Nuke
export AR_DCC_NAME=nuke
export NUKE_ROOT="/path/to/nuke/root"
export BOOST_ROOT="/path/to/boost/root" # The .../include/boost folder must be renamed to .../include/foundryboost
export TBB_ROOT="/path/to/tbb/root"
export PYTHON_ROOT="/path/to/python/root" # Windows only
## Resolver
export AR_RESOLVER_NAME=fileResolver
# Windows
## Standalone
set AR_DCC_NAME=standalone
set USD_STANDALONE_ROOT="/path/to/usd/standalone/root"
## Houdini
set AR_DCC_NAME=houdini
set HFS="/path/to/houdini/root" # For example "C:\Program Files\Side Effects Software\<HoudiniVersion>"
## Maya
set AR_DCC_NAME=maya
set MAYA_USD_SDK_ROOT="/path/to/maya/usd/sdk/root/.../mayausd/USD"
set MAYA_USD_SDK_DEVKIT_ROOT="/path/to/maya/usd/sdk/root/.../content/of/devkit.zip"
set PYTHON_ROOT="/path/to/python/root"
## Nuke
set AR_DCC_NAME=nuke
set NUKE_ROOT="/path/to/nuke/root"
set BOOST_ROOT="/path/to/boost/root" # The .../include/boost folder must be renamed to .../include/foundryboost
set TBB_ROOT="/path/to/tbb/root"
set PYTHON_ROOT="/path/to/python/root" # Windows only
## Resolver
set AR_RESOLVER_NAME=fileResolver
Running the build
To start the build, run:
# Linux
./build.sh
# Windows
build.bat
The build.sh/build.bat files also contain the environment definition part above (commented out), so alternatively you can just uncomment those lines, adjust them and you are good to go.
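For example, on Linux the top of build.sh might carry the same definitions commented out, along the lines of this sketch (the exact contents of build.sh may differ):
# export AR_DCC_NAME=houdini
# export HFS=/opt/hfs<HoudiniVersion>
# export AR_RESOLVER_NAME=fileResolver
Uncomment and adjust them, then simply run the script as shown above.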
Depending on the application/USD build you are compiling against, there may be additional requirements to be aware of, as documented below.
Standalone
To build against a standalone/pre-built USD distribution, simply specify the root folder via the USD_STANDALONE_ROOT environment variable.
We recommend using Nvidia's pre-compiled OpenUSD builds to avoid having to do a full custom USD build.
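Putting it together, a standalone build on Linux could look like this sketch (the USD location is a placeholder, e.g. an extracted pre-compiled OpenUSD package):
export AR_DCC_NAME=standalone
export USD_STANDALONE_ROOT=/path/to/usd/standalone/root
export AR_RESOLVER_NAME=fileResolver
./build.sh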
Houdini
Starting with Houdini 20, SideFX offers gcc 11 builds that no longer use the old libstdc++ (C++11) ABI. Our automatic GitHub builds make use of this from Houdini 20 onwards. To keep our CMake script working with H19.5, we automatically switch to the old ABI if the Houdini version 19.5 is found in the Houdini root folder path.
If you still want to build with gcc 9 (and the old libstdc++ ABI) on Houdini 20 and upwards, you'll need to set _GLIBCXX_USE_CXX11_ABI=0 as described below and make sure you have the matching Houdini build installed. If you want to enforce it manually, update the line below in our main CMakeLists.txt file: for gcc 9 builds Houdini uses the old libstdc++ ABI, so set it to _GLIBCXX_USE_CXX11_ABI=0; for gcc 11 builds set it to _GLIBCXX_USE_CXX11_ABI=1.
See the official Release Notes for more information.
...
add_compile_definitions(_GLIBCXX_USE_CXX11_ABI=0)
...
Maya
Maya does not ship with Python headers, so we need to self-compile Python at the exact version of the Python build included with the Maya distribution we intend to use.
Our build script then links against the Python version specified by the PYTHON_ROOT env var. Alternatively, the CMake file can be adjusted to use only the headers and link against the libs shipped with Maya.
On Windows, the standard python installer ships with headers, so we can leverage those instead and avoid compilation.
On Linux, we either compile Python ourselves or install the Python development packages via our system package manager. A matching version may not be available for all package managers, which is why we recommend building Python yourself.
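A minimal sketch of self-compiling a matching Python on Linux (the version and install prefix are placeholders; make sure they match the exact Python version shipped with your Maya):
curl -L https://www.python.org/ftp/python/<PythonVersion>/Python-<PythonVersion>.tgz | tar xz
cd Python-<PythonVersion>
./configure --prefix=/path/to/python/root --enable-shared
make -j && make install
export PYTHON_ROOT=/path/to/python/root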
Nuke
Nuke has the following additional requirements:
- Python (Windows only): Nuke itself does not ship with the necessary Python headers on Windows, only with the libs. We either have to self-compile Python or link against an existing compatible Python header folder. Our build script expects the root folder to be specified by the PYTHON_ROOT env var.
- TBB: Nuke itself does not ship with the necessary TBB headers, only with the libs. We either have to self-compile TBB or link against an existing compatible TBB header folder. Our build script expects the root folder to be specified by the TBB_ROOT env var.
- Boost: Nuke itself does not ship with the necessary boost headers, only with the libs. These are namespaced (file- and symbol-wise) to foundryboost. To successfully compile, we have to self-compile boost and then copy/symlink the <root>/include/boost folder to <root>/include/foundryboost. Alternatively we can copy an existing compatible boost header folder to a new location and also copy/symlink it to <root>/include/foundryboost. This way we have identical headers for both namespaces. Our build script expects the root folder to be specified by the BOOST_ROOT env var.
Here is the boost situation explained in more detail:
- Nuke does not ship with boost headers.
- Nuke namespaces boost symbols to foundryboost.
- Nuke doesn't namespace the boost headers/files themselves; instead it re-maps them (by means unknown to us). This way it can include the standard USD headers (that use <boost/...>), but compile to foundryboost symbols.
To solve this, we add a preprocessor definition to namespace boost to foundryboost and we duplicate/symlink the <root>/include/boost folder to <root>/include/foundryboost.
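In practice the header duplication can be a simple symlink inside the boost root that BOOST_ROOT points at (a sketch; the path is a placeholder):
export BOOST_ROOT=/path/to/boost/root
ln -s ${BOOST_ROOT}/include/boost ${BOOST_ROOT}/include/foundryboost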
Testing the build
Unit tests are automatically run post-build on Linux using the standalone/Houdini/Maya/Nuke version you are building against. You can find each resolver's tests in its respective src/<ResolverName>/testenv folder.
Alternatively you can run your application and check whether the resolver executes correctly. If you didn't use our convenience script as noted above, you'll have to specify a few environment variables so that our plugin is correctly detected by USD. Head over to our Resolvers > Environment Variables section for how to do this. After that everything should run smoothly; you can try loading the examples in the "files" directory or work through our example setup section for a simple production example.
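For a quick manual check with a standalone USD build, something along these lines works (a sketch; usdview ships with standalone USD distributions and the file path is a placeholder):
export TF_DEBUG=AR_RESOLVER_INIT
usdview /path/to/files/<example>.usda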
Customize build
If you want to further configure the build, head into the CMakeLists.txt in the root of this repo. In the first section of the file you can configure various things, like the environment variables that the resolvers use, the Python module namespaces and which resolvers to compile.
This is a standard CMakeLists.txt file that you can also configure via the CMake GUI. If you don't want to use the build.sh bash script, you can also configure and compile this project like any other C++ project via this file.
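For example, a plain CMake configure-and-build could look like this sketch (the build directory and install location are assumptions; the environment variables from the sections above still need to be set):
cmake -S . -B build -DCMAKE_INSTALL_PREFIX=/path/to/install
cmake --build build --target install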
Documentation
If you want to locally build this documentation, you'll have to download mdBook and mdBook-admonish and add their parent directories to the PATH env variable so that the executables are found. You can do this via bash (after running source setup.sh):
export MDBOOK_VERSION="0.4.28"
export MDBOOK_ADMONISH_VERSION="1.9.0"
curl -L https://github.com/rust-lang/mdBook/releases/download/v$MDBOOK_VERSION/mdbook-v$MDBOOK_VERSION-x86_64-unknown-linux-gnu.tar.gz | tar xz -C ${REPO_ROOT}/tools
curl -L https://github.com/tommilligan/mdbook-admonish/releases/download/v$MDBOOK_ADMONISH_VERSION/mdbook-admonish-v$MDBOOK_ADMONISH_VERSION-x86_64-unknown-linux-gnu.tar.gz | tar xz -C ${REPO_ROOT}/tools
export PATH=${REPO_ROOT}/tools:$PATH
You can then just run the following to build the documentation in HTML format:
./docs.sh
The documentation will then be built in docs/book.