
feat: render aliases for extra_targets of whl_mods. #1903

Closed
wants to merge 1 commit

Conversation

@lalten lalten (Contributor) commented May 15, 2024

I'm trying to migrate rules_ros2 to Bzlmod. There is a dependency on numpy headers that are exposed via an additive_build_content cc_library. One of the last migration challenges is how to properly consume these headers in both Bzlmod and WORKSPACE builds without hardcoding a Python version. See mvukov/rules_ros2#238 (comment) for the WIP PR.

I dug a little into how the aliasing works and came up with a proof of concept in which a list of extra_targets is propagated from pip.whl_mods, like this:

pip = use_extension("@rules_python//python/extensions:pip.bzl", "pip")
pip.whl_mods(
    additive_build_content = """\
cc_library(
    name = "headers",
    hdrs = glob(["site-packages/numpy/core/include/numpy/**/*.h"]),
    includes = ["site-packages/numpy/core/include"],
    deps = ["@rules_python//python/cc:current_py_cc_headers"],
)
""",
    extra_targets = ["headers"],  # <--- this is new
    hub_name = "whl_mods_hub",
    whl_name = "numpy",
)
use_repo(pip, "whl_mods_hub")

This makes it possible to depend on "@rules_ros2_pip_deps//numpy:headers" instead of, e.g., "@rules_ros2_pip_deps_310_numpy//:headers".
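
For illustration, this would render one extra alias into the hub repo's numpy package, next to the existing aliases such as pkg and whl. A rough sketch of what the rendered BUILD.bazel entry could look like (the config_settings label and the single select branch are simplifications; the real rendering would cover every configured Python version):

alias(
    name = "headers",
    actual = select({
        # Assumed config setting; the real rendering has one branch per configured Python version.
        "@rules_python//python/config_settings:is_python_3.10": "@rules_ros2_pip_deps_310_numpy//:headers",
        "//conditions:default": "@rules_ros2_pip_deps_310_numpy//:headers",
    }),
    visibility = ["//visibility:public"],
)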

WORKSPACE builds don't have this problem, but there the label looks slightly different.
That can be worked around with an alias for now:

load("@rules_python//python/private:bzlmod_enabled.bzl", "BZLMOD_ENABLED")
alias(
    name = "rules_ros2_pip_deps_numpy_headers",
    actual = "@rules_ros2_pip_deps//numpy:headers" if BZLMOD_ENABLED else "@rules_ros2_pip_deps_numpy//:headers",
    visibility = ["//visibility:public"],
)

You can see this in action in mvukov/rules_ros2@e2b926c

Please let me know what you think of this. Maybe there is a more elegant way to achieve this?

@lalten lalten requested a review from rickeylev as a code owner May 15, 2024 23:26
@lalten lalten mentioned this pull request May 15, 2024
@aignas aignas (Collaborator) commented May 15, 2024

Thanks for this PR! At some point I had experimented with different ideas on this topic and this solution was one of them.

Other solutions that could achieve the goal in the repository context:

  • We could also parse the additive_build_content and collect every line that starts with name = ", deriving the extra aliases from those. That way the user does not need to define the target names in two places (see the sketch after this list).
  • We could add a regular patch-file interface for the hub repo BUILD.bazel files.
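
A minimal sketch of the first idea, as a hypothetical helper in the extension (not existing rules_python code), assuming we scan the raw additive_build_content string:

def _target_names_from_build_content(additive_build_content):
    """Collects the values of `name = "..."` attributes in a build content string."""
    names = []
    for line in additive_build_content.splitlines():
        line = line.strip()
        if line.startswith('name = "'):
            # Drop the `name = "` prefix and the trailing quote and comma.
            names.append(line[len('name = "'):].rstrip('",'))
    return names

Fed the additive_build_content from the example above, this would return ["headers"] without the user repeating the name in extra_targets.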

However, a better way could be to solve this in the build phase, where we use one of the predefined targets and extract the headers from it. The easiest way I could figure out was:

genrule(
    name = "numpy_headers",
    srcs = ["@pypi//numpy:whl"],
    outs = [
        # "numpy/core/include/numpy/__multiarray_api.c",
        # "numpy/core/include/numpy/__multiarray_api.h",
        # "numpy/core/include/numpy/__ufunc_api.c",
        # "numpy/core/include/numpy/__ufunc_api.h",
        "numpy/core/include/numpy/_dtype_api.h",
        "numpy/core/include/numpy/_neighborhood_iterator_imp.h",
        "numpy/core/include/numpy/_numpyconfig.h",
        "numpy/core/include/numpy/arrayobject.h",
        "numpy/core/include/numpy/arrayscalars.h",
        "numpy/core/include/numpy/experimental_dtype_api.h",
        "numpy/core/include/numpy/halffloat.h",
        "numpy/core/include/numpy/ndarrayobject.h",
        "numpy/core/include/numpy/ndarraytypes.h",
        "numpy/core/include/numpy/noprefix.h",
        "numpy/core/include/numpy/npy_1_7_deprecated_api.h",
        "numpy/core/include/numpy/npy_3kcompat.h",
        "numpy/core/include/numpy/npy_common.h",
        "numpy/core/include/numpy/npy_cpu.h",
        "numpy/core/include/numpy/npy_endian.h",
        "numpy/core/include/numpy/npy_interrupt.h",
        "numpy/core/include/numpy/npy_math.h",
        "numpy/core/include/numpy/npy_no_deprecated_api.h",
        "numpy/core/include/numpy/npy_os.h",
        "numpy/core/include/numpy/numpyconfig.h",
        "numpy/core/include/numpy/old_defines.h",
        "numpy/core/include/numpy/random/bitgen.h",
        "numpy/core/include/numpy/random/distributions.h",
        "numpy/core/include/numpy/random/libdivide.h",
        "numpy/core/include/numpy/ufuncobject.h",
        "numpy/core/include/numpy/utils.h",
    ],
    cmd = "unzip $< -d $(RULEDIR) -x $(OUTS)",
)

I wonder if there are better ways of extracting filegroups from the whls. A custom build rule that extracts particular files from a whl could be the best solution, because that is something that can live in @rules_python and not in the third-party repos.

The API that I think we could have would be something like:

load("@rules_python//python:pip.bzl", "whl_filegroup")

whl_filegroup(
    name = "numpy_includes",
    whl = "@pypi//numpy:whl",
    prefixes = [
        "numpy/core/include/numpy",
    ],
)
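
Such a target could then back the numpy headers cc_library directly, roughly like this (a sketch only; the exact includes path depends on how the rule lays out the extracted files):

cc_library(
    name = "numpy_headers",
    hdrs = [":numpy_includes"],
    includes = ["numpy_includes/numpy/core/include"],
    deps = ["@rules_python//python/cc:current_py_cc_headers"],
)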

Because each whl has a RECORD file, we could use that RECORD file as the index of files that we need to expose in the outputs of the build rule.
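
One possible shape for such a rule, as a rough sketch only (not the implementation that eventually landed), would be to declare a directory output and run a small extractor over the wheel; the _extractor tool referenced below is hypothetical:

def _whl_filegroup_impl(ctx):
    # Declare a single output directory and let the extractor populate it.
    out_dir = ctx.actions.declare_directory(ctx.attr.name)
    args = ctx.actions.args()
    args.add(ctx.file.whl)
    args.add(out_dir.path)
    args.add_all(ctx.attr.prefixes)
    ctx.actions.run(
        outputs = [out_dir],
        inputs = [ctx.file.whl],
        executable = ctx.executable._extractor,
        arguments = [args],
        mnemonic = "WhlFilegroup",
    )
    return [DefaultInfo(files = depset([out_dir]))]

whl_filegroup = rule(
    implementation = _whl_filegroup_impl,
    attrs = {
        "whl": attr.label(allow_single_file = [".whl"], mandatory = True),
        "prefixes": attr.string_list(),
        # Hypothetical helper binary that unzips only the members matching the
        # prefixes; it could consult the wheel's RECORD file to decide what to write.
        "_extractor": attr.label(
            default = "//tools:whl_extractor",
            executable = True,
            cfg = "exec",
        ),
    },
)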

@lalten lalten mentioned this pull request May 16, 2024
github-merge-queue bot pushed a commit that referenced this pull request May 20, 2024
This adds a `whl_filegroup` rule which can take a wheel as input and extract it, making the output available to consumers. A pattern is allowed to better control what files are extracted.

This is handy to, e.g., get the numpy headers from the numpy wheel so they can be used in a cc_library.

Closes #1903 
Fixes #1311
@lalten lalten deleted the extra-targets branch May 21, 2024 05:36