
Conversation

alinelena
Member

This will add two containers: one with janus-core and JupyterHub, and the other with marimo.

@alinelena alinelena requested a review from ElliottKasoar August 4, 2025 15:58
@alinelena alinelena marked this pull request as draft August 4, 2025 15:58
@alinelena
Member Author

alinelena commented Aug 4, 2025

In general this works; we need to make sure it also works with the STFC packages, and we may need extra permissions to be granted. @ElliottKasoar @oerc0122, if you can have a look while I am on holiday.

We still need to:

  • add a README with Docker and Podman instructions
  • maybe even sign the images
  • maybe add an Apptainer definition
  • build for more platforms
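As a starting point for those Docker and Podman instructions, a hedged sketch (the image tags, the `containers/` context, and the published port are assumptions, not final):

```shell
# Build the Jupyter image; the context is containers/, matching where the
# Dockerfiles and environment.yml live in this PR (assumed layout).
docker build -f containers/Dockerfile.jupyter -t janus-jupyter containers

# Podman accepts the same flags.
podman build -f containers/Dockerfile.marimo -t janus-marimo containers

# Run, publishing Jupyter's default port (assumed entrypoint behaviour).
docker run --rm -p 8888:8888 janus-jupyter
```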

@alinelena added the enhancement (New/improved feature or request) and documentation (Improvements or additions to documentation) labels Aug 4, 2025
Member

@ElliottKasoar left a comment


I get an error trying to build these images locally:

 => ERROR [17/18] RUN . /opt/conda/bin/activate &&     mamba env update --quiet --file /tmp/environment.yml &&     mamba clean --all -f -y &&     rm -rf "/home/jovyan/.cache"                         243.5s 
------                                                                                                                                                                                                        
 > [17/18] RUN . /opt/conda/bin/activate &&     mamba env update --quiet --file /tmp/environment.yml &&     mamba clean --all -f -y &&     rm -rf "/home/jovyan/.cache":                                      
11.17 warning  libmamba You are using 'pip' as an additional package manager.                                                                                                                                 
11.17     Be aware that packages installed with 'pip' are managed independently from 'conda-forge' channel.                                                                                                   
230.8   DEPRECATION: Building 'aseMolec' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'aseMolec'. Discussion can be found at https://github.com/pypa/pip/issues/6334
231.0   DEPRECATION: Building 'les' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'les'. Discussion can be found at https://github.com/pypa/pip/issues/6334
232.2   error: subprocess-exited-with-error
232.2   
232.2   × Building wheel for opentsne (pyproject.toml) did not run successfully.
232.2   │ exit code: 1
232.2   ╰─> [58 lines of output]
232.2       /tmp/pip-install-x0btd_ab/opentsne_d8f120e26e584cc2b6783b930ffbb522/tmpmp18qbq6/fftw3.c:1:10: fatal error: fftw3.h: No such file or directory
232.2           1 | #include <fftw3.h>
232.2             |          ^~~~~~~~~
232.2       compilation terminated.
232.2       /tmp/pip-build-env-sr9czyvo/overlay/lib/python3.12/site-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated.
232.2       !!
232.2       
232.2               ********************************************************************************
232.2               Please consider removing the following classifiers in favor of a SPDX license expression:
232.2       
232.2               License :: OSI Approved
232.2       
232.2               See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
232.2               ********************************************************************************
232.2       
232.2       !!
232.2         self._finalize_license_expression()
232.2       FFTW3 header files not found. Using numpy implementation of FFT.
232.2       running bdist_wheel
232.2       running build
232.2       running build_py
232.2       creating build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/__init__.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/sklearn.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/tsne.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/version.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/initialization.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/utils.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/callbacks.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/affinity.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/nearest_neighbors.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       copying openTSNE/metrics.py -> build/lib.linux-aarch64-cpython-312/openTSNE
232.2       creating build/lib.linux-aarch64-cpython-312/openTSNE/_matrix_mul
232.2       copying openTSNE/_matrix_mul/__init__.py -> build/lib.linux-aarch64-cpython-312/openTSNE/_matrix_mul
232.2       creating build/lib.linux-aarch64-cpython-312/openTSNE/dependencies
232.2       copying openTSNE/dependencies/__init__.py -> build/lib.linux-aarch64-cpython-312/openTSNE/dependencies
232.2       creating build/lib.linux-aarch64-cpython-312/openTSNE/dependencies/annoy
232.2       copying openTSNE/dependencies/annoy/__init__.py -> build/lib.linux-aarch64-cpython-312/openTSNE/dependencies/annoy
232.2       running build_ext
232.2       creating tmp/pip-install-x0btd_ab/opentsne_d8f120e26e584cc2b6783b930ffbb522/tmpszmvaeay
232.2       gcc -fno-strict-overflow -Wsign-compare -DNDEBUG -O3 -Wall -fPIC -O3 -isystem /opt/conda/include -fPIC -O3 -isystem /opt/conda/include -fPIC -I/opt/conda/include -I/opt/conda/Library/include -c /tmp/pip-install-x0btd_ab/opentsne_d8f120e26e584cc2b6783b930ffbb522/tmpszmvaeay/omp.c -o tmp/pip-install-x0btd_ab/opentsne_d8f120e26e584cc2b6783b930ffbb522/tmpszmvaeay/omp.o
232.2       gcc tmp/pip-install-x0btd_ab/opentsne_d8f120e26e584cc2b6783b930ffbb522/tmpszmvaeay/omp.o -o /tmp/pip-install-x0btd_ab/opentsne_d8f120e26e584cc2b6783b930ffbb522/tmpszmvaeay/omp.c
232.2       performance hint: openTSNE/quad_tree.pyx:73:0: Exception check on 'update_center_of_mass' will always require the GIL to be acquired.
232.2       Possible solutions:
232.2           1. Declare 'update_center_of_mass' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
232.2           2. Use an 'int' return type on 'update_center_of_mass' to allow an error code to be returned.
232.2       performance hint: openTSNE/quad_tree.pxd:21:18: No exception value declared for 'is_close' in pxd file.
232.2       Users cimporting this function and calling it without the gil will always require an exception check.
232.2       Suggest adding an explicit exception value.
232.2       warning: openTSNE/quad_tree.pyx:166:24: Not all members given for struct 'Node'
232.2       warning: openTSNE/quad_tree.pyx:166:24: Not all members given for struct 'Node'
232.2       Found openmp. Compiling with openmp flags...
232.2       Compiling openTSNE/quad_tree.pyx because it changed.
232.2       [1/1] Cythonizing openTSNE/quad_tree.pyx
232.2       building 'openTSNE.quad_tree' extension
232.2       creating build/temp.linux-aarch64-cpython-312/openTSNE
232.2       g++ -fno-strict-overflow -Wsign-compare -DNDEBUG -O3 -Wall -fPIC -O3 -isystem /opt/conda/include -fPIC -O3 -isystem /opt/conda/include -fPIC -I/opt/conda/include -I/opt/conda/Library/include -I/tmp/pip-build-env-sr9czyvo/overlay/lib/python3.12/site-packages/numpy/_core/include -I/opt/conda/include/python3.12 -c openTSNE/quad_tree.cpp -o build/temp.linux-aarch64-cpython-312/openTSNE/quad_tree.o -O3 -ffast-math -fno-finite-math-only -fno-associative-math -fopenmp
232.2       error: command 'g++' failed: No such file or directory
232.2       [end of output]
232.2   
232.2   note: This error originates from a subprocess, and is likely not a problem with pip.
232.2   ERROR: Failed building wheel for opentsne
232.2   DEPRECATION: Building 'bibtexparser' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'bibtexparser'. Discussion can be found at https://github.com/pypa/pip/issues/6334
235.8   DEPRECATION: Building 'nvidia-ml-py3' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'nvidia-ml-py3'. Discussion can be found at https://github.com/pypa/pip/issues/6334
236.0   DEPRECATION: Building 'plumed' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'plumed'. Discussion can be found at https://github.com/pypa/pip/issues/6334
242.4   DEPRECATION: Building 'antlr4-python3-runtime' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'antlr4-python3-runtime'. Discussion can be found at https://github.com/pypa/pip/issues/6334
242.5   DEPRECATION: Building 'python-hostlist' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'python-hostlist'. Discussion can be found at https://github.com/pypa/pip/issues/6334
242.7 error: failed-wheel-build-for-install
242.7 
242.7 × Failed to build installable wheels for some pyproject.toml based projects
242.7 ╰─> opentsne
243.4 critical libmamba pip failed to update packages
------
Dockerfile.jupyter:181
--------------------
 180 |     USER $NB_USER
 181 | >>> RUN . /opt/conda/bin/activate && \
 182 | >>>     mamba env update --quiet --file /tmp/environment.yml && \
 183 | >>>     mamba clean --all -f -y && \
 184 | >>>     rm -rf "/home/${NB_USER}/.cache"
 185 |     
--------------------
ERROR: failed to solve: process "/bin/sh -c . /opt/conda/bin/activate &&     mamba env update --quiet --file /tmp/environment.yml &&     mamba clean --all -f -y &&     rm -rf \"/home/${NB_USER}/.cache\"" did not complete successfully: exit code: 1

and

 => ERROR [stage-0 10/10] RUN uv pip install  --no-cache-dir cuequivariance==0.5.1   cuequivariance-torch==0.5.1   cuequivariance-ops-torch-cu12==0.5.1   torchvision   pack-mm   seaborn   data-tutori  2.3s 
------                                                                                                                                                                                                        
 > [stage-0 10/10] RUN uv pip install  --no-cache-dir cuequivariance==0.5.1   cuequivariance-torch==0.5.1   cuequivariance-ops-torch-cu12==0.5.1   torchvision   pack-mm   seaborn   data-tutorials   pymatviz   opentsne   pymatgen   pymatviz   git+https://github.com/imagdau/aseMolec@main:                                                                                                                             
0.223 Using Python 3.12.11 environment at: /usr/local                                                                                                                                                         
0.675    Updating https://github.com/imagdau/aseMolec (main)                                                                                                                                                  
1.206     Updated https://github.com/imagdau/aseMolec (01f304f4c828207db68ba801b32f716886edb646)                                                                                                              
2.206   × No solution found when resolving dependencies:
2.206   ╰─▶ Because cuequivariance-ops-cu12==0.5.1 has no wheels with
2.206       a matching platform tag (e.g., `manylinux_2_36_aarch64`)
2.206       and cuequivariance-ops-torch-cu12==0.5.1 depends on
2.206       cuequivariance-ops-cu12==0.5.1, we can conclude that
2.206       cuequivariance-ops-torch-cu12==0.5.1 cannot be used.
2.206       And because you require cuequivariance-ops-torch-cu12==0.5.1, we can
2.206       conclude that your requirements are unsatisfiable.
2.206 
2.206       hint: Wheels are available for `cuequivariance-ops-cu12` (v0.5.1) on the
2.206       following platforms: `manylinux_2_24_x86_64`, `manylinux_2_28_x86_64`,
2.206       `manylinux_2_39_aarch64`
------

 1 warning found (use docker --debug to expand):
 - JSONArgsRecommended: JSON arguments recommended for CMD to prevent unintended behavior related to OS signals (line 66)
Dockerfile.marimo:53
--------------------
  52 |     RUN uv pip install --no-cache-dir 'janus-core[all]@git+https://github.com/stfc/janus-core.git@main'
  53 | >>> RUN uv pip install  --no-cache-dir cuequivariance==0.5.1 \
  54 | >>>   cuequivariance-torch==0.5.1 \
  55 | >>>   cuequivariance-ops-torch-cu12==0.5.1 \
  56 | >>>   torchvision \
  57 | >>>   pack-mm \
  58 | >>>   seaborn \
  59 | >>>   data-tutorials \
  60 | >>>   pymatviz \
  61 | >>>   opentsne \
  62 | >>>   pymatgen \
  63 | >>>   pymatviz \
  64 | >>>   git+https://github.com/imagdau/aseMolec@main
  65 |     
--------------------
ERROR: failed to solve: process "/bin/sh -c uv pip install  --no-cache-dir cuequivariance==0.5.1   cuequivariance-torch==0.5.1   cuequivariance-ops-torch-cu12==0.5.1   torchvision   pack-mm   seaborn   data-tutorials   pymatviz   opentsne   pymatgen   pymatviz   git+https://github.com/imagdau/aseMolec@main" did not complete successfully: exit code: 1

For the first, it may just be missing g++ (and possibly the FFTW headers), and for the latter we may need to remove cuequivariance, but whether we want to try to fix these probably depends on how platform-agnostic we want the images to be?
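If we do decide to fix the first failure, a minimal sketch of the missing build dependencies as a Dockerfile fragment, assuming the base image is Debian/Ubuntu-based like the Jupyter stack images (package names are assumptions):

```shell
# Hypothetical fragment for Dockerfile.jupyter, placed before the
# `mamba env update` step: install the C++ compiler the opentsne wheel
# build invokes, plus the FFTW headers it probes for (fftw3.h).
USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends g++ libfftw3-dev && \
    rm -rf /var/lib/apt/lists/*
USER $NB_USER
```

With FFTW headers present, opentsne would use its FFTW path rather than falling back to the numpy FFT implementation mentioned in the log.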

uses: docker/build-push-action@v5
with:
  context: ./containers
  file: ./containers/Dockerfile.jupyter
Member


Does this work with the relative environment.yml path?

If I run docker build -t janus-jupyter -f ./containers/Dockerfile.jupyter . from the janus-core root, I get an error: "/environment.yml": not found.

Building from within containers is fine, but then the file path in the workflow suggests we're not meant to? Or does the context get applied after the Dockerfile is found?
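For reference, Docker resolves COPY paths against the build context (the final positional argument), not against the Dockerfile's location, so the two invocations below see different files (paths assume this repo's layout):

```shell
# Context is the repo root: a `COPY environment.yml` line looks for
# ./environment.yml at the top level and fails if the file lives in
# containers/.
docker build -t janus-jupyter -f ./containers/Dockerfile.jupyter .

# Context is containers/: the same COPY now resolves to
# containers/environment.yml, matching `context: ./containers` in the
# workflow excerpt quoted above.
docker build -t janus-jupyter -f ./containers/Dockerfile.jupyter ./containers
```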

Member Author


odd

alinelena and others added 2 commits August 11, 2025 09:11
Co-authored-by: ElliottKasoar <45317199+ElliottKasoar@users.noreply.github.com>
was there only for testing on all branches

Co-authored-by: ElliottKasoar <45317199+ElliottKasoar@users.noreply.github.com>
@alinelena
Member Author

232.2 /tmp/pip-install-x0btd_ab/opentsne_d8f120e26e584cc2b6783b930ffbb522/tmpmp18qbq6/fftw3.c:1:10: fatal error: fftw3.h: No such file or directory

Seems this is needed too... I wonder if the arm base images are different from the amd64 ones.
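One way to check, sketched below: inspect what a local build actually targeted, and what the registry publishes per platform (`janus-jupyter` is the local tag from the build command quoted earlier; BASE_IMAGE is a placeholder for whichever base image the Dockerfiles use):

```shell
# Print the OS/architecture recorded in the local image metadata.
docker image inspect janus-jupyter --format '{{.Os}}/{{.Architecture}}'

# List the per-platform entries of a multi-arch image's manifest list,
# to see whether arm64 and amd64 variants are published from the same tag.
docker manifest inspect BASE_IMAGE
```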
