diff --git a/docs/_static/thumbnail_sh_processing.png b/docs/_static/thumbnail_sh_processing.png new file mode 100644 index 0000000..681156e Binary files /dev/null and b/docs/_static/thumbnail_sh_processing.png differ diff --git a/docs/conf.py b/docs/conf.py index 762c073..7c3b6c0 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -154,4 +154,5 @@ 'gallery/interactive/fast_fourier_transform': '_static/thumbnail_fast_fourier_transform.png', 'gallery/interactive/pyfar_introduction': '_static/pyfar_pf_transparent.png', 'gallery/interactive/pyfar_interactive_plots' : '_static/thumbnail_pyfar_interactive_plots.png', + 'gallery/interactive/spherical_harmonic_hrtf_interpolation' : '_static/thumbnail_sh_processing.png' } diff --git a/docs/examples_gallery.rst b/docs/examples_gallery.rst index 539256e..e6fa9e9 100644 --- a/docs/examples_gallery.rst +++ b/docs/examples_gallery.rst @@ -35,3 +35,4 @@ Examples gallery :glob: gallery/interactive/pyfar_introduction.ipynb + gallery/interactive/spherical_harmonic_hrtf_interpolation.ipynb diff --git a/docs/gallery/interactive/spherical_harmonic_hrtf_interpolation.ipynb b/docs/gallery/interactive/spherical_harmonic_hrtf_interpolation.ipynb new file mode 100644 index 0000000..6be839e --- /dev/null +++ b/docs/gallery/interactive/spherical_harmonic_hrtf_interpolation.ipynb @@ -0,0 +1,636 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "header", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "source": [ + "# Spherical Harmonic Based HRTF Interpolation\n", + "\n", + "Spherical harmonics are orthogonal spherical basis functions that have many use cases in acoustics signal processing. Most notably, the real-valued spherical harmonics are the basis of Ambisonics sound field capture and reproduction systems used in many immersive audio formats. 
But more generally, spherical harmonics are used to describe, analyze, interpolate, and extrapolate sound fields. Examples include estimating the direction of arrival, diffuseness, or directional decay time of sound fields in rooms.\n", + "\n", + "In this notebook, we will look at another common use case: the representation, interpolation, and rotation of so-called head-related transfer functions (HRTFs). These transfer functions describe the sound propagation from a free-field sound source to the left and right ear of a listener. HRTFs are fundamental for immersive audio rendering via headphones and are usually measured for sound sources distributed on a spherical sampling grid - which makes it natural to describe and process them in the spherical harmonic domain.\n", + "\n", + "We will use the [spharpy](https://spharpy.readthedocs.io/) Python package for spherical harmonic processing and take HRTF processing as a chance to introduce core concepts of spharpy and [pyfar](https://pyfar.org). For complete documentation of the related packages and more examples, please visit [pyfar.org](https://pyfar.org).\n", + "\n", + "We recommend visiting the notebooks about pyfar [audio](https://pyfar-gallery.readthedocs.io/en/latest/gallery/interactive/pyfar_audio_objects.html) and [coordinate](https://pyfar-gallery.readthedocs.io/en/latest/gallery/interactive/pyfar_coordinates.html) objects, and the notebook about [binaural synthesis](https://pyfar-gallery.readthedocs.io/en/latest/gallery/static/binaural_synthesis.html) before continuing. We also assume that you are familiar with the basic concepts of spherical harmonics, which are detailed, for example, in [1] and [2].\n", + "\n", + "[1] Rafaely, B. (2019). Fundamentals of spherical array processing (2nd ed.). Springer. https://doi.org/10.1007/978-3-319-99561-8\n", + "\n", + "[2] Zotter, F., & Frank, M. (2019). Ambisonics. 
A practical 3D audio theory for recording, studio production, sound reinforcement, and virtual reality (Vol. 19). Springer Open. https://doi.org/10.1007/978-3-030-17207-7\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "imports", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "outputs": [], + "source": [ + "import spharpy\n", + "import pyfar as pf\n", + "import numpy as np\n", + "import matplotlib.pyplot as plt\n", + "import pooch\n", + "from pyfar.plot.ticker import MultipleFractionFormatter, MultipleFractionLocator\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "download_instructions", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "source": [ + "## Load HRTFs\n", + "\n", + "Let's first load an HRTF dataset as a pyfar *Signal* and the source positions as a pyfar *Coordinates* object. The code below downloads a publicly available HRTF dataset and loads it into the variables `hrirs` and `sources`."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "pooch_files", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "outputs": [], + "source": [ + "# Leave this as it is: This is the URL from which the data will be downloaded\n", + "# and a hash for checking if the download worked.\n", + "url = 'https://github.com/pyfar/files/raw/refs/heads/main/education/VAR_TUB/FABIAN_HRIR_measured_HATO_0.sofa?download='\n", + "hash = '83ebbcd9a09d17679b95d201c9775438c0bb1199d565c3fc7a25448a905cdc3c'\n", + "\n", + "file = pooch.retrieve(\n", + " url, hash, fname='FABIAN_HRIR_measured_HATO_0.sofa', path=None)\n", + "\n", + "# load HRIRs and source positions\n", + "hrirs, sources, _ = pf.io.read_sofa(file)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Inspect HRTFs and Source Positions\n", + "\n", + "Let's quickly inspect the data we are working with by looking at the channel shape (`cshape`) of the loaded objects. 
The Signal contains HRIRs for $Q=11950$ source positions and $2$ ears (the left ear data is contained in `hrirs[:, 0]`, and the right ear data in `hrirs[:, 1]`)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "solution_1", + "locked": false, + "schema_version": 3, + "solution": true, + "task": false + } + }, + "outputs": [], + "source": [ + "hrirs.cshape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Accordingly, $11950$ source positions are stored in the Coordinates object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sources.cshape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A quick plot of the source positions shows that they all have the same radius and represent a full-spherical sampling grid." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "spharpy.plot.scatter(sources)\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Consequently, we can store them in a spharpy [SamplingSphere](https://spharpy.readthedocs.io/en/stable/classes/spharpy.coordinates.html) object, which is specifically intended for point distributions on a sphere." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sources = spharpy.SamplingSphere.from_coordinates(sources)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Spherical Harmonics\n", + "\n", + "We interpret the HRTF as a function on the sphere that is discretely sampled at the source positions. 
This yields the following vector\n", + "\n", + "$$\\mathbf{h} = [h_1, h_2, ..., h_Q]^\\mathrm{T}$$\n", + "\n", + "that represents a single sample of the HRIR or a single frequency bin of the HRTF for one ear and **all** $Q$ source positions.\n", + "\n", + "We can apply the spherical harmonic transform - also referred to as spherical Fourier transform - to $\\mathbf{h}$ to get the vector of $(N+1)^2$ spherical harmonic coefficients\n", + "\n", + "$$\\mathbf{h}_{nm} = \\mathbf{Y}^\\dagger \\, \\mathbf{h} = [h_{0,0}, h_{1,-1}, h_{1,0}, h_{1,1}, ..., h_{N,N}]^\\mathrm{T}$$\n", + "\n", + "where the coefficients are sorted by order $n$ and degree $m$ up to the maximum spherical harmonic order $N$.\n", + "\n", + "In the above, $\\mathbf{Y} \\in \\mathbb{R}^{Q \\times (N+1)^2}$ denotes the matrix\n", + "\n", + "$$\n", + "\\mathbf{Y} = \\begin{bmatrix}\n", + "Y_0^0(\\theta_1, \\phi_1) & Y_1^{-1}(\\theta_1, \\phi_1) & Y_1^0(\\theta_1, \\phi_1) & \\cdots & Y_N^N(\\theta_1, \\phi_1) \\\\\n", + "Y_0^0(\\theta_2, \\phi_2) & Y_1^{-1}(\\theta_2, \\phi_2) & Y_1^0(\\theta_2, \\phi_2) & \\cdots & Y_N^N(\\theta_2, \\phi_2) \\\\\n", + "\\vdots & \\vdots & \\vdots & \\ddots & \\vdots \\\\\n", + "Y_0^0(\\theta_Q, \\phi_Q) & Y_1^{-1}(\\theta_Q, \\phi_Q) & Y_1^0(\\theta_Q, \\phi_Q) & \\cdots & Y_N^N(\\theta_Q, \\phi_Q) \\end{bmatrix}\n", + "$$\n", + "\n", + "containing the real-valued spherical harmonic basis functions $Y_n^m$ evaluated at the colatitude $\\theta$ and azimuth $\\phi$ angles of the source positions, and $(\\cdot)^\\dagger$ denotes the pseudo inverse.\n", + "\n", + "Once $\\mathbf{h}_{nm}$ is computed, it can be used to interpolate the HRTFs at any source position $\\hat{\\theta}$, $\\hat{\\phi}$ by means of the inverse spherical harmonic transform\n", + "\n", + "$$\\hat{\\mathbf{h}} = \\mathbf{\\hat{Y}} \\, \\mathbf{h}_{nm}$$\n", + "\n", + "For a single target source position $\\mathbf{\\hat{Y}}$ is\n", + "\n", + "$$\\mathbf{\\hat{Y}} = [Y_0^0(\\hat{\\theta},\\hat{\\phi}), 
Y_1^{-1}(\\hat{\\theta},\\hat{\\phi}), Y_1^0(\\hat{\\theta},\\hat{\\phi}), Y_1^1(\\hat{\\theta},\\hat{\\phi}), ..., Y_N^N(\\hat{\\theta},\\hat{\\phi})]$$\n", + "\n", + "but it can contain any number of source positions in general.\n", + "\n", + "Because $\\mathbf{h}$ contains data for a single sample or frequency bin, the above is done separately for *each* sample or frequency bin." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Spherical Harmonic Definition\n", + "\n", + "Before computing $\\mathbf{Y}$ and its pseudo inverse, we need to decide on the maximum spherical harmonic order and the spherical harmonic definition.\n", + "\n", + "This specific sampling grid supports spherical harmonic processing up to an order of approximately $N=32$. For simplicity, we use a lower order of $N=16$ and the default spherical harmonic definition of spharpy.\n", + "\n", + "The code below creates a [SphericalHarmonicDefinition](https://spharpy.readthedocs.io/en/stable/theory/spherical_harmonic_definition.html) object that conveniently documents how the spherical harmonics are defined." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "n_max = 16\n", + "sh_definition = spharpy.SphericalHarmonicDefinition(n_max)\n", + "\n", + "print(f'{sh_definition.basis_type = }')\n", + "print(f'{sh_definition.normalization = }')\n", + "print(f'{sh_definition.condon_shortley = }')\n", + "print(f'{sh_definition.channel_convention = }')\n", + "print(f'{sh_definition.n_max = }')
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "spherical_harmonics = spharpy.SphericalHarmonics.from_definition(\n", + " sh_definition, sources, inverse_method=\"pseudo_inverse\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Among other properties, it provides the spherical harmonic basis matrix $\\mathbf{Y}$ and its pseudo inverse $\\mathbf{Y}^\\dagger$.\n", + "These matrices are computed on demand and stored inside the `SphericalHarmonics` object. They can be reused across multiple operations and are only recomputed if a property of the `SphericalHarmonics` object changes, for example if you set a different maximum spherical harmonic order." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "task_3", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "source": [ + "## Spherical Harmonic Transform\n", + "\n", + "We can now compute $\\mathbf{h}_{nm} = \\mathbf{Y}^\\dagger \\, \\mathbf{h}$. This can be done with a single [pyfar.matrix_multiplication](https://pyfar.readthedocs.io/en/stable/classes/pyfar.audio.html#pyfar.matrix_multiplication) call that performs the transform for all samples and both ears.\n", + "\n", + "Alternatively, the matrix multiplication operator `@` can be used to perform the transform, as in the cell below. Note that the operator always performs the multiplication in the frequency domain." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hrirs_nm = (spherical_harmonics.basis_inv @ hrirs).T"
Let's convert it to a `SphericalHarmonicSignal`, which conveniently stores the spherical harmonic data and sampling rate along with the spherical harmonic definition." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hrirs_nm = spharpy.SphericalHarmonicSignal.from_definition(\n", + " sh_definition, hrirs_nm.time, hrirs_nm.sampling_rate)\n", + "\n", + "print(f'{hrirs_nm.cshape = }')\n", + "print(f'{hrirs_nm.n_max = }')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For illustration, let's plot the left and right ear spherical harmonic signal of order $1$ and degree $-1$. It is contained in `hrirs_nm[:, 1]` and represents the 'left/right' component of the HRTF dataset.\n", + "This corresponds to the first-order dipole moment oriented along the $y$-axis, as visualized below." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "axs, _, cb = spharpy.plot.balloon_wireframe(sources, spherical_harmonics.basis[:, 1])\n", + "cb.ax.yaxis.set_major_locator(MultipleFractionLocator(1, 2, base=np.pi))\n", + "cb.ax.yaxis.set_major_formatter(MultipleFractionFormatter(1, 2, base=np.pi, base_str=r'\\pi'))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "ax = pf.plot.time_freq(hrirs_nm[:, 1], label=['left ear', 'right ear'])\n", + "ax[1].legend()\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Spherical Harmonic Rotation\n", + "\n", + "A common manipulation is a rotation of the data in the spherical harmonic domain. One use case in Ambisonics is to rotate the sound field to counter head rotations of the listener during headphone playback. 
This creates a naturally stable sound scene that the listener can explore with head rotations.\n", + "\n", + "Mathematically, this is realized by a multiplication with a rotation matrix $\\mathbf{R}$\n", + "\n", + "$$\\mathbf{h}_{nm,\\mathrm{rot}} = \\mathbf{R} \\, \\mathbf{h}_{nm}$$\n", + "\n", + "In spharpy, arbitrary rotations can be applied with the [SphericalHarmonicRotation](https://spharpy.readthedocs.io/en/stable/modules/spharpy.transforms.html#spharpy.transforms.SphericalHarmonicRotation) class. Let's create a rotation by 90 degrees about the z-axis." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "rotation_angle = 90\n", + "rotation = spharpy.transforms.SphericalHarmonicRotation.from_euler(\n", + " 'z', [np.deg2rad(rotation_angle)])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "One way to apply the rotation would be to generate the rotation matrix $\\mathbf{R}$ using\n", + "\n", + "`R = rotation.as_spherical_harmonic_matrix(sh_definition)`\n", + "\n", + "and perform the matrix multiplication introduced above.\n", + "\n", + "In this example, we directly apply the `SphericalHarmonicRotation` object to the `SphericalHarmonicSignal`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hrirs_nm_rotated = rotation.apply(hrirs_nm)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Inverse Spherical Harmonic Transform\n", + "\n", + "We now perform the inverse transform introduced above as\n", + "\n", + "$$\\hat{\\mathbf{h}} = \\mathbf{Y} \\, \\mathbf{h}_{nm}$$\n", + "\n", + "Note two things:\n", + "\n", + "1. If $N$ is sufficiently large, we get $\\hat{\\mathbf{h}} = \\mathbf{h}$ after applying the inverse transform. HRTFs require $N>32$, which means that you will see differences between $\\hat{\\mathbf{h}}$ and $\\mathbf{h}$.\n", + "2. 
In the example below, $\\mathbf{Y}$ is evaluated at the source positions of the original HRTF dataset. In general, it can contain one or multiple arbitrary source positions.\n", + "\n", + "Again, the transform for all time samples can be done with a single call to [pyfar.matrix_multiplication](https://pyfar.readthedocs.io/en/stable/classes/pyfar.audio.html#pyfar.matrix_multiplication). We obtain the interpolated and rotated HRIRs through the inverse transform in the following cell." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "solution_4", + "locked": false, + "schema_version": 3, + "solution": true, + "task": false + } + }, + "outputs": [], + "source": [ + "hrirs_interpolated = pf.matrix_multiplication(\n", + " (spherical_harmonics.basis, hrirs_nm), domain='time', axes=[(0, 1), (1, 0), (0, 1)])\n", + "hrirs_interpolated = pf.Signal(hrirs_interpolated, hrirs.sampling_rate)\n", + "\n", + "hrirs_interpolated_rotated = pf.matrix_multiplication(\n", + " (spherical_harmonics.basis, hrirs_nm_rotated), domain='time', axes=[(0, 1), (1, 0), (0, 1)])\n", + "hrirs_interpolated_rotated = pf.Signal(hrirs_interpolated_rotated, hrirs.sampling_rate)\n", + "\n", + "hrirs_interpolated.cshape" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "task_5", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "source": [ + "## Plot the Results\n", + "\n", + "The interpolated and rotated HRIRs can be visualized using one of the functions defined in `spharpy.plot`. In the cells below, we visualize the frequency spectrum at 1 kHz for all reconstructed source positions for the left ear. The figure uses a spherical map projection to visualize the data on the sphere. The magnitude is encoded in the color map."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "frequency = 1000\n", + "idx = hrirs_interpolated.find_nearest_frequency(frequency)\n", + "\n", + "pf.plot.use()\n", + "fig, axes = plt.subplots(2, 1, subplot_kw={'projection': 'mollweide'}, layout='constrained')\n", + "\n", + "axes[0].set_title('Left ear interpolated HRTF (no rotation)')\n", + "spharpy.plot.pcolor_map(\n", + " sources, 20 * np.log10(np.abs(hrirs_interpolated.freq[:, 0, idx])),\n", + " ax=axes[0])\n", + "axes[0].grid(True)\n", + "axes[1].set_title(f'Left ear interpolated HRTF ({np.round(rotation_angle, 1)} deg. rotation)')\n", + "spharpy.plot.pcolor_map(\n", + " sources, 20 * np.log10(np.abs(hrirs_interpolated_rotated.freq[:, 0, idx])),\n", + " ax=axes[1])\n", + "axes[1].grid(True)\n", + "for ax_i in axes:\n", + " ax_i.set_xticks(np.deg2rad([-90, 0, 90]))\n", + " ax_i.set_yticks(np.deg2rad([-45, 0, 45]))\n", + " ax_i.set_xlabel(r'Azimuth $(^{\\circ})$')\n", + " ax_i.set_ylabel(r'Elevation $(^{\\circ})$')\n", + "plt.show()"
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "solution_5a", + "locked": false, + "schema_version": 3, + "solution": true, + "task": false + } + }, + "outputs": [], + "source": [ + "azimuth_angle = 90\n", + "colatitude_angle = 90\n", + "radius = 1.7\n", + "\n", + "source_position = pf.Coordinates.from_spherical_colatitude(\n", + " np.deg2rad(azimuth_angle),\n", + " np.deg2rad(colatitude_angle),\n", + " radius)\n", + "idx, _ = sources.find_nearest(source_position)\n", + "idx = idx[0]\n", + "\n", + "# plot\n", + "ax = pf.plot.freq(\n", + " hrirs[idx], label=['original: left', 'original:right'])\n", + "pf.plot.freq(\n", + " hrirs_interpolated[idx], ls=\"--\", label=['processed: left', 'processed: right'])\n", + "\n", + "ax.set_ylim(-25, 25)\n", + "ax.legend(loc='lower left')\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "task_6", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "source": [ + "## Exploration\n", + "\n", + "Now that you have coded all the above, it's time to play. For example, you can try to the processing with different spherical harmonic orders to see how the order affects the results. You could also render the HRTFs and listen to the result via headphones. An example for this is given in the [binaural synthesis](https://pyfar-gallery.readthedocs.io/en/latest/gallery/static/binaural_synthesis.html) notebook." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "license", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "source": [ + "# License notice\n", + "\n", + "This notebook © 2026 by [the pyfar developers](https://github.com/orgs/pyfar/people) is licensed under [CC BY 4.0](http://creativecommons.org/licenses/by/4.0/?ref=chooser-v1)\n", + "\n", + "\"CC\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "watermark", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "source": [ + "# Watermark\n", + "\n", + "The following watermark might help others install the specific package versions required to run the notebook. Please give at least the versions of Python, IPython, numpy, and scipy, major third-party packages (e.g., pytorch), and all used pyfar packages." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "nbgrader": { + "grade": false, + "grade_id": "watermark_code", + "locked": true, + "schema_version": 3, + "solution": false, + "task": false + } + }, + "outputs": [], + "source": [ + "%load_ext watermark\n", + "%watermark -v -m -p numpy,scipy,pyfar,sofar,spharpy,watermark" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "gallery (3.12.11)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.12.11" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/requirements.txt b/requirements.txt index a7158c1..8964fc9 100644 --- a/requirements.txt +++ b/requirements.txt @@ -32,4 +32,5 @@ sphinx-design sphinx-favicon watermark pytz -sphinx-copybutton\ No newline at end of file 
+sphinx-copybutton +pooch \ No newline at end of file
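Reviewer note: the pipeline the new notebook walks through (forward transform via the pseudo inverse, a z-rotation of the coefficients, and the inverse transform at new directions) can be sketched with plain numpy for the special case $N=1$. This is a minimal sketch only, assuming a real-valued, ACN-ordered, N3D-normalized basis; the helper `real_sh_order1` and all variable names are illustrative and not part of spharpy or pyfar.

```python
import numpy as np


def real_sh_order1(points):
    # Real-valued spherical harmonics up to N = 1 for unit vectors,
    # in ACN order (0,0), (1,-1), (1,0), (1,1) with N3D normalization.
    x, y, z = points.T
    c = np.sqrt(3 / (4 * np.pi))
    return np.stack(
        [np.full_like(x, 0.5 / np.sqrt(np.pi)), c * y, c * z, c * x], axis=1)


# sampling grid: random directions on the unit sphere
rng = np.random.default_rng(1)
grid = rng.normal(size=(200, 3))
grid /= np.linalg.norm(grid, axis=1, keepdims=True)

# example function on the sphere: a dipole pointing along +y
h = grid[:, 1]

# forward transform: h_nm = pinv(Y) @ h
Y = real_sh_order1(grid)
h_nm = np.linalg.pinv(Y) @ h

# inverse transform at an arbitrary new direction (interpolation)
h_hat = real_sh_order1(np.array([[0.0, 1.0, 0.0]])) @ h_nm  # ~1.0

# rotation by 90 degrees about z: for N = 1, only the degree +/-1
# coefficients mix (this 2x2 block is the z-rotation special case)
alpha = np.pi / 2
R = np.array([[np.cos(alpha), np.sin(alpha)],
              [-np.sin(alpha), np.cos(alpha)]])
h_nm_rot = h_nm.copy()
h_nm_rot[[1, 3]] = R @ h_nm[[1, 3]]

# the y-dipole now points along -x, so evaluating there recovers the peak
h_rot = real_sh_order1(np.array([[-1.0, 0.0, 0.0]])) @ h_nm_rot  # ~1.0
```

These three steps correspond to what the notebook does with `basis_inv @ hrirs`, `rotation.apply(...)`, and the final `pf.matrix_multiplication`, just for all $(N+1)^2$ coefficients and every time sample at once.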