diff --git a/examples/00-wrapping_numpy_capabilities.ipynb b/examples/00-wrapping_numpy_capabilities.ipynb new file mode 100644 index 00000000..ccb8988c --- /dev/null +++ b/examples/00-wrapping_numpy_capabilities.ipynb @@ -0,0 +1,129 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n\n# Write a user-defined Operator\nThis example shows how to create a simple DPF python plugin holding a single Operator.\nThis Operator, called \"easy_statistics\", computes simple statistical quantities on a scalar Field with\nthe help of numpy.\nIt is a simple example showing how routines can be wrapped in DPF python plugins.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Write Operator\nTo write the simplest DPF python plugins, a single python script is necessary.\nAn Operator implementation deriving from :class:`ansys.dpf.core.custom_operator.CustomOperatorBase`\nand a call to :py:func:`ansys.dpf.core.custom_operator.record_operator` are the two necessary steps to create a plugin.\nThe \"easy_statistics\" Operator takes a Field as input and returns the first quartile, the median,\nthe third quartile, and the variance. The python Operator and its recording sit in the\nfile plugins/easy_statistics.py. 
This file `easy_statistics.py` is downloaded and displayed here:\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from ansys.dpf.core import examples\n\nGITHUB_SOURCE_URL = \"https://github.com/pyansys/pydpf-core/raw/examples/first_python_plugins/python-plugins\"\nEXAMPLE_FILE = GITHUB_SOURCE_URL + \"/easy_statistics.py\"\noperator_file_path = examples.downloads._retrieve_file(EXAMPLE_FILE, \"easy_statistics.py\", \"python-plugins\")\n\nimport IPython\nprint(IPython.display.Code(operator_file_path))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load Plugin\nOnce a python plugin is written, it can be loaded with the function :py:func:`ansys.dpf.core.core.load_library`\ntaking as its first argument the path to the directory of the plugin, as its second argument ``py_`` + the name of\nthe python script, and as its last argument the name of the function used to record operators.\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import os\nfrom ansys.dpf import core as dpf\nfrom ansys.dpf.core import examples\n\noperator_server_file_path = dpf.upload_file_in_tmp_folder(operator_file_path)\ndpf.load_library(os.path.dirname(operator_server_file_path), \"py_easy_statistics\", \"load_operators\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once the Operator is loaded, it can be instantiated with:\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "new_operator = dpf.Operator(\"easy_statistics\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To use this new Operator, a workflow computing the norm of the displacement\nis connected to the \"easy_statistics\" Operator.\nMethods of the class ``easy_statistics`` are dynamically 
added thanks to the Operator's\nspecification defined in the plugin.\n\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + ".. graphviz::\n\n digraph foo {\n graph [pad=\"0.5\", nodesep=\"0.3\", ranksep=\"0.3\"]\n node [shape=box, style=filled, fillcolor=\"#ffcc00\", margin=\"0\"];\n rankdir=LR;\n splines=line;\n ds [label=\"ds\", shape=box, style=filled, fillcolor=cadetblue2];\n ds -> displacement [style=dashed];\n displacement -> norm;\n norm -> easy_statistics;\n }\n\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Use the Custom Operator\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "ds = dpf.DataSources(dpf.upload_file_in_tmp_folder(examples.static_rst))\ndisplacement = dpf.operators.result.displacement(data_sources=ds)\nnorm = dpf.operators.math.norm(displacement)\nnew_operator.inputs.connect(norm)\n\n\nprint(\"first quartile is\", new_operator.outputs.first_quartile())\nprint(\"median is\", new_operator.outputs.median())\nprint(\"third quartile is\", new_operator.outputs.third_quartile())\nprint(\"variance is\", new_operator.outputs.variance())" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.9.13" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} \ No newline at end of file diff --git a/examples/01-package_python_operators.ipynb b/examples/01-package_python_operators.ipynb new file mode 100644 index 00000000..a2c6bea3 --- /dev/null +++ b/examples/01-package_python_operators.ipynb @@ -0,0 +1,129 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "metadata": { + 
"collapsed": false + }, + "outputs": [], + "source": [ + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n\n# Write user-defined Operators as a package\nThis example shows how more complex DPF python plugins of Operators can be created as standard python packages.\nThe benefits of writing packages instead of simple scripts are componentization (splitting the code into several\npython modules or files), distribution (with packages, standard python tools can be used to upload and\ndownload packages), and documentation (READMEs, docs, tests, and examples can be added to the package).\n\nThis plugin holds two different Operators:\n - One returning all the scoping ids with data higher than the average\n - One returning all the scoping ids with data lower than the average\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Write Operator\nFor this more advanced use case, a python package is created.\nEach Operator implementation derives from :class:`ansys.dpf.core.custom_operator.CustomOperatorBase`\nand a call to :py:func:`ansys.dpf.core.custom_operator.record_operator` records the Operators of the plugin.\nThe python package `average_filter_plugin` is downloaded and displayed here:\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import IPython\nimport os\nfrom ansys.dpf.core import examples\n\nprint('\\033[1m average_filter_plugin')\nfile_list = [\"__init__.py\", \"operators.py\", \"operators_loader.py\", \"common.py\"]\nplugin_folder = None\nGITHUB_SOURCE_URL = \"https://github.com/pyansys/pydpf-core/raw/examples/first_python_plugins/python-plugins/average_filter_plugin\"\n\nfor file in file_list:\n # GITHUB_SOURCE_URL already ends with /average_filter_plugin, so only the file name is appended\n EXAMPLE_FILE = GITHUB_SOURCE_URL + \"/\" + file\n operator_file_path = examples.downloads._retrieve_file(EXAMPLE_FILE, file, \"python-plugins/average_filter_plugin\")\n 
plugin_folder = os.path.dirname(operator_file_path)\n print(f'\\033[1m {file}:\\n \\033[0m')\n print('\\t\\t\\t'.join(('\\n' + str(IPython.display.Code(operator_file_path)).lstrip()).splitlines(True)))\n print(\"\\n\\n\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load Plugin\nOnce a python plugin is written as a package, it can be loaded with the function\n:py:func:`ansys.dpf.core.core.load_library` taking as its first argument the path to the directory of the plugin,\nas its second argument ``py_`` + any name identifying the plugin,\nand as its last argument the name of the function exposed in the ``__init__.py`` file and used to record operators.\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import os\nfrom ansys.dpf import core as dpf\nfrom ansys.dpf.core import examples\n\n\ntmp = dpf.make_tmp_dir_server()\ndpf.upload_files_in_folder(\n dpf.path_utilities.join(tmp, \"average_filter_plugin\"),\n plugin_folder\n)\ndpf.load_library(\n dpf.path_utilities.join(tmp, \"average_filter_plugin\"),\n \"py_average_filter\",\n \"load_operators\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once the plugin is loaded, Operators recorded in the plugin can be used with:\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "new_operator = dpf.Operator(\"ids_with_data_lower_than_average\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To use this new Operator, a workflow computing the norm of the displacement\nis connected to the \"ids_with_data_lower_than_average\" Operator.\nMethods of the class ``ids_with_data_lower_than_average`` are dynamically added thanks to the Operator's\nspecification.\n\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + ".. 
graphviz::\n\n digraph foo {\n graph [pad=\"0.5\", nodesep=\"0.3\", ranksep=\"0.3\"]\n node [shape=box, style=filled, fillcolor=\"#ffcc00\", margin=\"0\"];\n rankdir=LR;\n splines=line;\n ds [label=\"ds\", shape=box, style=filled, fillcolor=cadetblue2];\n ds -> displacement [style=dashed];\n displacement -> norm;\n norm -> ids_with_data_lower_than_average;\n }\n\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Use the Custom Operator\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "ds = dpf.DataSources(dpf.upload_file_in_tmp_folder(examples.static_rst))\ndisplacement = dpf.operators.result.displacement(data_sources=ds)\nnorm = dpf.operators.math.norm(displacement)\nnew_operator.inputs.connect(norm)\n\n\nnew_scoping = new_operator.outputs.scoping()\nprint(\"scoping in was:\", norm.outputs.field().scoping)\nprint(\"----------------------------------------------\")\nprint(\"scoping out is:\", new_scoping)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.9.13" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} \ No newline at end of file diff --git a/examples/02-python_operators_with_dependencies.ipynb b/examples/02-python_operators_with_dependencies.ipynb new file mode 100644 index 00000000..dabd4f61 --- /dev/null +++ b/examples/02-python_operators_with_dependencies.ipynb @@ -0,0 +1,165 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + 
"\n\n# Write user-defined Operators with third-party dependencies\nThis example shows how advanced DPF python plugins of Operators can be created as standard python packages\nand how third-party python module dependencies can be added to the package.\nFor a first introduction to user-defined python Operators, see example `ref_wrapping_numpy_capabilities`,\nand for a simpler example of user-defined python Operators as a package, see `ref_python_plugin_package`.\n\nThis plugin holds an Operator whose implementation depends on a third party python module named\n`gltf `_. This Operator takes a path, a mesh, and a 3D vector field as input\nand exports the mesh and the norm of the input field to a gltf file located at the given path.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Write Operator\nFor this more advanced use case, a python package is created.\nEach Operator implementation derives from :class:`ansys.dpf.core.custom_operator.CustomOperatorBase`\nand a call to :py:func:`ansys.dpf.core.custom_operator.record_operator` records the Operators of the plugin.\nThe python package `gltf_plugin` is downloaded and displayed here:\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import IPython\nimport os\nfrom ansys.dpf.core import examples\n\nprint('\\033[1m gltf_plugin')\nfile_list = [\"gltf_plugin/__init__.py\", \"gltf_plugin/operators.py\", \"gltf_plugin/operators_loader.py\",\n \"gltf_plugin/requirements.txt\", \"gltf_plugin/gltf_export.py\", \"gltf_plugin/texture.png\", \"gltf_plugin.xml\"]\nplugin_path = None\nGITHUB_SOURCE_URL = \"https://github.com/pyansys/pydpf-core/raw/examples/first_python_plugins/python-plugins\"\n\nfor file in file_list:\n # entries of file_list already carry the gltf_plugin/ prefix, so only the file name is appended\n EXAMPLE_FILE = GITHUB_SOURCE_URL + \"/\" + file\n operator_file_path = examples.downloads._retrieve_file(\n EXAMPLE_FILE, file, os.path.join(\"python-plugins\", 
os.path.dirname(file)))\n\n print(f'\\033[1m {file}\\n \\033[0m')\n if (os.path.splitext(file)[1] == \".py\" or os.path.splitext(file)[1] == \".xml\") and file != \"gltf_plugin/gltf_export.py\":\n print('\\t\\t\\t'.join(('\\n' + str(IPython.display.Code(operator_file_path)).lstrip()).splitlines(True)))\n print(\"\\n\\n\")\n if plugin_path is None:\n plugin_path = os.path.dirname(operator_file_path)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To add third party modules as dependencies to a custom DPF python plugin, a folder or zip file\nwith the sites of the dependencies needs to be created and referenced in an xml located next to the plugin's folder\nand having the same name as the plugin plus the ``.xml`` extension. The ``site`` python module is used by DPF when\ncalling :py:func:`ansys.dpf.core.core.load_library` function to add these custom sites to the python interpreter path.\nTo create these custom sites, the requirements of the custom plugin should be installed in a python virtual\nenvironment, the site-packages (with unnecessary folders removed) should be zipped and put with the plugin. 
The\npath to this zip should be referenced in the xml as done above.\n\nTo simplify this step, a requirements file can be added to the plugin, like:\n\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "print(f'\\033[1m gltf_plugin/requirements.txt: \\n \\033[0m')\nprint('\\t', IPython.display.Code(os.path.join(plugin_path, \"requirements.txt\")))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And this :download:`powershell script ` for Windows or\nthis :download:`shell script ` can be run with the mandatory arguments:\n\n- -pluginpath : path to the folder of the plugin.\n- -zippath : output zip file name.\n\nOptional arguments are:\n\n- -pythonexe : path to a python executable of your choice.\n- -tempfolder : path to a temporary folder to work in; default is the environment variable ``TEMP`` on Windows and /tmp/ on Linux.\n\nFor Windows PowerShell, call::\n\n create_sites_for_python_operators.ps1 -pluginpath /path/to/plugin -zippath /path/to/plugin/assets/winx64.zip\n\nFor Linux shell, call::\n\n create_sites_for_python_operators.sh -pluginpath /path/to/plugin -zippath /path/to/plugin/assets/linx64.zip\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import subprocess\n\nif os.name == \"nt\" and not os.path.exists(os.path.join(plugin_path, 'assets', 'gltf_sites_winx64.zip')):\n CMD_FILE_URL = GITHUB_SOURCE_URL + \"/create_sites_for_python_operators.ps1\"\n cmd_file = examples.downloads._retrieve_file(CMD_FILE_URL, \"create_sites_for_python_operators.ps1\",\n \"python-plugins\")\n run_cmd = f\"powershell {cmd_file}\"\n args = f\" -pluginpath \\\"{plugin_path}\\\" -zippath {os.path.join(plugin_path, 'assets', 'gltf_sites_winx64.zip')}\"\n print(run_cmd + args)\n process = subprocess.run(run_cmd + args, stdout=subprocess.PIPE, stderr=subprocess.PIPE)\n if 
process.stderr:\n raise RuntimeError(\n \"Installing pygltf in a virtual environment failed with error:\\n\" + process.stderr.decode())\n else:\n print(\"Installing pygltf in a virtual environment succeeded\")\nelif os.name == \"posix\" and not os.path.exists(os.path.join(plugin_path, 'assets', 'gltf_sites_linx64.zip')):\n CMD_FILE_URL = GITHUB_SOURCE_URL + \"/create_sites_for_python_operators.sh\"\n # save the downloaded shell script under its .sh name (not .ps1)\n cmd_file = examples.downloads._retrieve_file(CMD_FILE_URL, \"create_sites_for_python_operators.sh\",\n \"python-plugins\")\n run_cmd = f\"{cmd_file}\"\n args = f\" -pluginpath \\\"{plugin_path}\\\" -zippath \\\"{os.path.join(plugin_path, 'assets', 'gltf_sites_linx64.zip')}\\\"\"\n print(run_cmd + args)\n os.system(f\"chmod u=rwx,o=x {cmd_file}\")\n os.system(run_cmd + args)\n print(\"\\nInstalling pygltf in a virtual environment succeeded\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load Plugin\nOnce a python plugin is written as a package, it can be loaded with the function\n:py:func:`ansys.dpf.core.core.load_library` taking as its first argument the path to the directory of the plugin,\nas its second argument ``py_`` + any name identifying the plugin,\nand as its last argument the name of the function exposed in the ``__init__.py`` file and used to record operators.\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from ansys.dpf import core as dpf\nfrom ansys.dpf.core import examples\n\ntmp = dpf.make_tmp_dir_server()\ndpf.upload_files_in_folder(\n dpf.path_utilities.join(tmp, \"plugins\", \"gltf_plugin\"),\n plugin_path\n)\ndpf.upload_file(\n plugin_path + \".xml\",\n dpf.path_utilities.join(tmp, \"plugins\", \"gltf_plugin.xml\")\n)\n\ndpf.load_library(\n dpf.path_utilities.join(tmp, \"plugins\", \"gltf_plugin\"),\n \"py_dpf_gltf\",\n \"load_operators\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once the plugin is loaded, 
Operators recorded in the plugin can be used with:\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "new_operator = dpf.Operator(\"gltf_export\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This new Operator ``gltf_export`` requires a triangle surface mesh and a displacement Field on this surface mesh,\nas well as an export path, as inputs.\nTo demonstrate this new Operator, a :class:`ansys.dpf.core.model.Model` on a simple file is created, and the\n:class:`ansys.dpf.core.operators.mesh.tri_mesh_skin` Operator is used to extract the surface of the mesh as triangle\nelements.\n\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Use the Custom Operator\n\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import os\n\nmodel = dpf.Model(dpf.upload_file_in_tmp_folder(examples.static_rst))\n\nmesh = model.metadata.meshed_region\nskin_mesh = dpf.operators.mesh.tri_mesh_skin(mesh=mesh)\n\ndisplacement = model.results.displacement()\ndisplacement.inputs.mesh_scoping(skin_mesh)\ndisplacement.inputs.mesh(skin_mesh)\nnew_operator.inputs.path(os.path.join(tmp, \"out\"))\nnew_operator.inputs.mesh(skin_mesh)\nnew_operator.inputs.field(displacement.outputs.fields_container()[0])\nnew_operator.run()\n\nprint(\"operator ran successfully\")\n\ndpf.download_file(os.path.join(tmp, \"out.glb\"), os.path.join(os.getcwd(), \"out.glb\"))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The gltf Operator output can be downloaded :download:`here `.\n\n" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + 
"nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.9.13" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} \ No newline at end of file