diff --git a/.gitignore b/.gitignore index a6cb892..933f12c 100644 --- a/.gitignore +++ b/.gitignore @@ -10,7 +10,7 @@ old/ doc/build doc/source/generated extra-packages -*.json + # vs-code .vscode/ diff --git a/.gitmodules b/.gitmodules new file mode 100644 index 0000000..cb83897 --- /dev/null +++ b/.gitmodules @@ -0,0 +1,3 @@ +[submodule "hololinked/system_host/assets/hololinked-server-swagger-api"] + path = hololinked/system_host/assets/hololinked-server-swagger-api + url = https://github.com/VigneshVSV/hololinked-server-swagger-api.git diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 0000000..a649779 --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,112 @@ + +# Contributing to hololinked + +First off, thanks for taking the time to contribute! + +All types of contributions are encouraged and valued. See the [Table of Contents](#table-of-contents) for different ways to help and details about how this project handles them. Please make sure to read the relevant section before making your contribution. It will make it a lot easier for us maintainers and smooth out the experience for all involved. The community looks forward to your contributions. 🎉 + +> And if you like the project, but just don't have time to contribute, that's fine. 
There are other easy ways to support the project and show your appreciation, which we would also be very happy about:
+> - Star the project
+> - Tweet about it
+> - Create examples & refer to this project in your project's readme
+> - Mention the project at local meetups and tell your friends/colleagues
+
+
+## Table of Contents
+
+- [I Have a Question](#i-have-a-question)
+- [I Want To Contribute](#i-want-to-contribute)
+  - [Reporting Bugs](#reporting-bugs)
+  - [Suggesting Enhancements](#suggesting-enhancements)
+
+
+## I Have a Question
+
+> If you want to ask a question, first refer to the how-to section of the [documentation](https://hololinked.readthedocs.io/en/latest/index.html). If the documentation does not answer your question, you may open a new discussion in the Q&A section of GitHub discussions.
+It is also advisable to search the internet for answers first.
+
+If you believe your question might also be a bug, it is best to search for existing [Issues](https://github.com/VigneshVSV/hololinked/issues) that might help you. In case you have found a suitable issue and still need clarification, you can write your question in this issue. If no suitable issue is found:
+
+- Open an [Issue](https://github.com/VigneshVSV/hololinked/issues/new).
+- Provide as much context as you can about what you're running into.
+- Provide project and platform versions (Python version, OS, etc.), depending on what seems relevant.
+
+We will then take care of the issue as soon as possible. 
+
+## I Want To Contribute
+
+> ### Legal Notice
+> When contributing to this project, you must agree that you have authored 100% of the content, that you have the necessary rights to the content and that the content you contribute may be provided under the project license.
+
+### Reporting Bugs
+
+
+#### Before Submitting a Bug Report
+
+A good bug report shouldn't leave others needing to chase you up for more information. Therefore, we ask you to investigate carefully, collect information and describe the issue in detail in your report. Please complete the following steps in advance to help us fix any potential bug as fast as possible.
+
+- Make sure that you are using the latest version.
+- Determine if your bug is really a bug and not an error on your side, e.g. using incompatible environment components/versions (make sure that you have read the [documentation](https://hololinked.readthedocs.io/en/latest/index.html). If you are looking for support, you might want to check [this section](#i-have-a-question)).
+- To see if other users have experienced (and potentially already solved) the same issue you are having, check whether a bug report already exists for your bug or error in the [bug tracker](https://github.com/VigneshVSV/hololinked/issues?q=label%3Abug).
+- Also make sure to search the internet (including Stack Overflow) to see if users outside of the GitHub community have discussed the issue.
+- Collect information about the bug:
+  - Stack trace (Traceback)
+  - OS, Platform and Version (Windows, Linux, macOS, x86, ARM)
+  - Version of the interpreter
+  - Possibly your input and the output
+  - Can you reliably reproduce the issue? And can you also reproduce it with older versions?
+
+
+#### How Do I Submit a Good Bug Report?
+
+> You must never report security related issues, vulnerabilities or bugs including sensitive information to the issue tracker, or elsewhere in public. 
Instead sensitive bugs must be sent by email to vignesh.vaidyanathan@hololinked.dev. + + +We use GitHub issues to track bugs and errors. If you run into an issue with the project: + +- Open an [Issue](https://github.com/VigneshVSV/hololinked/issues/new). (Since we can't be sure at this point whether it is a bug or not, we ask you not to talk about a bug yet and not to label the issue.) +- Explain the behavior you would expect and the actual behavior. +- Please provide as much context as possible and describe the *reproduction steps* that someone else can follow to recreate the issue on their own. This usually includes your code. For good bug reports you should isolate the problem and create a reduced test case. +- Provide the information you collected in the previous section. + +Once it's filed: + +- The project team will label the issue accordingly. +- A team member will try to reproduce the issue with your provided steps. If there are no reproduction steps or no obvious way to reproduce the issue, the team will ask you for those steps and mark the issue as `needs-repro`. Bugs with the `needs-repro` tag will not be addressed until they are reproduced. +- If the team is able to reproduce the issue, it will be marked `needs-fix`, as well as possibly other tags (such as `critical`), and the issue will be left to be [implemented by someone](#your-first-code-contribution). + + + + +### Suggesting Enhancements + +This section guides you through submitting an enhancement suggestion for hololinked, **including completely new features and minor improvements to existing functionality**. Following these guidelines will help maintainers and the community to understand your suggestion and find related suggestions. + + +#### Before Submitting an Enhancement + +- Make sure that you are using the latest version. 
+
+- Read the [documentation](https://hololinked.readthedocs.io/en/latest/index.html) carefully and find out if the functionality is already covered, perhaps by an individual configuration.
+- Perform a [search](https://github.com/VigneshVSV/hololinked/discussions/categories/ideas) to see if the enhancement has already been suggested. If it has, add a comment to the existing idea instead of opening a new one.
+- Find out whether your idea fits with the scope and aims of the project. It's up to you to make a strong case to convince the project's developers of the merits of this feature.
+
+
+#### How Do I Submit a Good Enhancement Suggestion?
+
+Enhancement suggestions are tracked as [GitHub issues](https://github.com/VigneshVSV/hololinked/issues).
+
+- Use a **clear and descriptive title** for the issue to identify the suggestion.
+- Provide a **step-by-step description of the suggested enhancement** in as much detail as possible.
+- **Describe the current behavior** and **explain which behavior you expected to see instead** and why. At this point you can also tell which alternatives do not work for you.
+- You may want to **include screenshots and animated GIFs** which help you demonstrate the steps or point out the part to which the suggestion relates. You can use [this tool](https://www.cockos.com/licecap/) to record GIFs on macOS and Windows, and [this tool](https://github.com/colinkeenan/silentcast) or [this tool](https://github.com/GNOME/byzanz) on Linux.
+- **Explain why this enhancement would be useful** to most hololinked users. You may also want to point out other projects that solved it better and which could serve as inspiration.
+
+
+
+## Attribution
+This guide is based on the **contributing-gen**. [Make your own](https://github.com/bttger/contributing-gen)! 
diff --git a/README.md b/README.md index 99fb808..4312906 100644 --- a/README.md +++ b/README.md @@ -2,46 +2,301 @@ ### Description -For beginners - `hololinked` is a pythonic package suited for instrument control and data acquisition over network using python. -
-
-For those familiar with RPC & web development - `hololinked` is a ZMQ-based RPC toolkit with customizable HTTP end-points.
-The main goal is to develop a pythonic (& pure python) modern package for instrument control through network (SCADA),
-along with native HTTP support for communication with browser clients for browser based UIs.
+For beginners - `hololinked` is a server-side pythonic package suited for instrumentation control and data acquisition over the network, especially with HTTP. If you need to control and capture data from your hardware/instrumentation remotely through your network, show the data in a web browser/dashboard, use IoT tools, provide a Qt-GUI or run automated scripts, hololinked can help. One can start small with a single device and, if interested, move ahead to build a bigger system made of individual components.
+

+For those familiar with RPC & web development - `hololinked` is a ZeroMQ-based Object Oriented RPC toolkit with customizable HTTP end-points.
+The main goal is to develop a pythonic & pure-python modern package for instrumentation control and data acquisition through the network (SCADA), along with "reasonable" HTTP support for web development.
-
-##### NOTE - The package is rather incomplete and uploaded only for API showcase and active development. Even RPC logic is not complete.
+
+[![Documentation Status](https://readthedocs.org/projects/hololinked/badge/?version=latest)](https://hololinked.readthedocs.io/en/latest/?badge=latest) [![Maintainability](https://api.codeclimate.com/v1/badges/913f4daa2960b711670a/maintainability)](https://codeclimate.com/github/VigneshVSV/hololinked/maintainability) ![PyPI](https://img.shields.io/pypi/v/hololinked?label=pypi%20package) ![PyPI - Downloads](https://img.shields.io/pypi/dm/hololinked)
-
-- tutorial webpage - [readthedocs](https://hololinked.readthedocs.io/en/latest/examples/index.html)
-- [example repository](https://github.com/VigneshVSV/hololinked-examples)
-- [helper GUI](https://github.com/VigneshVSV/hololinked-portal) - viewer of object's methods, parameters and events.
-- [web development examples](https://hololinked.dev/docs/category/spectrometer-gui)
+### To Install
+
+From pip - `pip install hololinked`
+
+Or, clone the repository and install in develop mode with `pip install -e .` for convenience. The conda env hololinked.yml can also help.
+
+
+### Usage/Quickstart
+
+`hololinked` is compatible with the [Web of Things](https://www.w3.org/WoT/) recommended pattern for developing hardware/instrumentation control software. Each device or thing can be controlled systematically when its design in software is segregated into properties, actions and events. In object oriented terms for data acquisition:
+- properties are validated get-set attributes of the class which may be used to model device settings, hold captured/computed data etc.
+- actions are methods which issue commands to the device or run arbitrary python logic.
+- events can asynchronously communicate/push data to a client, like alarm messages, streaming captured data etc.
+
+In `hololinked`, the base class which enables this classification is the `Thing` class. Any class that inherits the `Thing` class can instantiate properties, actions and events which become visible to a client in this segregated manner. 
For example, consider an optical spectrometer device, the following code is possible: + +#### Import Statements + +```python + +from hololinked.server import Thing, Property, action, Event +from hololinked.server.properties import String, Integer, Number, List +``` + +#### Definition of one's own hardware controlling class + +subclass from Thing class to "make a network accessible Thing": + +```python + +class OceanOpticsSpectrometer(Thing): + """ + OceanOptics spectrometers using seabreeze library. Device is identified by serial number. + """ + +``` + +#### Instantiating properties + +Say, we wish to make device serial number, integration time and the captured intensity as properties. There are certain predefined properties available like `String`, `Number`, `Boolean` etc. +or one may define one's own. To create properties: + +```python + +class OceanOpticsSpectrometer(Thing): + """class doc""" + + serial_number = String(default=None, allow_None=True, URL_path='/serial-number', + doc="serial number of the spectrometer to connect/or connected", + http_method=("GET", "PUT")) + # GET and PUT is default for reading and writing the property respectively. + # Use other HTTP methods if necessary. + + integration_time = Number(default=1000, bounds=(0.001, None), crop_to_bounds=True, + URL_path='/integration-time', + doc="integration time of measurement in milliseconds") + + intensity = List(default=None, allow_None=True, + doc="captured intensity", readonly=True, + fget=lambda self: self._intensity) + + def __init__(self, instance_name, serial_number, **kwargs): + super().__init__(instance_name=instance_name, serial_number=serial_number, **kwargs) + +``` + +In non-expert terms, properties look like class attributes however their data containers are instantiated at object instance level by default. 
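In plain Python terms, this instance-level, validated get-set behaviour resembles the descriptor protocol. The following sketch is illustrative only — it is not hololinked's implementation, and the `BoundedNumber` class and its attribute names are invented for the example:

```python
# Illustrative sketch of a validated, bounds-checked attribute using the plain
# Python descriptor protocol. This mimics what a property like `Number` does
# conceptually; it is NOT hololinked's actual implementation.

class BoundedNumber:
    """A descriptor that validates writes and stores the value per instance."""

    def __init__(self, default=0, bounds=(None, None), crop_to_bounds=False):
        self.default = default
        self.bounds = bounds
        self.crop_to_bounds = crop_to_bounds

    def __set_name__(self, owner, name):
        # an internally generated name, like hololinked's data containers
        self.name = f"_{name}_value"

    def __get__(self, instance, owner=None):
        if instance is None:
            return self
        return getattr(instance, self.name, self.default)

    def __set__(self, instance, value):
        if not isinstance(value, (int, float)):
            raise TypeError(f"expected a number, got {type(value).__name__}")
        low, high = self.bounds
        if self.crop_to_bounds:
            if low is not None and value < low:
                value = low
            if high is not None and value > high:
                value = high
        setattr(instance, self.name, value)


class Spectrometer:
    integration_time = BoundedNumber(default=1000, bounds=(0.001, None),
                                     crop_to_bounds=True)

s = Spectrometer()
s.integration_time = -5        # cropped to the lower bound
print(s.integration_time)      # 0.001
```
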
As an example, the `integration_time` property defined above as `Number`, whenever set/written, will be validated as a float or int, cropped to bounds and assigned as an attribute to each instance of the `OceanOpticsSpectrometer` class with an internally generated name. It is not necessary to know this internally generated name, as the property value can be accessed again in any python logic, say, `print(self.integration_time)`.
+
+To overload the get-set (or read-write) of properties, one may do the following:
+
+```python
+class OceanOpticsSpectrometer(Thing):
+
+    integration_time = Number(default=1000, bounds=(0.001, None), crop_to_bounds=True,
+                            URL_path='/integration-time',
+                            doc="integration time of measurement in milliseconds")
+
+    @integration_time.setter # by default called on http PUT method
+    def apply_integration_time(self, value : float):
+        self.device.integration_time_micros(int(value*1000))
+        self._integration_time = int(value)
+
+    @integration_time.getter # by default called on http GET method
+    def get_integration_time(self) -> float:
+        try:
+            return self._integration_time
+        except AttributeError:
+            return self.properties["integration_time"].default
+
+```
+
+In this case, instead of generating a data container with an internal name, the setter method is called when the `integration_time` property is set/written. One might add the hardware device driver logic here (say, supplied by the manufacturer) to apply the property onto the device. In the above example, the lower-level library provides no way to read the value back from the device, so we store it in a variable after applying it and supply that variable back to the getter method. Normally, one would also want the getter to read from the device directly.
+
+Those familiar with Web of Things (WoT) terminology may note that these properties generate the property affordance schema to become accessible by the [node-wot](https://github.com/eclipse-thingweb/node-wot) client.
An example of autogenerated property affordance for `integration_time` is as follows: + +```JSON +"integration_time": { + "title": "integration_time", + "description": "integration time of measurement in milliseconds", + "constant": false, + "readOnly": false, + "writeOnly": false, + "type": "number", + "forms": [{ + "href": "https://example.com/spectrometer/integration-time", + "op": "readproperty", + "htv:methodName": "GET", + "contentType": "application/json" + },{ + "href": "https://example.com/spectrometer/integration-time", + "op": "writeproperty", + "htv:methodName": "PUT", + "contentType": "application/json" + } + ], + "observable": false, + "minimum": 0.001 +}, +``` +The URL path segment `../spectrometer/..` in href field is taken from the `instance_name` which was specified in the `__init__`. This is a mandatory key word argument to the parent class `Thing` to generate a unique name/id for the instance. One should use URI compatible strings. + +#### Specify methods as actions + +decorate with `action` decorator on a python method to claim it as a network accessible method: + +```python + +class OceanOpticsSpectrometer(Thing): -### Already Existing Features + @action(URL_path='/connect', http_method="POST") # POST is default for actions + def connect(self, serial_number = None): + """connect to spectrometer with given serial number""" + if serial_number is not None: + self.serial_number = serial_number + self.device = Spectrometer.from_serial_number(self.serial_number) + self._wavelengths = self.device.wavelengths().tolist() +``` -Support is already present for the following: +Methods that are neither decorated with action decorator nor acting as getters-setters of properties remain as plain python methods and are **not** accessible on the network. 
-
-- decorate HTTP verbs directly on object's methods
-- declare parameters (based on [`param`](https://param.holoviz.org/getting_started.html)) for validated object attributes on the network
-- create named events (based on ZMQ) for asychronous communication with clients. These events are also tunneled as HTTP server-sent events
-- control method execution and parameter(or attribute) write with custom finite state machine
-- use serializer of your choice (except for HTTP) - Serpent, JSON, pickle etc. & extend serialization to suit your requirement (HTTP Server will support only JSON serializer)
-- asyncio compatible - async RPC Server event-loop and async HTTP Server
-- have flexibility in process architecture - run HTTP Server & python object in separate processes or in the same process, combine multiple objects in same server, use multiple
-  HTTP servers for same object etc..
+
+In WoT Terminology, again, such a method becomes specified as an action affordance:
+
+```JSON
+"connect": {
+    "title": "connect",
+    "description": "connect to spectrometer with given serial number",
+    "forms": [
+        {
+            "href": "https://example.com/spectrometer/connect",
+            "op": "invokeaction",
+            "htv:methodName": "POST",
+            "contentType": "application/json"
+        }
+    ],
+    "safe": true,
+    "idempotent": false,
+    "synchronous": true
+},
+
+```
+
+#### Defining and pushing events
+
+create a named event using an `Event` object that can push any arbitrary data:
+
+```python
+
+    def __init__(self, instance_name, serial_number, **kwargs):
+        super().__init__(instance_name=instance_name, serial_number=serial_number, **kwargs)
+        self.measurement_event = Event(name='intensity-measurement',
+                URL_path='/intensity/measurement-event')   # only GET HTTP method possible for events
+
+    def capture(self):   # not an action, but a plain python method
+        self._run = True
+        while self._run:
+            self._intensity = self.device.intensities(
+                correct_dark_counts=True,
+                correct_nonlinearity=True
+            )
self.measurement_event.push(self._intensity.tolist()) + + @action(URL_path='/acquisition/start', http_method="POST") + def start_acquisition(self): + self._acquisition_thread = threading.Thread(target=self.capture) + self._acquisition_thread.start() + + @action(URL_path='/acquisition/stop', http_method="POST") + def stop_acquisition(self): + self._run = False +``` + +In WoT Terminology, such an event becomes specified as an event affordance with subprotocol SSE: + +```JSON +"intensity_measurement_event": { + "forms": [ + { + "href": "https://example.com/spectrometer/intensity/measurement-event", + "subprotocol": "sse", + "op": "subscribeevent", + "htv:methodName": "GET", + "contentType": "text/event-stream" + } + ] +} + +``` + +Although the code is the very familiar & age-old RPC server style, one can directly specify HTTP methods and URL path for each property, action and event. A configurable HTTP Server is already available (from `hololinked.server.HTTPServer`) which redirects HTTP requests to the object according to the specified HTTP API on the properties, actions and events. To plug in a HTTP server: + +```python +import ssl, os, logging +from multiprocessing import Process +from hololinked.server import HTTPServer + +def start_https_server(): + # You need to create a certificate on your own + ssl_context = ssl.SSLContext(protocol=ssl.PROTOCOL_TLS) + ssl_context.load_cert_chain(f'assets{os.sep}security{os.sep}certificate.pem', + keyfile = f'assets{os.sep}security{os.sep}key.pem') + + HTTPServer(['spectrometer'], port=8083, ssl_context=ssl_context, + log_level=logging.DEBUG).listen() + + +if __name__ == "__main__": + + Process(target=start_https_server).start() + + OceanOpticsSpectrometer( + instance_name='spectrometer', + serial_number='S14155', + log_level=logging.DEBUG + ).run() + +``` +Here one can see the use of `instance_name` and why it turns up in the URL path. 
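For orientation, here is how a plain HTTP client could address the endpoints declared above. This is an illustrative stdlib sketch: the host, port and paths merely mirror the `HTTPServer` example (port 8083, TLS, `instance_name` 'spectrometer') and are assumptions for your deployment; no request is actually sent here.

```python
import json
from urllib import request

# Base URL assumed from the HTTPServer example above; adjust to your deployment.
BASE = "https://localhost:8083/spectrometer"

def read_property(name: str) -> request.Request:
    # properties are read with GET by default
    return request.Request(f"{BASE}/{name}", method="GET")

def write_property(name: str, value) -> request.Request:
    # properties are written with PUT by default
    return request.Request(f"{BASE}/{name}",
                           data=json.dumps(value).encode(),
                           headers={"Content-Type": "application/json"},
                           method="PUT")

def invoke_action(name: str, **payload) -> request.Request:
    # actions are invoked with POST by default
    return request.Request(f"{BASE}/{name}",
                           data=json.dumps(payload).encode(),
                           headers={"Content-Type": "application/json"},
                           method="POST")

# request.urlopen(write_property("integration-time", 100)) would perform the call
# against a running server with a valid certificate.
```
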
+ +The intention behind specifying HTTP URL paths and methods directly on object's members is to +- eliminate the need to implement a detailed HTTP server (& its API) which generally poses problems in queueing commands issued to instruments +- or, write an additional boiler-plate HTTP to RPC bridge +- or, find a reasonable HTTP-RPC implementation which supports all three of properties, actions and events, yet appeals deeply to the object oriented python world. + +See a list of currently supported features [below](#currently-supported).

+Ultimately, as expected, the redirection from the HTTP side to the object is mediated by ZeroMQ, which implements the fully fledged RPC server that queues all HTTP requests and executes them one-by-one on the hardware/object. The HTTP server can also communicate with the RPC server over ZeroMQ's INPROC (for the non-expert: multithreaded applications, at least in python) or IPC (for the non-expert: multiprocess applications) transport methods. In the example above, IPC is used by default. There is no need for yet another TCP hop from the HTTP server to the ZeroMQ RPC server, although TCP transport is also supported.
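The one-by-one execution just described can be sketched with a plain queue and a worker thread. This is illustrative only — hololinked mediates this over ZeroMQ sockets, not the code below:

```python
import threading
import queue

# Requests from any number of clients/transports land in one queue and are
# executed serially, so the hardware never sees concurrent commands.
job_queue = queue.Queue()
results = []

def executor():
    # stand-in for the RPC server's event loop executing on the object
    while True:
        job = job_queue.get()
        if job is None:          # sentinel to stop the worker
            break
        results.append(job())    # one-by-one, in arrival order
        job_queue.task_done()

worker = threading.Thread(target=executor)
worker.start()

# simulate e.g. several HTTP handlers enqueueing instrument commands at once
for i in range(3):
    job_queue.put(lambda i=i: f"command-{i}")
job_queue.put(None)
worker.join()

print(results)   # ['command-0', 'command-1', 'command-2']
```
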

+Serialization-deserialization overheads are also kept low. For example, when an event is pushed from the object (and automatically tunneled as an HTTP SSE) or a reply is returned for an action, there is no JSON deserialization-serialization overhead as the message passes through the HTTP server. The message is serialized once on the object side and passes transparently through the HTTP server.
+
+One may use the HTTP API as one prefers (including letting the package auto-generate it), although it is mainly intended for web development and cross-platform clients like the interoperable [node-wot](https://github.com/eclipse-thingweb/node-wot) client. The node-wot client is the recommended Javascript client for this package, as one can seamlessly plug code developed with this package into the rest of the IoT tools, protocols & standardizations, or do scripting in the browser or nodeJS. Please check the node-wot docs on how to consume [Thing Descriptions](https://www.w3.org/TR/wot-thing-description11) to call actions, read & write properties or subscribe to events. A Thing Description will be automatically generated if absent, as shown in the JSON examples above, or can be supplied manually.
+To know more about client side scripting, please look into the [How-To](https://hololinked.readthedocs.io/en/latest/howto/index.html) section of the documentation.
+
+##### NOTE - The package is under active development. Contributors welcome.
+
+- [example repository](https://github.com/VigneshVSV/hololinked-examples) - detailed examples for both clients and servers
+- [helper GUI](https://github.com/VigneshVSV/hololinked-portal) - view & interact with your object's methods, properties and events.
+
+### Currently Supported
+
+- indicate HTTP verb & URL path directly on object's methods, properties and events.
+- control method execution and property write with a custom finite state machine.
+
+- database (Postgres, MySQL, SQLite - based on SQLAlchemy) support for storing and loading properties when the object dies and restarts.
+- auto-generate Thing Description for Web of Things applications (inaccurate, continuously developed but usable).
+- use serializer of your choice (except for HTTP) - MessagePack, JSON, pickle etc. & extend serialization to suit your requirement. The HTTP server will support only the JSON serializer to maintain compatibility with node-wot. The default is a JSON serializer based on msgspec.
+- asyncio compatible - async RPC server event-loop and async HTTP server - write methods in async
+- have flexibility in process architecture - run the HTTP server & python object in separate processes or in the same process, serve multiple objects with the same HTTP server etc.
+- choose from multiple ZeroMQ transport methods. Again, please check examples or the code for explanations. Documentation is being actively improved.

-### To Install

+### Currently being worked on

-clone the repository and install in develop mode `pip install -e .` for convenience. The conda env hololinked.yml can also help.
-There is no release to pip available right now.

+- improving accuracy of Thing Descriptions
+- observable properties and read/write multiple properties through node-wot client
+- argument schema validation for actions (you can specify one but it's not used yet)
+- credentials for authentication

-### In Development

+### Some Day In Future

-- Object Proxy (Plain TCP Client- i.e. the RPC part is not fully developed yet)
-- Multiple ZMQ Protocols
+- mongo DB support for DB operations
- HTTP 2.0
-- Database support for storing and loading parameters (based on SQLAlchemy) when object dies and restarts

### Contact

-Contributors welcome for all my projects related to hololinked including web apps. Please write to my contact email available at [website](https://hololinked.dev).
+Contributors welcome for all my projects related to hololinked including web apps. 
Please write to my contact email available at my [website](https://hololinked.dev). The contributing file is currently only boilerplate and need not be adhered. + + + + + + + diff --git a/doc/Makefile b/doc/Makefile index 722ff77..54c86eb 100644 --- a/doc/Makefile +++ b/doc/Makefile @@ -20,9 +20,6 @@ clean: del * /s /f /q cd .. -cleanbuild: - cd .. - .PHONY: help Makefile # Catch-all target: route all unknown targets to Sphinx using the new diff --git a/doc/make.bat b/doc/make.bat index dc1312a..2949e3b 100644 --- a/doc/make.bat +++ b/doc/make.bat @@ -9,6 +9,7 @@ if "%SPHINXBUILD%" == "" ( ) set SOURCEDIR=source set BUILDDIR=build +set DOC_ADDRESS=http://localhost:8000 %SPHINXBUILD% >NUL 2>NUL if errorlevel 9009 ( @@ -25,11 +26,33 @@ if errorlevel 9009 ( if "%1" == "" goto help +if "%1" == "server" goto server + +if "%1" == "open-in-chrome" goto open-in-chrome + +if "%1" == "host-doc" goto host-doc + %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% goto end :help %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% +goto end + +:server +echo server is hosted at %DOC_ADDRESS%, change port directory in make file if necessary +python -m http.server --directory build\html +goto end + +:open-doc-in-browser +start explorer %DOC_ADDRESS% +goto end + +:host-doc +echo server is hosted at %DOC_ADDRESS%, change port directory in make file if necessary +start explorer %DOC_ADDRESS% +python -m http.server --directory build\html +goto end :end popd diff --git a/doc/source/_static/architecture.drawio.dark.svg b/doc/source/_static/architecture.drawio.dark.svg new file mode 100644 index 0000000..f198eb8 --- /dev/null +++ b/doc/source/_static/architecture.drawio.dark.svg @@ -0,0 +1,4 @@ + + + +

HTTP Clients
(Web browsers)

HTTP Clients...

Local Desktop

Clients & Scripts

(RPC Clients)

Local Desktop...

Clients & Scripts 

on the network

(RPC Clients)

Clients & Scripts...
HTTP Server
HTTP Server
INPROC Server
INPROC Server
IPC Server
IPC Server

TCP Server

TCP Server
RPC Server
RPC Server
INPROC Server
Eventloop and remote object executor
INPROC Server...
Common INPROC Client
Common INPR...
Recommended Communication
Recommende...
Not Recommended Communication
Not Recommended Com...
Developer should decide
Developer should...
\ No newline at end of file diff --git a/doc/source/_static/architecture.drawio.light.svg b/doc/source/_static/architecture.drawio.light.svg new file mode 100644 index 0000000..4b0b2ed --- /dev/null +++ b/doc/source/_static/architecture.drawio.light.svg @@ -0,0 +1,4 @@ + + + +

HTTP Clients
(Web browsers)

HTTP Clients...

Local Desktop

Clients & Scripts

(RPC Clients)

Local Desktop...

Clients & Scripts 

on the network

(RPC Clients)

Clients & Scripts...
HTTP Server
HTTP Server
INPROC Server
INPROC Server
IPC Server
IPC Server

TCP Server

TCP Server
RPC Server
RPC Server
INPROC Server
Eventloop and remote object executor
INPROC Server...
Common INPROC Client
Common INPR...
Recommended Communication
Recommende...
Not Recommended Communication
Not Recommended Com...
Developer should decide
Developer should...
\ No newline at end of file diff --git a/doc/source/_static/type-definitions-withoutIDL.png b/doc/source/_static/type-definitions-withoutIDL.png new file mode 100644 index 0000000..f0c1e1b Binary files /dev/null and b/doc/source/_static/type-definitions-withoutIDL.png differ diff --git a/doc/source/autodoc/client/index.rst b/doc/source/autodoc/client/index.rst new file mode 100644 index 0000000..225cb94 --- /dev/null +++ b/doc/source/autodoc/client/index.rst @@ -0,0 +1,6 @@ +``ObjectProxy`` +=============== + +.. autoclass:: hololinked.client.ObjectProxy + :members: + :show-inheritance: diff --git a/doc/source/autodoc/index.rst b/doc/source/autodoc/index.rst index 7b97343..f96f0db 100644 --- a/doc/source/autodoc/index.rst +++ b/doc/source/autodoc/index.rst @@ -1,10 +1,50 @@ .. _apiref: +.. |br| raw:: html + +
+ API Reference ============= +``hololinked.server`` +--------------------- + .. toctree:: - :maxdepth: 2 + :maxdepth: 1 + + server/thing/index + server/properties/index + server/action + server/events + server/http_server/index + server/serializers + server/database/index + server/eventloop + server/zmq_message_brokers/index + server/configuration + server/dataclasses + server/enumerations + + + +``hololinked.client`` +--------------------- + +.. toctree:: + :maxdepth: 1 + + client/index + + +.. ``hololinked.system_host`` +.. -------------------------- + +.. .. toctree:: +.. :maxdepth: 1 + +.. server/system_host/index + + + - server/server - diff --git a/doc/source/autodoc/server/action.rst b/doc/source/autodoc/server/action.rst new file mode 100644 index 0000000..0bc9acb --- /dev/null +++ b/doc/source/autodoc/server/action.rst @@ -0,0 +1,4 @@ +actions +======= + +.. autofunction:: hololinked.server.action.action \ No newline at end of file diff --git a/doc/source/autodoc/server/configuration.rst b/doc/source/autodoc/server/configuration.rst new file mode 100644 index 0000000..b4e086c --- /dev/null +++ b/doc/source/autodoc/server/configuration.rst @@ -0,0 +1,6 @@ +``Configuration`` +================= + +.. autoclass:: hololinked.server.config.Configuration + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/data_classes.rst b/doc/source/autodoc/server/data_classes.rst deleted file mode 100644 index 602cf0a..0000000 --- a/doc/source/autodoc/server/data_classes.rst +++ /dev/null @@ -1,6 +0,0 @@ -hololinked.server.data_classes --------------------------- - -.. automodule:: hololinked.server.data_classes - :members: - :show-inheritance: diff --git a/doc/source/autodoc/server/database/baseDB.rst b/doc/source/autodoc/server/database/baseDB.rst new file mode 100644 index 0000000..c8586fb --- /dev/null +++ b/doc/source/autodoc/server/database/baseDB.rst @@ -0,0 +1,15 @@ +BaseDB +------ + +.. 
autoclass:: hololinked.server.database.BaseDB + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.database.BaseSyncDB + :members: + :show-inheritance: + +.. .. autoclass:: hololinked.server.database.BaseAsyncDB +.. :members: +.. :show-inheritance: + diff --git a/doc/source/autodoc/server/database/helpers.rst b/doc/source/autodoc/server/database/helpers.rst new file mode 100644 index 0000000..a03caf0 --- /dev/null +++ b/doc/source/autodoc/server/database/helpers.rst @@ -0,0 +1,6 @@ +helpers +------- + +.. autoclass:: hololinked.server.database.batch_db_commit + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/database/index.rst b/doc/source/autodoc/server/database/index.rst new file mode 100644 index 0000000..9eaeca7 --- /dev/null +++ b/doc/source/autodoc/server/database/index.rst @@ -0,0 +1,13 @@ +database +======== + +.. autoclass:: hololinked.server.database.ThingDB + :members: + :show-inheritance: + +.. toctree:: + :hidden: + :maxdepth: 1 + + helpers + baseDB \ No newline at end of file diff --git a/doc/source/autodoc/server/dataclasses.rst b/doc/source/autodoc/server/dataclasses.rst new file mode 100644 index 0000000..67157f6 --- /dev/null +++ b/doc/source/autodoc/server/dataclasses.rst @@ -0,0 +1,42 @@ +.. _apiref: + +.. |br| raw:: html + +
+ + +data classes +============ + +The following is a list of all dataclasses used to store information about the resources +exposed on the network. These classes are generally not meant for consumption by the end user of the package. + + +.. autoclass:: hololinked.server.data_classes.RemoteResourceInfoValidator + :members: to_dataclass + :show-inheritance: + +|br| + +.. autoclass:: hololinked.server.data_classes.RemoteResource + :members: + :show-inheritance: + +|br| + +.. autoclass:: hololinked.server.data_classes.HTTPResource + :members: + :show-inheritance: + +|br| + +.. autoclass:: hololinked.server.data_classes.RPCResource + :members: + :show-inheritance: + +|br| + +.. autoclass:: hololinked.server.data_classes.ServerSentEvent + :members: + :show-inheritance: + diff --git a/doc/source/autodoc/server/decorators.rst b/doc/source/autodoc/server/decorators.rst deleted file mode 100644 index 0b6d33c..0000000 --- a/doc/source/autodoc/server/decorators.rst +++ /dev/null @@ -1,6 +0,0 @@ -hololinked.server.decorators ------------------------ - -.. automodule:: hololinked.server.decorators - :members: - :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/enumerations.rst b/doc/source/autodoc/server/enumerations.rst new file mode 100644 index 0000000..960664f --- /dev/null +++ b/doc/source/autodoc/server/enumerations.rst @@ -0,0 +1,12 @@ +enumerations +------------ + +.. autoclass:: hololinked.server.constants.HTTP_METHODS() + +.. autoclass:: hololinked.server.constants.Serializers() + +.. autoclass:: hololinked.server.constants.ZMQ_PROTOCOLS() + + + + diff --git a/doc/source/autodoc/server/eventloop.rst b/doc/source/autodoc/server/eventloop.rst new file mode 100644 index 0000000..88a5023 --- /dev/null +++ b/doc/source/autodoc/server/eventloop.rst @@ -0,0 +1,14 @@ +``EventLoop`` +============= + +.. autoclass:: hololinked.server.eventloop.EventLoop() + :members: instance_name, things + :show-inheritance: + +..
automethod:: hololinked.server.eventloop.EventLoop.__init__ + +.. automethod:: hololinked.server.eventloop.EventLoop.run + +.. automethod:: hololinked.server.eventloop.EventLoop.exit + +.. automethod:: hololinked.server.eventloop.EventLoop.get_async_loop \ No newline at end of file diff --git a/doc/source/autodoc/server/events.rst b/doc/source/autodoc/server/events.rst new file mode 100644 index 0000000..0a013c7 --- /dev/null +++ b/doc/source/autodoc/server/events.rst @@ -0,0 +1,6 @@ +events +====== + +.. autoclass:: hololinked.server.events.Event + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/http_server/base_handler.rst b/doc/source/autodoc/server/http_server/base_handler.rst new file mode 100644 index 0000000..022c501 --- /dev/null +++ b/doc/source/autodoc/server/http_server/base_handler.rst @@ -0,0 +1,6 @@ +``BaseHandler`` +=============== + +.. autoclass:: hololinked.server.handlers.BaseHandler + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/http_server/event_handler.rst b/doc/source/autodoc/server/http_server/event_handler.rst new file mode 100644 index 0000000..259eb9b --- /dev/null +++ b/doc/source/autodoc/server/http_server/event_handler.rst @@ -0,0 +1,6 @@ +``EventHandler`` +================ + +.. autoclass:: hololinked.server.handlers.EventHandler + :members: + :show-inheritance: diff --git a/doc/source/autodoc/server/http_server/index.rst b/doc/source/autodoc/server/http_server/index.rst new file mode 100644 index 0000000..12016d3 --- /dev/null +++ b/doc/source/autodoc/server/http_server/index.rst @@ -0,0 +1,15 @@ +``HTTPServer`` +============== + +.. autoclass:: hololinked.server.HTTPServer.HTTPServer + :members: + :show-inheritance: + + +.. 
toctree:: + :maxdepth: 1 + :hidden: + + rpc_handler + event_handler + base_handler \ No newline at end of file diff --git a/doc/source/autodoc/server/http_server/rpc_handler.rst b/doc/source/autodoc/server/http_server/rpc_handler.rst new file mode 100644 index 0000000..376e630 --- /dev/null +++ b/doc/source/autodoc/server/http_server/rpc_handler.rst @@ -0,0 +1,6 @@ +``RPCHandler`` +============== + +.. autoclass:: hololinked.server.handlers.RPCHandler + :members: + :show-inheritance: diff --git a/doc/source/autodoc/server/properties/helpers.rst b/doc/source/autodoc/server/properties/helpers.rst new file mode 100644 index 0000000..d3c7b82 --- /dev/null +++ b/doc/source/autodoc/server/properties/helpers.rst @@ -0,0 +1,4 @@ +helpers +======= + +.. autofunction:: hololinked.param.parameterized.depends_on \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/index.rst b/doc/source/autodoc/server/properties/index.rst new file mode 100644 index 0000000..3068ed0 --- /dev/null +++ b/doc/source/autodoc/server/properties/index.rst @@ -0,0 +1,30 @@ +properties +========== + +.. toctree:: + :hidden: + :maxdepth: 1 + + types/index + parameterized + helpers + +.. autoclass:: hololinked.server.property.Property() + :members: + :show-inheritance: + +.. automethod:: hololinked.server.property.Property.validate_and_adapt + +.. automethod:: hololinked.server.property.Property.getter + +.. automethod:: hololinked.server.property.Property.setter + +.. automethod:: hololinked.server.property.Property.deleter + + +A few notes: + +* The default value of a ``Property`` (the first constructor argument) is owned by the ``Property`` instance and not by the object to which the property is attached. +* The value of a constant property can still be changed in code by temporarily overriding its ``constant`` attribute or by using the ``edit_constant`` context manager.
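The note above about defaults living on the property object rather than the owner can be illustrated with a plain-Python descriptor sketch. This is an illustration only, not hololinked's actual ``Property`` implementation; the class names and the ``edit_constant`` helper below are hypothetical stand-ins for the documented behaviour:

```python
from contextlib import contextmanager

class SimpleProperty:
    # A plain-Python descriptor sketch (NOT hololinked's implementation):
    # the default value lives on the descriptor instance and is shared by all
    # owner instances; per-instance values are stored on the owning object.
    def __init__(self, default=None, constant=False):
        self.default = default
        self.constant = constant
        self.name = None

    def __set_name__(self, owner, name):
        self.name = name

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return obj.__dict__.get(self.name, self.default)

    def __set__(self, obj, value):
        # a constant may be written once, then it is locked
        if self.constant and self.name in obj.__dict__:
            raise AttributeError(f"{self.name} is a constant")
        obj.__dict__[self.name] = value

@contextmanager
def edit_constant(prop):
    # temporarily lift the constant flag, in the spirit of an
    # ``edit_constant`` context manager
    saved, prop.constant = prop.constant, False
    try:
        yield prop
    finally:
        prop.constant = saved

class Spectrometer:
    serial_number = SimpleProperty(default="unknown", constant=True)

device = Spectrometer()
print(device.serial_number)           # "unknown" - from the descriptor's default
device.serial_number = "USB2000+"     # first write succeeds
with edit_constant(Spectrometer.__dict__["serial_number"]):
    device.serial_number = "USB4000"  # allowed only inside the context
```

Note how a second ``Spectrometer()`` instance still reads ``"unknown"``: the default is never copied onto owners, only overridden per instance.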
+ + diff --git a/doc/source/autodoc/server/properties/parameterized.rst b/doc/source/autodoc/server/properties/parameterized.rst new file mode 100644 index 0000000..a722c8c --- /dev/null +++ b/doc/source/autodoc/server/properties/parameterized.rst @@ -0,0 +1,6 @@ +``Parameterized`` +================= + +.. autoclass:: hololinked.param.parameterized.Parameterized + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/types/boolean.rst b/doc/source/autodoc/server/properties/types/boolean.rst new file mode 100644 index 0000000..afd4c51 --- /dev/null +++ b/doc/source/autodoc/server/properties/types/boolean.rst @@ -0,0 +1,6 @@ +``Boolean`` +=========== + +.. autoclass:: hololinked.server.properties.Boolean + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/types/class_selector.rst b/doc/source/autodoc/server/properties/types/class_selector.rst new file mode 100644 index 0000000..d5cf60f --- /dev/null +++ b/doc/source/autodoc/server/properties/types/class_selector.rst @@ -0,0 +1,7 @@ +``ClassSelector`` +================= + + +.. autoclass:: hololinked.server.properties.ClassSelector + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/types/file_system.rst b/doc/source/autodoc/server/properties/types/file_system.rst new file mode 100644 index 0000000..8b1b325 --- /dev/null +++ b/doc/source/autodoc/server/properties/types/file_system.rst @@ -0,0 +1,18 @@ +``Filename``, ``Foldername``, ``Path``, ``FileSelector`` +======================================================== + +.. autoclass:: hololinked.server.properties.Filename + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.properties.Foldername + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.properties.Path + :members: + :show-inheritance: + +.. 
autoclass:: hololinked.server.properties.FileSelector + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/types/index.rst b/doc/source/autodoc/server/properties/types/index.rst new file mode 100644 index 0000000..76e00a3 --- /dev/null +++ b/doc/source/autodoc/server/properties/types/index.rst @@ -0,0 +1,18 @@ +builtin-types +------------- + +These are the predefined parameter types. Others can be custom-defined by inheriting from ``Property`` base class +or any of the following classes. + +.. toctree:: + :maxdepth: 1 + + string + number + boolean + iterables + selector + class_selector + file_system + typed_list + typed_dict \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/types/iterables.rst b/doc/source/autodoc/server/properties/types/iterables.rst new file mode 100644 index 0000000..6a153e5 --- /dev/null +++ b/doc/source/autodoc/server/properties/types/iterables.rst @@ -0,0 +1,10 @@ +``List``, ``Tuple`` +=================== + +.. autoclass:: hololinked.server.properties.List + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.properties.Tuple + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/types/number.rst b/doc/source/autodoc/server/properties/types/number.rst new file mode 100644 index 0000000..5b27f88 --- /dev/null +++ b/doc/source/autodoc/server/properties/types/number.rst @@ -0,0 +1,10 @@ +``Number``, ``Integer`` +======================= + +.. autoclass:: hololinked.server.properties.Number + :members: + :show-inheritance: + +.. 
autoclass:: hololinked.server.properties.Integer + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/types/selector.rst b/doc/source/autodoc/server/properties/types/selector.rst new file mode 100644 index 0000000..f1636c3 --- /dev/null +++ b/doc/source/autodoc/server/properties/types/selector.rst @@ -0,0 +1,10 @@ +``Selector``, ``TupleSelector`` +=============================== + +.. autoclass:: hololinked.server.properties.Selector + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.properties.TupleSelector + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/types/string.rst b/doc/source/autodoc/server/properties/types/string.rst new file mode 100644 index 0000000..3337118 --- /dev/null +++ b/doc/source/autodoc/server/properties/types/string.rst @@ -0,0 +1,6 @@ +``String`` +========== + +.. autoclass:: hololinked.server.properties.String + :members: + :show-inheritance: diff --git a/doc/source/autodoc/server/properties/types/typed_dict.rst b/doc/source/autodoc/server/properties/types/typed_dict.rst new file mode 100644 index 0000000..bdfd288 --- /dev/null +++ b/doc/source/autodoc/server/properties/types/typed_dict.rst @@ -0,0 +1,6 @@ +``TypedDict`` +============= + +.. autoclass:: hololinked.server.properties.TypedDict + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/properties/types/typed_list.rst b/doc/source/autodoc/server/properties/types/typed_list.rst new file mode 100644 index 0000000..b808ff6 --- /dev/null +++ b/doc/source/autodoc/server/properties/types/typed_list.rst @@ -0,0 +1,6 @@ +``TypedList`` +============= + +.. 
autoclass:: hololinked.server.properties.TypedList + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/remote_object.rst b/doc/source/autodoc/server/remote_object.rst deleted file mode 100644 index 3c43d80..0000000 --- a/doc/source/autodoc/server/remote_object.rst +++ /dev/null @@ -1,6 +0,0 @@ -hololinked.server.remote_object --------------------------- - -.. automodule:: hololinked.server.remote_object - :members: - :show-inheritance: diff --git a/doc/source/autodoc/server/schema.rst b/doc/source/autodoc/server/schema.rst new file mode 100644 index 0000000..76d67eb --- /dev/null +++ b/doc/source/autodoc/server/schema.rst @@ -0,0 +1,41 @@ +Thing Description Schema +------------------------ + +.. autoclass:: hololinked.server.td.ThingDescription + :members: + :show-inheritance: + + +.. collapse:: Schema + + .. autoclass:: hololinked.server.td.Schema + :members: + :show-inheritance: + + +.. collapse:: InteractionAffordance + + .. autoclass:: hololinked.server.td.InteractionAffordance + :members: + :show-inheritance: + + +.. collapse:: PropertyAffordance + + .. autoclass:: hololinked.server.td.PropertyAffordance + :members: + :show-inheritance: + + +.. collapse:: ActionAffordance + + .. autoclass:: hololinked.server.td.ActionAffordance + :members: + :show-inheritance: + + +.. collapse:: EventAffordance + + .. autoclass:: hololinked.server.td.EventAffordance + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/serializers.rst b/doc/source/autodoc/server/serializers.rst new file mode 100644 index 0000000..255c379 --- /dev/null +++ b/doc/source/autodoc/server/serializers.rst @@ -0,0 +1,36 @@ +serializers +----------- + +.. autoclass:: hololinked.server.serializers.BaseSerializer + :members: + :show-inheritance: + +.. collapse:: JSONSerializer + + .. autoclass:: hololinked.server.serializers.JSONSerializer + :members: + :show-inheritance: + +.. 
collapse:: MsgpackSerializer + + .. autoclass:: hololinked.server.serializers.MsgpackSerializer + :members: + :show-inheritance: + +.. collapse:: SerpentSerializer + + .. autoclass:: hololinked.server.serializers.SerpentSerializer + :members: + :show-inheritance: + +.. collapse:: PickleSerializer + + .. autoclass:: hololinked.server.serializers.PickleSerializer + :members: + :show-inheritance: + +.. collapse:: PythonBuiltinJSONSerializer + + .. autoclass:: hololinked.server.serializers.PythonBuiltinJSONSerializer + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/server.rst b/doc/source/autodoc/server/server.rst deleted file mode 100644 index 785a6b7..0000000 --- a/doc/source/autodoc/server/server.rst +++ /dev/null @@ -1,8 +0,0 @@ -hololinked.server -============ - -.. toctree:: - - remote_object - decorators - data_classes diff --git a/doc/source/autodoc/server/system_host/index.rst b/doc/source/autodoc/server/system_host/index.rst new file mode 100644 index 0000000..e651655 --- /dev/null +++ b/doc/source/autodoc/server/system_host/index.rst @@ -0,0 +1,9 @@ +SystemHost +========== + +.. autofunction:: hololinked.system_host.create_system_host + +.. autoclass:: hololinked.system_host.handlers.SystemHostHandler + :members: + :show-inheritance: + diff --git a/doc/source/autodoc/server/thing/index.rst b/doc/source/autodoc/server/thing/index.rst new file mode 100644 index 0000000..d48a0c3 --- /dev/null +++ b/doc/source/autodoc/server/thing/index.rst @@ -0,0 +1,41 @@ +``Thing`` +========= + +.. autoclass:: hololinked.server.thing.Thing() + :members: instance_name, logger, state, rpc_serializer, json_serializer, + event_publisher, + :show-inheritance: + +.. automethod:: hololinked.server.thing.Thing.__init__ + +.. attribute:: Thing.logger_remote_access + :type: Optional[bool] + + set False to prevent access of logs of logger remotely + +.. 
attribute:: Thing.state_machine + :type: Optional[hololinked.server.state_machine.StateMachine] + + initialize state machine for controlling method/action execution and property writes + +.. attribute:: Thing.use_default_db + :type: Optional[bool] + + set True to create a default SQLite database. Mainly used for storing properties and autoloading them when the object + dies and restarts. + +.. automethod:: hololinked.server.thing.Thing.get_thing_description + +.. automethod:: hololinked.server.thing.Thing.run_with_http_server + +.. automethod:: hololinked.server.thing.Thing.run + +.. automethod:: hololinked.server.thing.Thing.exit + +.. toctree:: + :maxdepth: 1 + :hidden: + + state_machine + network_handler + thing_meta \ No newline at end of file diff --git a/doc/source/autodoc/server/thing/network_handler.rst b/doc/source/autodoc/server/thing/network_handler.rst new file mode 100644 index 0000000..22297fb --- /dev/null +++ b/doc/source/autodoc/server/thing/network_handler.rst @@ -0,0 +1,27 @@ +``RemoteAccessHandler`` +======================= + +.. autoclass:: hololinked.server.logger.RemoteAccessHandler() + :show-inheritance: + +.. automethod:: hololinked.server.logger.RemoteAccessHandler.__init__ + +.. autoattribute:: hololinked.server.logger.RemoteAccessHandler.stream_interval + +.. autoattribute:: hololinked.server.logger.RemoteAccessHandler.debug_logs + +.. autoattribute:: hololinked.server.logger.RemoteAccessHandler.warn_logs + +.. autoattribute:: hololinked.server.logger.RemoteAccessHandler.info_logs + +.. autoattribute:: hololinked.server.logger.RemoteAccessHandler.error_logs + +.. autoattribute:: hololinked.server.logger.RemoteAccessHandler.critical_logs + +.. autoattribute:: hololinked.server.logger.RemoteAccessHandler.execution_logs + +.. autoattribute:: hololinked.server.logger.RemoteAccessHandler.maxlen + +.. automethod:: hololinked.server.logger.RemoteAccessHandler.push_events + +.. 
automethod:: hololinked.server.logger.RemoteAccessHandler.stop_events \ No newline at end of file diff --git a/doc/source/autodoc/server/thing/state_machine.rst b/doc/source/autodoc/server/thing/state_machine.rst new file mode 100644 index 0000000..88d17a9 --- /dev/null +++ b/doc/source/autodoc/server/thing/state_machine.rst @@ -0,0 +1,19 @@ +``StateMachine`` +================ + +.. autoclass:: hololinked.server.state_machine.StateMachine() + :members: valid, on_enter, on_exit, states, initial_state, machine, current_state + :show-inheritance: + +.. automethod:: hololinked.server.state_machine.StateMachine.__init__ + +.. automethod:: hololinked.server.state_machine.StateMachine.set_state + +.. automethod:: hololinked.server.state_machine.StateMachine.get_state + +.. automethod:: hololinked.server.state_machine.StateMachine.has_object + +.. note:: + Whether a certain method may be executed or a property written in a given + state is checked by the ``EventLoop`` class, not by this class. This class only + provides the information and handles the set-state logic. diff --git a/doc/source/autodoc/server/thing/thing_meta.rst b/doc/source/autodoc/server/thing/thing_meta.rst new file mode 100644 index 0000000..effddd9 --- /dev/null +++ b/doc/source/autodoc/server/thing/thing_meta.rst @@ -0,0 +1,7 @@ +``ThingMeta`` +============= + +.. autoclass:: hololinked.server.thing.ThingMeta() + :members: properties + :show-inheritance: + diff --git a/doc/source/autodoc/server/zmq_message_brokers/base_zmq.rst b/doc/source/autodoc/server/zmq_message_brokers/base_zmq.rst new file mode 100644 index 0000000..2656d27 --- /dev/null +++ b/doc/source/autodoc/server/zmq_message_brokers/base_zmq.rst @@ -0,0 +1,31 @@ +.. |br| raw:: html + +<br>
+ + +Base ZMQ +======== + +.. autoclass:: hololinked.server.zmq_message_brokers.BaseZMQ + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.BaseSyncZMQ + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.BaseAsyncZMQ + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.BaseZMQServer + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.BaseAsyncZMQServer + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.BaseZMQClient + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/zmq_message_brokers/event.rst b/doc/source/autodoc/server/zmq_message_brokers/event.rst new file mode 100644 index 0000000..fcdf71d --- /dev/null +++ b/doc/source/autodoc/server/zmq_message_brokers/event.rst @@ -0,0 +1,14 @@ +Event Handlers +============== + +.. autoclass:: hololinked.server.zmq_message_brokers.EventPublisher + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.EventConsumer + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.AsyncEventConsumer + :members: + :show-inheritance: \ No newline at end of file diff --git a/doc/source/autodoc/server/zmq_message_brokers/index.rst b/doc/source/autodoc/server/zmq_message_brokers/index.rst new file mode 100644 index 0000000..1e01337 --- /dev/null +++ b/doc/source/autodoc/server/zmq_message_brokers/index.rst @@ -0,0 +1,33 @@ +ZMQ message brokers +=================== + +``hololinked`` uses ZMQ under the hood to implement an RPC server. All requests, whether coming through an HTTP +server (from an HTTP client/web browser) or from an RPC client, are routed via the RPC server, which queues them before execution.
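The queueing behaviour described above can be sketched in plain Python (stdlib only; this is a conceptual illustration, not hololinked's actual broker code): requests arriving from any transport land in a single queue that one worker drains in order, so execution is strictly serial.

```python
import threading
import queue

request_queue = queue.Queue()
results = []

def rpc_worker():
    # a single worker drains the queue, so execution is serial
    # regardless of which transport a request arrived on
    while True:
        origin, operation = request_queue.get()
        if operation is None:   # sentinel to stop the worker
            break
        results.append((origin, operation()))

worker = threading.Thread(target=rpc_worker)
worker.start()

# "clients" on different transports submit concurrently; the worker still
# executes one request at a time, in arrival order
request_queue.put(("http", lambda: "read serial_number"))
request_queue.put(("ipc", lambda: "invoke connect"))
request_queue.put((None, None))
worker.join()
print(results)   # [('http', 'read serial_number'), ('ipc', 'invoke connect')]
```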
+ +Since an RPC client is available in ``hololinked`` (or will be made available), it is suggested to use the HTTP +server for web development practices (like REST-similar endpoints) and not for RPC purposes. The following picture +summarizes how messages are routed to the ``RemoteObject``. + +.. image:: ../../../_static/architecture.drawio.light.svg + :class: only-light + +.. image:: ../../../_static/architecture.drawio.dark.svg + :class: only-dark + + +The message brokers are divided into client and server types. Servers receive a message before replying & clients +initiate message requests. + +See the documentation of ``RPCServer`` for details. + +.. toctree:: + :maxdepth: 1 + + base_zmq + zmq_server + rpc_server + zmq_client + event + + + diff --git a/doc/source/autodoc/server/zmq_message_brokers/rpc_server.rst b/doc/source/autodoc/server/zmq_message_brokers/rpc_server.rst new file mode 100644 index 0000000..23fda29 --- /dev/null +++ b/doc/source/autodoc/server/zmq_message_brokers/rpc_server.rst @@ -0,0 +1,13 @@ +.. |br| raw:: html + +<br>
+ + +RPC Server +========== + + +.. autoclass:: hololinked.server.zmq_message_brokers.RPCServer + :members: + :show-inheritance: + diff --git a/doc/source/autodoc/server/zmq_message_brokers/zmq_client.rst b/doc/source/autodoc/server/zmq_message_brokers/zmq_client.rst new file mode 100644 index 0000000..0cf83b2 --- /dev/null +++ b/doc/source/autodoc/server/zmq_message_brokers/zmq_client.rst @@ -0,0 +1,16 @@ +.. |br| raw:: html + +
+ + +ZMQ Clients +=========== + +.. autoclass:: hololinked.server.zmq_message_brokers.SyncZMQClient + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.MessageMappedZMQClientPool + :members: + :show-inheritance: + diff --git a/doc/source/autodoc/server/zmq_message_brokers/zmq_server.rst b/doc/source/autodoc/server/zmq_message_brokers/zmq_server.rst new file mode 100644 index 0000000..3390224 --- /dev/null +++ b/doc/source/autodoc/server/zmq_message_brokers/zmq_server.rst @@ -0,0 +1,19 @@ +.. |br| raw:: html + +
+ + +ZMQ Servers +=========== + +.. autoclass:: hololinked.server.zmq_message_brokers.AsyncZMQServer + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.AsyncPollingZMQServer + :members: + :show-inheritance: + +.. autoclass:: hololinked.server.zmq_message_brokers.ZMQServerPool + :members: + :show-inheritance: diff --git a/doc/source/conf.py b/doc/source/conf.py index ac3a32b..e21bb15 100644 --- a/doc/source/conf.py +++ b/doc/source/conf.py @@ -10,19 +10,19 @@ # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. # -# import os -# import sys -# sys.path.insert(0, os.path.abspath('.')) +import os +import sys +sys.path.insert(0, os.path.abspath(f'..{os.sep}..')) # -- Project information ----------------------------------------------------- project = 'hololinked' -copyright = '2023, Vignesh Venkatasubramanian Vaidyanathan' +copyright = '2024, Vignesh Venkatasubramanian Vaidyanathan' author = 'Vignesh Venkatasubramanian Vaidyanathan' # The full version, including alpha/beta/rc tags -release = '0.1' +release = '0.1.2' # -- General configuration --------------------------------------------------- @@ -33,7 +33,9 @@ extensions = [ 'sphinx.ext.autodoc', 'sphinx.ext.duration', - 'sphinx_copybutton' + 'sphinx_copybutton', + 'sphinx_toolbox.collapse', + 'numpydoc' ] # Add any paths that contain templates here, relative to this directory. @@ -55,9 +57,24 @@ "**": ["sidebar-nav-bs"] } +html_theme_options = { + "secondary_sidebar_items": { + "**" : ["page-toc", "sourcelink"], + }, + "navigation_with_keys" : True +} + pygments_style = 'vs' # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". 
-html_static_path = ['_static'] \ No newline at end of file +html_static_path = ['_static'] + +numpydoc_show_class_members = False + +autodoc_member_order = 'bysource' + +today_fmt = '%d.%m.%Y %H:%M' + + diff --git a/doc/source/development_notes.rst b/doc/source/development_notes.rst new file mode 100644 index 0000000..74c9514 --- /dev/null +++ b/doc/source/development_notes.rst @@ -0,0 +1,97 @@ +.. |module-highlighted| replace:: ``hololinked`` + +.. |br| raw:: html + +
+ + +.. _note: + +Development Notes +================= + +|module-highlighted| is fundamentally an object-oriented ZeroMQ RPC with control over the attributes, methods +and events that are exposed on the network. Nevertheless, non-trivial HTTP support exists in an attempt to cast +at least certain aspects of instrumentation control & data acquisition into web development practices, without having to +explicitly implement an HTTP server. The following is possible with significantly less code: + +* |module-highlighted| gives the freedom to choose the HTTP request method & end-point URL desirable for each + property/attribute, method and event +* All HTTP requests will be automatically queued and executed serially by the RPC server unless threaded or + made async by the developer +* JSON serialization-deserialization overheads while tunneling HTTP requests through the RPC server + are reduced to a minimum. +* Events pushed by the object will be automatically tunneled as HTTP server-sent events + +Further, web request handlers may be modified to change headers, authentication etc. or to add additional +endpoints which may make resources more REST-like, while leaving the RPC details to the package. One uses the exposed object +members as follows: + +* properties can be used to model settings of instrumentation (both hardware and software-only), + general class/instance attributes, hold captured & computed data +* methods can be used to issue commands to instruments like start and stop acquisition, connect/disconnect etc. +* events can be used to push measured data, create alerts/alarms, inform about the availability of certain types of data etc. + +Verb-like URLs may be used for methods (this acts like HTTP-RPC, although ZeroMQ mediates it) & noun-like URLs may be used +for properties and events. Further, HTTP request methods may be mapped as follows: + +..
list-table:: + :header-rows: 1 + + * - HTTP request verb/method + - property + - action/remote method + - event + * - GET + - read property value |br| (read a setting's value, fetch measured data, physical quantities) + - run a method which returns useful data |br| (which may be difficult or illogical as a ``Property``) + - stream measured data immediately when available instead of fetching every time + * - POST + - add dynamic properties with certain settings |br| (add a dynamic setting or data type etc. for which the logic is already factored in code) + - run Python logic, methods that connect/disconnect or issue commands to instruments (RPC) + - not applicable + * - PUT + - write property value |br| (modify a setting and apply it onto the device) + - change value of a resource which is difficult to factor into a property + - not applicable + * - DELETE + - remove a dynamic property |br| (remove a setting or data type for which the logic is already factored into the code) + - developer's interpretation + - not applicable + * - PATCH + - change settings of a property |br| (change the rules of how a setting can be modified and applied, how measured data can be stored etc.) + - change partial value of a resource which is difficult to factor into a property or change settings of a property with custom logic + - not applicable + +If you don't agree with the table above, use the `Thing Description ` +standard instead, which is pretty close. Considering an example device like a spectrometer, the table above may dictate the following: + +..
list-table:: + :header-rows: 1 + + * - HTTP request verb/method + - property + - remote method + - event + * - GET + - get integration time + - get accumulated dictionary of measurement settings + - stream measured spectrum + * - POST + - + - connect, disconnect, start and stop acquisition + - + * - PUT + - set integration time onto device + - + - + * - PATCH + - edit integration time bound values + - + - + + +Further, plain RPC calls directly through object proxy are possible without the details of HTTP. This is directly mediated +by ZeroMQ. + + diff --git a/doc/source/examples/index.rst b/doc/source/examples/index.rst index fc990bf..7f75bad 100644 --- a/doc/source/examples/index.rst +++ b/doc/source/examples/index.rst @@ -4,14 +4,24 @@ Examples Remote Objects -------------- +Beginner's example +__________________ + +These examples are for absolute beginners into the world of data-acquisition + .. toctree:: server/spectrometer/index -The code is hosted at the repository `hololinked-examples `_. Consider also installing +The code is hosted at the repository `hololinked-examples `_. +Consider also installing * a JSON preview tool for your browser like `Chrome JSON Viewer `_. * `hololinked-portal `_ to have an web-interface to interact with RemoteObjects (after you can run your example object) -* hoppscotch or postman +* `hoppscotch `_ or `postman `_ + +Web Development +--------------- -Some browser based client examples based on ReactJS & `react material UI `_ are hosted at `hololinked.dev `_ +Some browser based client examples based on ReactJS are hosted at +`hololinked.dev `_ diff --git a/doc/source/examples/server/spectrometer/index.rst b/doc/source/examples/server/spectrometer/index.rst index 3871373..2533cd6 100644 --- a/doc/source/examples/server/spectrometer/index.rst +++ b/doc/source/examples/server/spectrometer/index.rst @@ -103,6 +103,7 @@ with the prefix of the HTTP Server domain name and object instance name. 
log_level=logging.DEBUG, ) O.run() + To construct the full `URL_path`, the format is |br| `https://{domain name}/{instance name}/{parameter URL path}`, which gives |br| `https://localhost:8083/spectrometer/ocean-optics/USB2000-plus/serial-number` |br| for the `serial_number`. diff --git a/doc/source/howto/clients.rst b/doc/source/howto/clients.rst new file mode 100644 index 0000000..20b969e --- /dev/null +++ b/doc/source/howto/clients.rst @@ -0,0 +1,112 @@ +.. |br| raw:: html + +
+ +Connecting to Things with Clients +================================= + +When using an HTTP server, it is possible to use any HTTP client, including browser-provided clients like the ``XMLHttpRequest`` +and ``EventSource`` objects. This is the intention behind providing HTTP support. However, a few additional possibilities are noteworthy: + +Using ``hololinked.client`` +--------------------------- + +To use ZMQ transport methods to connect to the server instead of HTTP, one can use an object proxy available in +``hololinked.client``. For certain applications, for example oscilloscope traces consisting of millions of data points, +or camera images and video streams with raw, uncompressed pixels, the ZMQ transport may significantly speed +up the data transfer rate. In particular, one may use a different serializer like MessagePack instead of JSON. +JSON is the default and currently the only supported serializer for HTTP applications, and is still meant for +interfacing such data-heavy devices with HTTP clients. Nevertheless, ZMQ transport can be used simultaneously alongside HTTP. +|br| +To use a ZMQ client from a Python process other than the ``Thing``'s running process, one needs to start the +``Thing`` server using TCP or IPC (inter-process communication) transport methods and **not** with the ``run_with_http_server()`` +method (which allows only INPROC/intra-process communication). Use the ``run()`` method instead and specify the desired +ZMQ transport layers: + +.. literalinclude:: code/rpc.py + :language: python + :linenos: + :lines: 1-2, 9-13, 74-81 + +Then, import the ``ObjectProxy`` and specify the ZMQ transport method and ``instance_name`` to connect to the server and +the object it serves: + +.. literalinclude:: code/rpc_client.py + :language: python + :linenos: + :lines: 1-9 + +The exposed properties, actions and events become available on the client.
One can use get-set on properties, function +calls on actions and subscribe to events with a callback that is executed once an event arrives: + +.. literalinclude:: code/rpc_client.py + :language: python + :linenos: + :lines: 23-27 + +Such remote procedure calls would typically be made from a PyQt graphical interface, custom acquisition scripts or +measurement scan routines, which may run on the same or a different computer on the network. Use the TCP ZMQ transport +to be accessible to network clients. + +.. literalinclude:: code/rpc.py + :language: python + :linenos: + :lines: 75, 84-87 + +Irrespective of the client's request origin, whether TCP, IPC or INPROC, requests are always queued before execution. To repeat: + +* TCP - raw TCP transport facilitated by ZMQ (therefore, without the details of HTTP) for clients on the network. You might + need to open your firewall. Currently, neither encryption nor user authorization is provided; use HTTP if you + need these features. +* IPC - inter-process communication for access by other processes within the same computer. One can use this instead of + TCP for single-computer applications or to avoid firewall configuration. +* INPROC - only clients from the same python process can access the server. + +If one needs type definitions for the client because the client does not know the server to which it is connected, one +can import the ``Thing`` from the server script and set it as the type of the client as a quick hack. + +.. literalinclude:: code/rpc_client.py + :language: python + :linenos: + :lines: 15-20 + +Serializer customization is discussed further in :doc:`Serializer How-To `. + +Using ``node-wot`` client +------------------------- + +``node-wot`` is an interoperable Javascript client provided by the `Web of Things Working Group `_. 
+The purpose of this client is to interact with devices through a web-standard compatible JSON specification called +the "`Thing Description `_", which +allows interoperability irrespective of protocol implementation and application domain. This JSON specification +describes the device's available properties, actions and events and provides human-readable documentation of the device +within the specification itself, enhancing developer experience. |br| +For example, consider the ``serial_number`` property defined previously; the following JSON schema can describe the property: + +.. literalinclude:: code/node-wot/serial_number.json + :language: JSON + :linenos: + +Similarly, the ``connect`` action and ``measurement_event`` event may be described as follows: + +.. literalinclude:: code/node-wot/actions_and_events.json + :language: JSON + :linenos: + +From such a JSON specification, it is already clear how to interact with the specified property, +action or event. The ``node-wot`` client consumes such a specification to provide these interactions for the developer. +``node-wot`` already has protocol bindings like HTTP, CoAP, Modbus, MQTT etc. which can be used in nodeJS or in web browsers. +Since ``hololinked`` offers HTTP bindings for devices, such a JSON Thing Description is auto-generated by +the class so that it can be used by the node-wot client. To use the node-wot client in the browser: + +.. literalinclude:: code/node-wot/intro.js + :language: javascript + :linenos: + +There are a few reasons one might consider using ``node-wot`` over a traditional HTTP client, first and foremost being +standardisation across different protocols. Irrespective of hardware protocol support, including the HTTP bindings from +``hololinked``, one can use the same API. For example, one can directly issue Modbus calls to a Modbus device while +issuing HTTP calls to ``hololinked`` ``Thing``s. 
Further, node-wot offers validation of property types, action payloads and +return values, and event data without additional programming effort. + diff --git a/doc/source/howto/code/4.py b/doc/source/howto/code/4.py new file mode 100644 index 0000000..d4778fc --- /dev/null +++ b/doc/source/howto/code/4.py @@ -0,0 +1,57 @@ +from hololinked.server import RemoteObject, remote_method, HTTPServer, Event +from hololinked.server.remote_parameters import String, ClassSelector +from seabreeze.spectrometers import Spectrometer +import numpy + + +class OceanOpticsSpectrometer(RemoteObject): + """ + Spectrometer example object + """ + + serial_number = String(default=None, allow_None=True, constant=True, + URL_path="/serial-number", + doc="serial number of the spectrometer") + + model = String(default=None, URL_path='/model', allow_None=True, + doc="model of the connected spectrometer") + + def __init__(self, instance_name, serial_number, connect, **kwargs): + super().__init__(instance_name=instance_name, **kwargs) + self.serial_number = serial_number + if connect and self.serial_number is not None: + self.connect() + self.measurement_event = Event(name='intensity-measurement', + URL_path='/intensity/measurement-event') + + @remote_method(URL_path='/connect', http_method="POST") + def connect(self, trigger_mode = None, integration_time = None): + self.device = Spectrometer.from_serial_number(self.serial_number) + self.model = self.device.model + self.logger.debug(f"opened device with serial number \ + {self.serial_number} with model {self.model}") + if trigger_mode is not None: + self.trigger_mode = trigger_mode + if integration_time is not None: + self.integration_time = integration_time + + intensity = ClassSelector(class_=(numpy.ndarray, list), default=[], + doc="captured intensity", readonly=True, + URL_path='/intensity', fget=lambda self: self._intensity) + + def capture(self): + self._run = True + while self._run: + self._intensity = self.device.intensities( + 
correct_dark_counts=True, + correct_nonlinearity=True + ) + self.measurement_event.push(self._intensity.tolist()) + + +if __name__ == '__main__': + spectrometer = OceanOpticsSpectrometer(instance_name='spectrometer', + serial_number='USB2+H15897', connect=True) + spectrometer.run( + http_server=HTTPServer(port=8080) + ) diff --git a/doc/source/howto/code/eventloop/import.py b/doc/source/howto/code/eventloop/import.py new file mode 100644 index 0000000..7054890 --- /dev/null +++ b/doc/source/howto/code/eventloop/import.py @@ -0,0 +1,8 @@ +from hololinked.client import ObjectProxy +from hololinked.server import EventLoop + +eventloop_proxy = ObjectProxy(instance_name='eventloop', protocol="TCP", + socket_address="tcp://192.168.0.10:60000") #type: EventLoop +obj_id = eventloop_proxy.import_remote_object(file_name=r"D:\path\to\file\IDSCamera", + object_name="UEyeCamera") +eventloop_proxy.instantiate(obj_id, instance_name='camera', device_id=3) \ No newline at end of file diff --git a/doc/source/howto/code/eventloop/list_of_devices.py b/doc/source/howto/code/eventloop/list_of_devices.py new file mode 100644 index 0000000..c761cb1 --- /dev/null +++ b/doc/source/howto/code/eventloop/list_of_devices.py @@ -0,0 +1,23 @@ +from oceanoptics_spectrometer import OceanOpticsSpectrometer +from IDSCamera import UEyeCamera +from hololinked.server import EventLoop, HTTPServer +from multiprocessing import Process + + +def start_http_server(): + server = HTTPServer(remote_objects=['spectrometer', 'eventloop', 'camera']) + server.start() + +if __name__ == '__main__': + + Process(target=start_http_server).start() + + spectrometer = OceanOpticsSpectrometer(instance_name='spectrometer', + serial_number='USB2+H15897') + + camera = UEyeCamera(instance_name='camera', camera_id=3) + + EventLoop( + [spectrometer, camera], + instance_name='eventloop', + ).run() \ No newline at end of file diff --git a/doc/source/howto/code/eventloop/run_eq.py new 
file mode 100644 index 0000000..9c59938 --- /dev/null +++ b/doc/source/howto/code/eventloop/run_eq.py @@ -0,0 +1,19 @@ +from oceanoptics_spectrometer import OceanOpticsSpectrometer +from IDSCamera import UEyeCamera +from hololinked.server import EventLoop, HTTPServer +from multiprocessing import Process + + +def start_http_server(): + server = HTTPServer(remote_objects=['spectrometer', 'eventloop']) + server.start() + +if __name__ == '__main__': + + Process(target=start_http_server).start() + + EventLoop( + OceanOpticsSpectrometer(instance_name='spectrometer', + serial_number='USB2+H15897'), + instance_name='eventloop', + ).run() \ No newline at end of file diff --git a/doc/source/howto/code/eventloop/threaded.py b/doc/source/howto/code/eventloop/threaded.py new file mode 100644 index 0000000..14a5038 --- /dev/null +++ b/doc/source/howto/code/eventloop/threaded.py @@ -0,0 +1,24 @@ +from oceanoptics_spectrometer import OceanOpticsSpectrometer +from IDSCamera import UEyeCamera +from hololinked.server import EventLoop, HTTPServer +from multiprocessing import Process + + +def start_http_server(): + server = HTTPServer(remote_objects=['spectrometer', 'eventloop', 'camera']) + server.start() + +if __name__ == '__main__': + + Process(target=start_http_server).start() + + spectrometer = OceanOpticsSpectrometer(instance_name='spectrometer', + serial_number='USB2+H15897') + + camera = UEyeCamera(instance_name='camera', camera_id=3) + + EventLoop( + [spectrometer, camera], + instance_name='eventloop', + threaded=True + ).run() \ No newline at end of file diff --git a/doc/source/howto/code/node-wot/actions_and_events.json b/doc/source/howto/code/node-wot/actions_and_events.json new file mode 100644 index 0000000..2cead39 --- /dev/null +++ b/doc/source/howto/code/node-wot/actions_and_events.json @@ -0,0 +1,25 @@ +{ + "actions" : { + "connect": { + "title": "connect", + "description" : "connect to the spectrometer specified by serial number", + "forms": [{ + "href": 
"https://example.com/spectrometer/connect", + "op": "invokeaction", + "htv:methodName": "POST", + "contentType": "application/json" + }] + } + }, + "events" : { + "measurement_event": { + "forms": [{ + "href": "https://example.com/spectrometer/intensity/measurement-event", + "op": "subscribeevent", + "htv:methodName": "GET", + "contentType": "text/event-stream" + }] + } + + } +} diff --git a/doc/source/howto/code/node-wot/intro.js b/doc/source/howto/code/node-wot/intro.js new file mode 100644 index 0000000..c897449 --- /dev/null +++ b/doc/source/howto/code/node-wot/intro.js @@ -0,0 +1,30 @@ +import 'wot-bundle.min.js'; + +servient = new Wot.Core.Servient(); // auto-imported by wot-bundle.min.js +servient.addClientFactory(new Wot.Http.HttpsClientFactory({ allowSelfSigned : true })) + +servient.start().then(async (WoT) => { + console.debug("WoT servient started") + let td = await WoT.requestThingDescription( + "https://example.com/spectrometer/resources/wot-td") + // replace with your own PC hostname + + spectrometer = await WoT.consume(td); + console.info("consumed thing description from spectrometer") + + // read and write property + await spectrometer.writeProperty("serial_number", { "value" : "USB2+H15897"}) + console.log(await (await spectrometer.readProperty("serial number")).value()) + + //call actions + await spectrometer.invokeAction("connect") + + spectrometer.subscribeEvent("measurement_event", async(data) => { + const value = await data.value() + console.log("event : ", value) + }).then((subscription) => { + console.debug("subscribed to intensity measurement event") + }) + + await spectrometer.invokeAction("capture") +}) diff --git a/doc/source/howto/code/node-wot/serial_number.json b/doc/source/howto/code/node-wot/serial_number.json new file mode 100644 index 0000000..b092d13 --- /dev/null +++ b/doc/source/howto/code/node-wot/serial_number.json @@ -0,0 +1,23 @@ +{ + "serial_number" : { + "title": "serial_number", + "description": "serial number of the 
spectrometer to connect/or connected", + "constant": false, + "readOnly": false, + "type": "string", + "forms": [ + { + "href": "https://example.com/spectrometer/serial-number", + "op": "readproperty", + "htv:methodName": "GET", + "contentType": "application/json" + }, + { + "href": "https://example.com/spectrometer/serial-number", + "op": "writeproperty", + "htv:methodName": "PUT", + "contentType": "application/json" + } + ] + } +} diff --git a/doc/source/howto/code/properties/common_arg_1.py b/doc/source/howto/code/properties/common_arg_1.py new file mode 100644 index 0000000..2cf1c8d --- /dev/null +++ b/doc/source/howto/code/properties/common_arg_1.py @@ -0,0 +1,37 @@ +from hololinked.server import RemoteObject +from hololinked.server.remote_parameters import String, Number, TypedList + + +class OceanOpticsSpectrometer(RemoteObject): + """ + Spectrometer example object + """ + + serial_number = String(default="USB2+H15897", allow_None=False, readonly=True, + doc="serial number of the spectrometer (string)") # type: str + + integration_time_millisec = Number(default=1000, bounds=(0.001, None), + crop_to_bounds=True, allow_None=False, + doc="integration time of measurement in milliseconds") + + model = String(default=None, allow_None=True, constant=True, + doc="model of the connected spectrometer") + + custom_background_intensity = TypedList(item_type=(float, int), default=None, + allow_None=True, + doc="user provided background subtraction intensity") + + def __init__(self, instance_name, serial_number, integration_time) -> None: + super().__init__(instance_name=instance_name) + # allow_None + self.custom_background_intensity = None # OK + self.custom_background_intensity = [] # OK + self.custom_background_intensity = None # OK + # following raises TypeError because allow_None = False + self.integration_time_millisec = None + # readonly - following raises ValueError because readonly = True + self.serial_number = serial_number + # constant + self.model = None # 
OK - constant accepts None + self.model = 'USB2000+' # OK - can be set once + self.model = None # raises TypeError \ No newline at end of file diff --git a/doc/source/howto/code/properties/common_arg_2.py b/doc/source/howto/code/properties/common_arg_2.py new file mode 100644 index 0000000..1aa3736 --- /dev/null +++ b/doc/source/howto/code/properties/common_arg_2.py @@ -0,0 +1,33 @@ +from hololinked.server import RemoteObject, RemoteParameter +from enum import IntEnum + + +class ErrorCodes(IntEnum): + IS_NO_SUCCESS = -1 + IS_SUCCESS = 0 + IS_INVALID_CAMERA_HANDLE = 1 + IS_CANT_OPEN_DEVICE = 3 + IS_CANT_CLOSE_DEVICE = 4 + + @classmethod + def json(cls): + # error code to code name - the inverse mapping of the enum definition + return {value.value : name for name, value in vars(cls).items() if isinstance( + value, cls)} + + +class IDSCamera(RemoteObject): + """ + Camera example object + """ + error_codes = RemoteParameter(readonly=True, default=ErrorCodes.json(), + class_member=True, + doc="error codes raised by IDS library") + + def __init__(self, instance_name : str): + super().__init__(instance_name=instance_name) + print("error codes", IDSCamera.error_codes) # prints error codes + + +if __name__ == '__main__': + IDSCamera(instance_name='test') \ No newline at end of file diff --git a/doc/source/howto/code/properties/typed.py b/doc/source/howto/code/properties/typed.py new file mode 100644 index 0000000..f1294c5 --- /dev/null +++ b/doc/source/howto/code/properties/typed.py @@ -0,0 +1,33 @@ +from hololinked.server import RemoteObject +from hololinked.server.remote_parameters import String, Number + + +class OceanOpticsSpectrometer(RemoteObject): + """ + Spectrometer example object + """ + + serial_number = String(default="USB2+H15897", allow_None=False, readonly=True, + doc="serial number of the spectrometer (string)") # type: str + + integration_time_millisec = Number(default=1000, bounds=(0.001, None), + crop_to_bounds=True, + doc="integration time of measurement in 
milliseconds") + + def __init__(self, instance_name, serial_number, integration_time) -> None: + super().__init__(instance_name=instance_name) + self.serial_number = serial_number # raises ValueError because readonly=True + + @integration_time_millisec.setter + def set_integration_time(self, value): + # value is already validated as a float or int + # & cropped to specified bounds when this setter invoked + self.device.integration_time_micros(int(value*1000)) + self._integration_time_ms = int(value) + + @integration_time_millisec.getter + def get_integration_time(self): + try: + return self._integration_time_ms + except: + return self.parameters["integration_time_millisec"].default \ No newline at end of file diff --git a/doc/source/howto/code/properties/untyped.py b/doc/source/howto/code/properties/untyped.py new file mode 100644 index 0000000..d3c450b --- /dev/null +++ b/doc/source/howto/code/properties/untyped.py @@ -0,0 +1,32 @@ +from hololinked.server import RemoteObject, RemoteParameter + + +class TestObject(RemoteObject): + + my_untyped_serializable_attribute = RemoteParameter(default=5, + allow_None=False, doc="this parameter can hold any value") + + my_custom_typed_serializable_attribute = RemoteParameter(default=[2, "foo"], + allow_None=False, doc="this parameter can hold any value") + + @my_custom_typed_serializable_attribute.getter + def get_param(self): + try: + return self._foo + except AttributeError: + return self.parameters.descriptors["my_custom_typed_serializable_attribute"].default + + @my_custom_typed_serializable_attribute.setter + def set_param(self, value): + if isinstance(value, (list, tuple)) and len(value) < 100: + for index, val in enumerate(value): + if not isinstance(val, (str, int, type(None))): + raise ValueError(f"Value at position {index} not acceptable member" + " type of my_custom_typed_serializable_attribute", + f" but type {type(val)}") + self._foo = value + else: + raise TypeError(f"Given type is not list or tuple", + f" for 
my_custom_typed_serializable_attribute but type {type(value)}") + + diff --git a/doc/source/howto/code/rpc.py b/doc/source/howto/code/rpc.py new file mode 100644 index 0000000..7dfb2ce --- /dev/null +++ b/doc/source/howto/code/rpc.py @@ -0,0 +1,87 @@ +import logging, os, ssl +from multiprocessing import Process +import threading +from hololinked.server import HTTPServer, Thing, Property, action, Event +from hololinked.server.constants import HTTP_METHODS +from hololinked.server.properties import String, List +from seabreeze.spectrometers import Spectrometer + + +class OceanOpticsSpectrometer(Thing): + """ + Spectrometer example object + """ + + serial_number = String(default=None, allow_None=True, constant=True, + URL_path='/serial-number', + doc="serial number of the spectrometer") # type: str + + def __init__(self, instance_name, serial_number, autoconnect, **kwargs): + super().__init__(instance_name=instance_name, **kwargs) + self.serial_number = serial_number + if autoconnect and self.serial_number is not None: + self.connect() + self.measurement_event = Event(name='intensity-measurement') + self._acquisition_thread = None + + @action(URL_path='/connect') + def connect(self, trigger_mode = None, integration_time = None): + self.device = Spectrometer.from_serial_number(self.serial_number) + if trigger_mode: + self.device.trigger_mode(trigger_mode) + if integration_time: + self.device.integration_time_micros(integration_time) + + intensity = List(default=None, allow_None=True, doc="captured intensity", + readonly=True, fget=lambda self: self._intensity.tolist()) + + def capture(self): + self._run = True + while self._run: + self._intensity = self.device.intensities( + correct_dark_counts=True, + correct_nonlinearity=True + ) + self.measurement_event.push(self._intensity.tolist()) + + @action(URL_path='/acquisition/start', http_method=HTTP_METHODS.POST) + def start_acquisition(self): + if self._acquisition_thread is None: + self._acquisition_thread = 
threading.Thread(target=self.capture) + self._acquisition_thread.start() + + @action(URL_path='/acquisition/stop', http_method=HTTP_METHODS.POST) + def stop_acquisition(self): + if self._acquisition_thread is not None: + self.logger.debug(f"stopping acquisition thread with thread-ID {self._acquisition_thread.ident}") + self._run = False # break infinite loop + # wait for the acquisition loop to finish its last iteration + self._acquisition_thread.join() + self._acquisition_thread = None + + +def start_https_server(): + ssl_context = ssl.SSLContext(protocol=ssl.PROTOCOL_TLS) + ssl_context.load_cert_chain(f'assets{os.sep}security{os.sep}certificate.pem', + keyfile = f'assets{os.sep}security{os.sep}key.pem') + # You need to create a certificate on your own, or run without one + # for a quick start; however, events will not be supported by browsers + # if there is no SSL + + HTTPServer(['spectrometer'], port=8083, ssl_context=ssl_context, + log_level=logging.DEBUG).listen() + + +if __name__ == "__main__": + + Process(target=start_https_server).start() + + spectrometer = OceanOpticsSpectrometer(instance_name='spectrometer', + serial_number=None, autoconnect=False) + spectrometer.run(zmq_protocols="IPC") + + # example code, but will never reach here unless exit() is called by the client + spectrometer = OceanOpticsSpectrometer(instance_name='spectrometer', + serializer='msgpack', serial_number=None, autoconnect=False) + spectrometer.run(zmq_protocols=["TCP", "IPC"], + tcp_socket_address="tcp://0.0.0.0:6539") \ No newline at end of file diff --git a/doc/source/howto/code/rpc_client.py b/doc/source/howto/code/rpc_client.py new file mode 100644 index 0000000..0f002cd --- /dev/null +++ b/doc/source/howto/code/rpc_client.py @@ -0,0 +1,28 @@ +from hololinked.client import ObjectProxy + +spectrometer_proxy = ObjectProxy(instance_name='spectrometer', + serializer='msgpack', protocol='IPC') +# setting and getting property +spectrometer_proxy.serial_number = 'USB2+H15897' 
+print(spectrometer_proxy.serial_number) +# calling actions +spectrometer_proxy.connect() +spectrometer_proxy.capture() + +exit(0) + +# TCP and event example +from hololinked.client import ObjectProxy +from oceanoptics_spectrometer import OceanOpticsSpectrometer + +spectrometer_proxy = ObjectProxy(instance_name='spectrometer', protocol='TCP', + serializer='msgpack', socket_address="tcp://192.168.0.100:6539") # type: OceanOpticsSpectrometer +spectrometer_proxy.serial_number = 'USB2+H15897' +spectrometer_proxy.connect() # provides type definitions corresponding to server + +def event_cb(event_data): + print(event_data) + +spectrometer_proxy.subscribe_event(name='intensity-measurement', callbacks=event_cb) +# name can be the name given to the event on the server side or the +# python attribute to which the Event was assigned. diff --git a/doc/source/howto/code/thing_inheritance.py b/doc/source/howto/code/thing_inheritance.py new file mode 100644 index 0000000..b7370c4 --- /dev/null +++ b/doc/source/howto/code/thing_inheritance.py @@ -0,0 +1,16 @@ +from hololinked.server import Thing, Property, action, Event + +class Spectrometer(Thing): + """ + add class doc here + """ + def __init__(self, instance_name, serial_number, autoconnect, **kwargs): + super().__init__(instance_name=instance_name, **kwargs) + self.serial_number = serial_number + if autoconnect: + self.connect() + + def connect(self): + """implement device driver logic to connect to hardware""" + pass + \ No newline at end of file diff --git a/doc/source/howto/code/thing_with_http_server.py b/doc/source/howto/code/thing_with_http_server.py new file mode 100644 index 0000000..f00b17a --- /dev/null +++ b/doc/source/howto/code/thing_with_http_server.py @@ -0,0 +1,96 @@ +import threading, logging +from hololinked.server import Thing, Property, action, Event +from hololinked.server.properties import Number, Selector, String, List +from hololinked.server.constants import HTTP_METHODS +from 
seabreeze.spectrometers import Spectrometer + + +class OceanOpticsSpectrometer(Thing): + """ + Spectrometer example object + """ + + serial_number = String(default=None, allow_None=True, constant=True, + URL_path='/serial-number', + doc="serial number of the spectrometer") # type: str + + def __init__(self, instance_name, serial_number, autoconnect, **kwargs): + super().__init__(instance_name=instance_name, serial_number=serial_number, + **kwargs) + if autoconnect and self.serial_number is not None: + self.connect(trigger_mode=0, integration_time=int(1e6)) # let's say, by default + self._acquisition_thread = None + self.measurement_event = Event(name='intensity-measurement') + + @action(URL_path='/connect') + def connect(self, trigger_mode, integration_time): + self.device = Spectrometer.from_serial_number(self.serial_number) + if trigger_mode: + self.device.trigger_mode(trigger_mode) + if integration_time: + self.device.integration_time_micros(integration_time) + + integration_time = Number(default=1000, bounds=(0.001, None), crop_to_bounds=True, + URL_path='/integration-time', + doc="integration time of measurement in milliseconds") + + @integration_time.setter + def apply_integration_time(self, value : float): + self.device.integration_time_micros(int(value*1000)) + self._integration_time = int(value) + + @integration_time.getter + def get_integration_time(self) -> float: + try: + return self._integration_time + except AttributeError: + return self.parameters["integration_time"].default + + trigger_mode = Selector(objects=[0, 1, 2, 3, 4], default=0, URL_path='/trigger-mode', + doc="""0 = normal/free running, 1 = Software trigger, 2 = Ext. Trigger Level, + 3 = Ext. Trigger Synchro/ Shutter mode, 4 = Ext. 
Trigger Edge""") + + @trigger_mode.setter + def apply_trigger_mode(self, value : int): + self.device.trigger_mode(value) + self._trigger_mode = value + + @trigger_mode.getter + def get_trigger_mode(self): + try: + return self._trigger_mode + except AttributeError: + return self.parameters["trigger_mode"].default + + intensity = List(default=None, allow_None=True, doc="captured intensity", + readonly=True, fget=lambda self: self._intensity.tolist()) + + def capture(self): + self._run = True + while self._run: + self._intensity = self.device.intensities( + correct_dark_counts=False, + correct_nonlinearity=False + ) + self.measurement_event.push(self._intensity.tolist()) + self.logger.debug("pushed measurement event") + + @action(URL_path='/acquisition/start', http_method=HTTP_METHODS.POST) + def start_acquisition(self): + if self._acquisition_thread is None: + self._acquisition_thread = threading.Thread(target=self.capture) + self._acquisition_thread.start() + + @action(URL_path='/acquisition/stop', http_method=HTTP_METHODS.POST) + def stop_acquisition(self): + if self._acquisition_thread is not None: + self.logger.debug(f"stopping acquisition thread with thread-ID {self._acquisition_thread.ident}") + self._run = False # break infinite loop + self._acquisition_thread.join() + self._acquisition_thread = None + + +if __name__ == '__main__': + spectrometer = OceanOpticsSpectrometer(instance_name='spectrometer', + serial_number='S14155', autoconnect=True, log_level=logging.DEBUG) + spectrometer.run_with_http_server(port=3569) diff --git a/doc/source/howto/eventloop.rst b/doc/source/howto/eventloop.rst new file mode 100644 index 0000000..cf9ab0f --- /dev/null +++ b/doc/source/howto/eventloop.rst @@ -0,0 +1,35 @@ +Customizing Eventloop +===================== + +The ``EventLoop`` object is a server-side object that runs the ZMQ message listeners & executes the operations +of the ``RemoteObject``. 
Operations only include parameter read-write & method execution; events are pushed synchronously +wherever they are called. +The EventLoop is also a ``RemoteObject`` by itself. A default eventloop is created by the ``RemoteObject.run()`` method to +simplify usage; however, one may benefit from using it directly. + +To start a ``RemoteObject`` using the ``EventLoop``, pass the instantiated object to the ``__init__()``: + +.. literalinclude:: code/eventloop/run_eq.py + :language: python + :linenos: + +Exposing the EventLoop allows adding new ``RemoteObject``'s on the fly whenever necessary. To run multiple objects +in the same eventloop, pass the objects as a list. + +.. literalinclude:: code/eventloop/list_of_devices.py + :language: python + :linenos: + :lines: 7- + +Setting ``threaded`` to ``True`` runs each RemoteObject in its own thread. + +.. literalinclude:: code/eventloop/threaded.py + :language: python + :linenos: + :lines: 20- + +Use proxies to import a new object from somewhere else: + +.. literalinclude:: code/eventloop/import.py + :language: python + :linenos: \ No newline at end of file diff --git a/doc/source/howto/http_server.rst b/doc/source/howto/http_server.rst new file mode 100644 index 0000000..5873f11 --- /dev/null +++ b/doc/source/howto/http_server.rst @@ -0,0 +1,45 @@ +Connect HTTP server to RemoteObject +=================================== + +To also use an HTTP server, one needs to specify URL paths and HTTP request verbs for the parameters and methods. + +One needs to start an instance of ``HTTPServer`` before ``run()``. When passed to ``run()``, +the ``HTTPServer`` will communicate with the ``RemoteObject`` through the fastest means +possible - intra-process communication. + +.. literalinclude:: code/thing_with_http_server.py + :language: python + :linenos: + +The ``HTTPServer`` and ``RemoteObject`` will run in different threads and the python global +interpreter lock will still allow only one thread at a time. 
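This queued, one-at-a-time execution can be pictured with plain ``threading`` & ``queue`` from the python standard library (a standalone sketch for illustration only, not ``hololinked`` code): the server thread only enqueues requests, while the object's thread executes them serially.

```python
import threading, queue

# stand-in for the request queue between HTTPServer and RemoteObject
requests = queue.Queue()

def http_server_thread():
    # stands in for the HTTPServer thread: it only enqueues requests
    for path in ('/spectrometer/serial-number', '/spectrometer/connect'):
        requests.put(path)
    requests.put(None)  # sentinel: no more requests

threading.Thread(target=http_server_thread).start()

handled = []
while (req := requests.get()) is not None:
    handled.append(req)  # the object's thread executes queued operations one at a time

print(handled)
```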
+ +One can store captured data in parameters & push events to supply clients with the measured +data: + +.. literalinclude:: code/thing_with_http_server.py + :language: python + :linenos: + +When using an HTTP server, events will also be tunneled as HTTP server-sent events at the specified URL +path. + +Endpoints available to the HTTP server are constructed as follows: + +.. list-table:: + + * - remote resource + - default HTTP request method + - URL construction + * - parameter read + - GET + - `http(s)://{domain name}/{instance name}/{parameter URL path}` + * - parameter write + - PUT + - `http(s)://{domain name}/{instance name}/{parameter URL path}` + * - method execution + - POST + - `http(s)://{domain name}/{instance name}/{method URL path}` + * - Event + - GET + - `http(s)://{domain name}/{instance name}/{event URL path}` diff --git a/doc/source/howto/index.rst b/doc/source/howto/index.rst new file mode 100644 index 0000000..a5d402a --- /dev/null +++ b/doc/source/howto/index.rst @@ -0,0 +1,82 @@ +.. |module| replace:: hololinked + +.. |module-highlighted| replace:: ``hololinked`` + +.. toctree:: + :hidden: + :maxdepth: 2 + + Expose Python Classes + clients + + +Expose Python Classes +===================== + +Python objects visible on the network or to other processes are made by subclassing from ``Thing``: + +.. literalinclude:: code/thing_inheritance.py + :language: python + :linenos: + +``instance_name`` is a unique name recognising the instantiated object. It allows multiple +instruments of the same type to be connected to the same computer without overlapping the exposed interface and is therefore a +mandatory argument to be supplied to the ``Thing`` parent. When kept unique within the network, it allows +identification of the hardware itself. Non-experts may use strings composed of +characters, numbers, dashes and forward slashes, which look like part of a browser URL, but the general definition is +that ``instance_name`` should be a URI-compatible string. 
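For instance, one quick way to check whether a chosen ``instance_name`` is URI-compatible, using only the python standard library (this helper is illustrative and not part of ``hololinked``):

```python
from urllib.parse import quote

# A name is URI-compatible if percent-encoding it, while keeping
# '/' safe, leaves the string unchanged.
def is_uri_compatible(name: str) -> bool:
    return quote(name, safe='/') == name

print(is_uri_compatible('spectrometer/ocean-optics/USB2000-plus'))  # True
print(is_uri_compatible('my spectrometer'))  # False - spaces get percent-encoded
```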
+ +For attributes (like serial number above), if one requires them to be exposed on the network, one should +use "properties" defined in ``hololinked.server.properties`` to "type define" (in a python sense) attributes of the object. + +.. literalinclude:: code/thing_with_http_server.py + :language: python + :linenos: + :lines: 2, 5-19 + +Only properties defined in ``hololinked.server.properties`` or a subclass of the ``Property`` object (note the capital 'P') +can be exposed to the network, not normal python attributes or python's own ``property``. + +For methods to be exposed on the network, one can use the ``action`` decorator: + +.. literalinclude:: code/thing_with_http_server.py + :language: python + :linenos: + :lines: 2-3, 7-19, 24-31 + +Arbitrary signatures are permitted. Arguments are loosely typed and may need to be constrained with a schema, based +on the robustness the developer expects in their application. However, a schema is optional and it only matters that +the method signature matches when requested from a client. + +To start an HTTP server for the ``Thing``, one can call the ``run_with_http_server()`` method after instantiating the +``Thing``. The ``URL_path`` supplied to the actions and properties is used by this HTTP server: + +.. literalinclude:: code/thing_with_http_server.py + :language: python + :linenos: + :lines: 93-96 + + +By default, this starts an HTTP server and an INPROC ZMQ socket (GIL-constrained intra-process as far as python is +concerned) for the HTTP server to direct the requests to the ``Thing`` object. All requests are queued by default as the +domain of operation under the hood is remote procedure calls (RPC). + +One can store captured data in properties & push events to supply clients with the measured data: + +.. 
literalinclude:: code/thing_with_http_server.py + :language: python + :linenos: + :lines: 2-3, 5-19, 64-82 + +Events can be defined as class or instance attributes and will be tunnelled as HTTP server sent events. +Events are to be used to asynchronously push data to clients. + +It can be summarized that the three main building blocks of a network exposed object, or a hardware ``Thing`` are: + +* properties - use them to model settings of instrumentation (both hardware and software-only), + expose general class/instance attributes, captured & computed data +* actions - use them to issue commands to instruments like start and stop acquisition, connect/disconnect etc. +* events - push measured data, create alerts/alarms, inform availability of certain type of data etc. + +Each are separately discussed in depth in their respective sections within the doc found on the section navigation. + diff --git a/doc/source/howto/methods/index.rst b/doc/source/howto/methods/index.rst new file mode 100644 index 0000000..48a0e75 --- /dev/null +++ b/doc/source/howto/methods/index.rst @@ -0,0 +1,15 @@ +Actions (or remote methods) +=========================== + +Only methods decorated with ``action()`` are exposed to clients. + +.. literalinclude:: ../code/4.py + :lines: 1-10, 26-36 + +Since python is loosely typed, the server may need to verify the argument types +supplied by the client call. This verification is left to the developer and there +is no elaborate support for this. One may consider using a ``ParameterizedFunction`` for +this or supplying a JSON schema to the argument ``input_schema`` of ``remote_method`` + +To constrain method excecution for certain states of the StateMachine, one can +set the state in the decorator. 
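The state-constrained execution described above can be pictured with a plain-Python decorator. This is a conceptual sketch only; hololinked implements the check internally and the names here (``allowed_in``, ``Spectrometer``) are hypothetical:

```python
from functools import wraps

def allowed_in(*states):
    """Illustrative decorator: reject a call unless the object is in an allowed state."""
    def decorator(func):
        @wraps(func)
        def wrapper(self, *args, **kwargs):
            if self.state not in states:
                raise RuntimeError(
                    f"{func.__name__}() not allowed in state '{self.state}'")
            return func(self, *args, **kwargs)
        return wrapper
    return decorator

class Spectrometer:
    def __init__(self):
        self.state = "IDLE"

    @allowed_in("IDLE")
    def start_acquisition(self):
        self.state = "RUNNING"

    @allowed_in("RUNNING")
    def stop_acquisition(self):
        self.state = "IDLE"

device = Spectrometer()
device.start_acquisition()  # succeeds: state was IDLE
# calling device.start_acquisition() again would raise RuntimeError (state is RUNNING)
```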
\ No newline at end of file diff --git a/doc/source/howto/properties/arguments.rst b/doc/source/howto/properties/arguments.rst new file mode 100644 index 0000000..f5e7dcd --- /dev/null +++ b/doc/source/howto/properties/arguments.rst @@ -0,0 +1,88 @@ +Common arguments to all parameters
+==================================

+``allow_None``, ``constant`` & ``readonly``
++++++++++++++++++++++++++++++++++++++++++++

+* if ``allow_None`` is ``True``, the parameter accepts ``None`` in addition to its own type
+* ``readonly`` (when ``True``) makes the parameter read-only, i.e. only the getter method is executed
+* ``constant`` (when ``True``) again makes the parameter read-only, but it can be set once if ``allow_None`` is ``True``.
  This is useful to set the parameter once at ``__init__()`` while it remains constant thereafter.

.. literalinclude:: ../code/parameters/common_arg_1.py
   :language: python
   :linenos:

``default``, ``class_member``, ``fget``, ``fset`` & ``fdel``
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Providing getter-setter (& deleter) methods is optional. If none is given, the value is stored inside
the instance's ``__dict__`` under the name `_param_value`. If no such
value was stored because a value assignment was never made on the parameter, say during
``__init__``, ``default`` is returned.

If a setter/deleter is given, a getter is mandatory. In this case, ``default`` is ignored & the getter is
always executed. If the default is desirable, one has to return it manually in the getter method by
accessing the parameter descriptor object directly.

If ``class_member`` is ``True``, the value is set in the class's ``__dict__`` instead of the instance's ``__dict__``.
Custom getter-setter-deleter methods are currently not compatible with this option. ``class_member`` takes precedence
over fget-fset-fdel, which in turn take precedence over ``default``.

.. literalinclude:: ../code/parameters/common_arg_2.py
   :language: python
   :linenos:
   :lines: 5-29

``class_member`` can still be used with a default value if there is no custom fget-fset-fdel.

``doc`` and ``label``
++++++++++++++++++++++

``doc`` allows clients to fetch a docstring for the parameter. ``label`` can be used to show the parameter
in a GUI, for example. hololinked-portal uses these two values in exactly this fashion.

``remote``
++++++++++

Setting ``remote`` to ``False`` makes the parameter local. This is still useful to type-restrict Python attributes and
provide an interface to other developers using your class, for example, when someone else inherits your ``RemoteObject``.

``URL_path`` and ``http_method``
++++++++++++++++++++++++++++++++

This setting is applicable only to the ``HTTPServer``. ``URL_path`` makes the parameter available for
getter-setter-deleter methods at the specified URL. The default HTTP request verb/method is GET for the getter,
PUT for the setter and DELETE for the deleter. If one wants to change the setter to the POST method instead of PUT,
one can set ``http_method = ("GET", "POST", "DELETE")``. Even without a custom getter-setter
(in which case the internal name stated above is generated for the parameter), one can modify the ``http_method``.
Setting any of the request methods to ``None`` makes the parameter inaccessible for that respective operation.

``state``
+++++++++

When ``state`` is specified, the parameter is writeable only when the RemoteObject's StateMachine is in that state (or
in the list of allowed states). This is also currently applicable only when set operations are called by clients.
Local set operations are always executed irrespective of the state machine's state. A get operation is likewise always
executed, even from clients, irrespective of the state.

``metadata``
++++++++++++

This argument allows storing arbitrary metadata about the parameter in a dictionary. For example, one can store the
units of the physical quantity.
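The lookup order described earlier (custom getter wins, otherwise the internally stored instance value, otherwise ``default``) can be sketched with a plain Python descriptor. This is a conceptual illustration under assumed names (``SimpleParameter``, ``Oscilloscope``), not hololinked's actual implementation:

```python
class SimpleParameter:
    """Illustrative descriptor mimicking the described lookup order."""
    def __init__(self, default=None, fget=None):
        self.default = default
        self.fget = fget

    def __set_name__(self, owner, name):
        # internal name, analogous to the auto-generated '_<name>_value'
        self.storage_name = f"_{name}_value"

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        if self.fget is not None:
            # a custom getter is always executed; default is ignored
            return self.fget(obj)
        # otherwise fall back to the stored instance value, then the default
        return obj.__dict__.get(self.storage_name, self.default)

    def __set__(self, obj, value):
        obj.__dict__[self.storage_name] = value

class Oscilloscope:
    timebase = SimpleParameter(default=1e-3)

scope = Oscilloscope()
print(scope.timebase)   # 0.001 - nothing stored yet, default returned
scope.timebase = 5e-3
print(scope.timebase)   # 0.005 - now read from the instance __dict__
```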
+
+``db_init``, ``db_commit`` & ``db_persist``
++++++++++++++++++++++++++++++++++++++++++++

+Parameters can be stored in & loaded from a database, if necessary, when the ``RemoteObject`` is stopped and restarted.

+* ``db_init`` only loads a parameter from the database; when the value is changed, it is not written back to the database.
   For this option, the value has to be pre-created in the database in some other fashion. hololinked-portal can help here.

+* ``db_commit`` only writes the value into the database when an assignment is made.

+* ``db_persist`` both stores the parameter in and loads it from the database.

+MySQL, Postgres & SQLite are currently supported. Look at the database how-to for supplying the database configuration.

diff --git a/doc/source/howto/properties/extending.rst b/doc/source/howto/properties/extending.rst new file mode 100644 index 0000000..bbffd13 --- /dev/null +++ b/doc/source/howto/properties/extending.rst @@ -0,0 +1,9 @@ +Extending
+=========

+Remote parameters can also be extended to define custom types for specific requirements.
+Type validation is carried out in the ``validate_and_adapt()`` method of the ``RemoteParameter`` class
+or its children. The given value can also be coerced/adapted if necessary.



diff --git a/doc/source/howto/properties/index.rst b/doc/source/howto/properties/index.rst new file mode 100644 index 0000000..e8db201 --- /dev/null +++ b/doc/source/howto/properties/index.rst @@ -0,0 +1,103 @@ +Properties In-Depth
+===================

+Properties expose Python attributes to clients & support custom get-set(-delete) functions.
+``hololinked`` uses ``param`` under the hood to implement properties.

+.. toctree::
   :hidden:
   :maxdepth: 1

   arguments
   extending

+Untyped Property
+-----------------

+To make a property accept any value, use the base class ``Property``:

+.. literalinclude:: ../code/properties/untyped.py
   :language: python
   :linenos:
   :lines: 1-11

The descriptor object (the instance of ``Property``) that performs the get-set operations & the auto-allocation
of an internal instance variable for the property can be accessed by the instance under
``self.properties.descriptors[""]``. Naturally, the value of the property must
be serializable to be read by clients. Read the serializer section for further details & customization.

Custom Typed
------------

To supply custom get & set methods, so that an internal instance variable is not created automatically,
use the getter & setter decorators or pass a method to the fget & fset arguments of the property:

.. literalinclude:: ../code/properties/untyped.py
   :language: python
   :linenos:
   :lines: 1-30


Typed Properties
----------------

Certain typed properties, defined by ``param``, are already available in ``hololinked.server.properties``.

.. list-table::

   * - type
     - property class
     - options
   * - str
     - ``String``
     - comply to a regex
   * - integer
     - ``Integer``
     - min & max bounds, inclusive bounds, crop to bounds, multiples
   * - float, integer
     - ``Number``
     - min & max bounds, inclusive bounds, multiples
   * - bool
     - ``Boolean``
     -
   * - iterables
     - ``Iterable``
     - length/bounds, item_type, dtype (allowed type of iterable, like list or tuple)
   * - tuple
     - ``Tuple``
     - same as iterable
   * - list
     - ``List``
     - same as iterable
   * - one of many objects
     - ``Selector``
     - allowed list of objects
   * - one or more of many objects
     - ``TupleSelector``
     - allowed list of objects
   * - class, subclass or instance of an object
     - ``ClassSelector``
     - instance only or class only
   * - path, filename & folder names
     - ``Path``, ``Filename``, ``Foldername``
     -
   * - datetime
     - ``Date``
     - format
   * - typed list
     - ``TypedList``
     - typed appends, extends
   * - typed dictionary
     - ``TypedDict``, ``TypedKeyMappingsDict``
+      - typed updates, assignments

+As an example:

+.. literalinclude:: ../code/properties/typed.py
   :language: python
   :linenos:

+When providing a custom setter for typed properties, the value is internally validated before
being passed to the setter method. The return value of the getter method is never validated and
is left to the programmer's choice. \ No newline at end of file diff --git a/doc/source/howto/remote_object.rst b/doc/source/howto/remote_object.rst new file mode 100644 index 0000000..69f429d --- /dev/null +++ b/doc/source/howto/remote_object.rst @@ -0,0 +1,14 @@ +RemoteObject In-Depth
+=====================

+Change Protocols
+----------------

+``hololinked`` uses ZeroMQ under the hood to mediate messages between client and server.
+Any ``RemoteObject`` can be constrained to

+* only intra-process communication for single-process apps
+* inter-process communication for multi-process apps
+* network communication using TCP and/or HTTP

+simply by specifying the requirement as an argument. \ No newline at end of file diff --git a/doc/source/howto/serializers.rst b/doc/source/howto/serializers.rst new file mode 100644 index 0000000..0ddfd26 --- /dev/null +++ b/doc/source/howto/serializers.rst @@ -0,0 +1,2 @@ +Customizing Serializers
+======================= \ No newline at end of file diff --git a/doc/source/index.rst b/doc/source/index.rst index 08ae350..016757b 100644 --- a/doc/source/index.rst +++ b/doc/source/index.rst @@ -7,51 +7,52 @@

.. |module-highlighted| replace:: ``hololinked``

-Welcome to |module|'s documentation!
-====================================
+.. |base-class-highlighted| replace:: ``Thing``

-|module-highlighted| is (supposed to be) a versatile and pythonic tool for building custom control and data acquisition software systems.
If you have a requirement
-to capture data from your hardware/instruments remotely in your local network, control them, show the data in a browser/dashboard, provide a Qt-GUI or run automated scripts, |module-highlighted| can help. Even if you wish to do
-data-acquisition/control locally in a single computer, one can still separate the concerns of GUI & device or integrate with web-browser for a modern interface.
-|module-highlighted| was created & is being designed with the following features in mind:
+|module| - Pythonic Supervisory Control & Data Acquisition / Internet of Things
+===============================================================================
+
+|module-highlighted| is (supposed to be) a versatile and pythonic tool for building custom control and data acquisition
+software systems. If you have a requirement to capture data from your hardware/instrumentation remotely through your
+domain network, control it, show the data in a browser/dashboard, provide a Qt-GUI or run automated scripts,
+|module-highlighted| can help. Even if you wish to do data acquisition/control locally on a single computer, one can still
+separate the concerns of GUI & device, integrate with a web browser for a modern interface, or use modern
+web-development tools. |module-highlighted| is being developed with the following features in mind:

* being truly pythonic - all code in python & all features of python
+* reasonable integration with HTTP to take advantage of modern web practices
* easy to understand & setup
-* reasonable integration with HTTP to take advantage of browser based GUI frameworks (like ReactJS)
* agnostic to system size & flexibility in topology
-* 30FPS 1280*1080*3 image streaming over HTTP
-In short - to use it in your home/hobby, in a lab or in a big research facility & industry.
+In short - to use it in your home/hobby, in a lab or in a research facility & industry.
-
-|module-highlighted| is primarily object oriented & the building block is the ``RemoteObject`` class. Your instrument class (i.e. the python object that controls the hardware) should inherit this class. Each such
-class provides remote methods, remote attributes (also called remote parameters) & events which become accessible on the network through HTTP and/or TCP
-after implementation by the |module-highlighted| developer. Interprocess communication (ZMQ's IPC) is available for restriction to single-computer applications.
-Remote methods can be used to run control and measurement operations on your instruments or arbitrary python logic.
-Remote parameters are type-checked object attributes with getter-setter options (identical to python ``property`` with added network access).
-Events allow to asynchronously push arbitrary data to clients. Once such a ``RemoteObject`` is instantiated, it can be connected with the server of choice.
+|module-highlighted| is compatible with the `Web of Things `_ recommended pattern for developing
+hardware/instrumentation control software. Each device or thing can be controlled systematically when its design in
+software is segregated into properties, actions and events. |module-highlighted| is object-oriented, therefore:

-.. note::
-    web developers & software engineers, consider reading the :ref:`note ` section

+* properties are validated get-set attributes of the class, which may be used to model device settings, hold captured/computed data etc.
+* actions are methods which issue commands to the device or run arbitrary python logic.
+* events can asynchronously communicate/push data to a client, like alarm messages, streaming captured data etc.

+The base class which enables this classification is the ``Thing`` class. Any class that inherits the ``Thing`` class can
+instantiate properties, actions and events, which become visible to a client in this segregated manner.
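The property/action/event split can be pictured with a plain-Python analogy. The names below (``Event``, ``Camera``, ``capture_frame``) are hypothetical and illustrative only, not hololinked's API:

```python
class Event:
    """Minimal observer-style event: pushes data to every subscriber."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def push(self, data):
        for callback in self._subscribers:
            callback(data)


class Camera:
    """Hypothetical device sketch."""
    def __init__(self):
        self.exposure_time = 10.0      # 'property': a validated device setting
        self.frame_captured = Event()  # 'event': data pushed asynchronously to clients

    def capture_frame(self):           # 'action': a command issued to the device
        frame = [[0, 1], [1, 0]]       # stand-in for a real image
        self.frame_captured.push(frame)


camera = Camera()
camera.frame_captured.subscribe(lambda frame: print("got frame", frame))
camera.capture_frame()  # prints: got frame [[0, 1], [1, 0]]
```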
Please follow the documentation for examples & tutorials, how-to's and API reference.

+.. note::
+    web developers & software engineers, consider reading the :ref:`note ` section
+
.. toctree::
   :maxdepth: 1
+  :hidden:
   :caption: Contents:
+
   installation
-  examples/index
-  benchmark/index
+  How Tos
   autodoc/index
-  note
+  development_notes

-Indices and tables
-==================
+:ref:`genindex`

-* :ref:`genindex`
-* :ref:`modindex`
-* :ref:`search`
-
-.. note::
-    This project is under development and is an idealogical state. Please use it only for playtesting or exploring.
+last build : |today| UTC \ No newline at end of file diff --git a/doc/source/installation.rst b/doc/source/installation.rst index c24c01f..073f500 100644 --- a/doc/source/installation.rst +++ b/doc/source/installation.rst @@ -1,15 +1,19 @@ .. |module-highlighted| replace:: ``hololinked``

-Installation
-============
+Installation & Examples
+=======================

-As |module-highlighted| is still in idealogical & development state, it is recommended to clone it from github & install in develop mode.
+.. code:: shell
+
+   pip install hololinked
+
+One may also clone it from github & install directly (in develop mode).

.. code:: shell

   git clone https://github.com/VigneshVSV/hololinked.git

-One could setup a conda environment from the included ``hololinked.yml`` file
+Either install the dependencies from the requirements file or set up a conda environment from the included ``hololinked.yml`` file

.. code:: shell

@@ -30,7 +34,20 @@ Also check out:
   - repository containing example code discussed in this documentation
  * - hololinked-portal
    - https://github.com/VigneshVSV/hololinked-portal.git
-   - GUI to access RemoteObjects & Data Visualization helper
+   - GUI to access ``Thing`` objects and interact with their properties, actions and events

+To build & host the docs locally, run in the top directory:
+
+.. code:: shell
+
+   conda activate hololinked
+   cd doc
+   make clean
+   make html
+   python -m http.server --directory build\html
+
+To open the docs in the default browser, one can also issue the following instead of starting a Python server:
+
+.. code:: shell
+
+   make host-doc \ No newline at end of file diff --git a/doc/source/note.rst b/doc/source/note.rst deleted file mode 100644 index 7727011..0000000 --- a/doc/source/note.rst +++ /dev/null @@ -1,48 +0,0 @@ -.. |module-highlighted| replace:: ``hololinked``

-.. _note:

-design note
-===========

-In the interest of information to software engineers and web developers, the main difference of the above to a conventional RPC or REST(-like) paradigm in HTTP is that,
-|module-highlighted| attempts to be a hybrid of both. For instrument control & data-acquisition, it is difficult to move away completely from RPC to REST. Besides, most instrument drivers/hardware
-allow only a single persistent connection instead of multiple clients or computers. Further, when such a client process talks to an instrument, only one instruction can be sent at a time.
-On the other hand, HTTP Servers are multi-threaded or asyncio oriented by design and REST(-like) API honestly does not seem to care how many simultaneous operations are run.
-To reconcile both, the following is proposed:

-* |module-highlighted| gives the freedom to choose the HTTP request method & end-point URL desirable for each method, parameter and event
-* All HTTP requests will be queued and executed serially unless threaded or made async manually by the programmer
-* Verb like URLs may be used for methods & noun-like URLs are suggested to be used for parameters and events
-* HTTP request method may be mapped as follows:

-.. list-table::
-   :header-rows: 1

-   * - HTTP request verb/method
-     - remote parameter
-     - remote method
-     - event
-   * - GET
-     - read parameter value
-     - run method which gives a return value with useful data (which may be difficult or illogical as a `parameter`)
-     - stream data (for example - measured physical quantities)
-   * - POST
-     - add dynamic parameters with certain settings
-     - run python logic, methods that connect/disconnect or issue commands to instruments (RPC)
-     - not applicable
-   * - PUT
-     - write parameter value
-     - change value of a resource which is difficult to factor into a parameter
-     - not applicable
-   * - DELETE
-     - remove a dynamic parameter
-     - developer's interpretation
-     - not applicable
-   * - PATCH
-     - change settings of a parameter
-     - change partial value of a resource which is difficult to factor into a parameter or change settings of a parameter with custom logic
-     - not applicable
-
-Despite the above, an object proxy can also directly access the methods, parameters and events without the details of HTTP.
- diff --git a/doc/source/requirements.txt b/doc/source/requirements.txt index 6f60f02..4cd43e3 100644 --- a/doc/source/requirements.txt +++ b/doc/source/requirements.txt @@ -1,4 +1,4 @@ -sphinx==7.2.6 +sphinx==7.3.7 sphinx-copybutton==0.5.2 sphinxcontrib-applehelp==1.0.7 sphinxcontrib-devhelp==1.0.5 @@ -6,4 +6,19 @@ sphinxcontrib-htmlhelp==2.0.4 sphinxcontrib-jsmath==1.0.1 sphinxcontrib-qthelp==1.0.6 sphinxcontrib-serializinghtml==1.1.9 -pydata-sphinx-theme==0.14.3 \ No newline at end of file +pydata-sphinx-theme==0.15.3 +numpydoc==1.6.0 +sphinx-toolbox==3.5.0 +argon2-cffi==23.1.0 +ConfigParser==6.0.0 +ifaddr==0.2.0 +ipython==8.21.0 +numpy==1.26.4 +pandas==2.2.0 +pyzmq==25.1.0 +serpent==1.41 +setuptools==68.0.0 +SQLAlchemy==2.0.21 +SQLAlchemy_Utils==0.41.1 +tornado==6.3.3 +msgspec==0.18.6 \ No newline at end of file diff --git a/hololinked.yml b/hololinked.yml index dc54951..26df6cb 100644 --- a/hololinked.yml +++ b/hololinked.yml @@ -1,5 +1,6 @@ name: hololinked channels: + - anaconda - conda-forge - defaults dependencies: @@ -13,13 +14,14 @@ dependencies: - beautifulsoup4=4.12.2=pyha770c72_0 - brotli-python=1.0.9=py311hd77b12b_7 - bzip2=1.0.8=he774522_0 - - ca-certificates=2023.7.22=h56e8100_0 - - certifi=2023.7.22=pyhd8ed1ab_0 + - ca-certificates=2023.08.22=haa95532_0 + - certifi=2023.11.17=py311haa95532_0 - cffi=1.15.1=py311h2bbff1b_3 - charset-normalizer=3.3.2=pyhd8ed1ab_0 - colorama=0.4.6=pyhd8ed1ab_0 - colour=0.1.5=py_0 - cryptography=41.0.3=py311h89fc84f_0 + - docopt=0.6.2=py_1 - docutils=0.20.1=py311h1ea47a8_2 - furl=2.1.3=pyhd8ed1ab_0 - greenlet=2.0.1=py311hd77b12b_0 @@ -31,6 +33,7 @@ dependencies: - intervals=0.9.2=pyhd8ed1ab_0 - jinja2=3.1.2=pyhd8ed1ab_1 - libffi=3.4.4=hd77b12b_0 + - libpq=12.15=h906ac69_0 - libsodium=1.0.18=h8d14728_1 - markupsafe=2.1.1=py311h2bbff1b_0 - openssl=3.0.12=h2bbff1b_0 @@ -40,6 +43,8 @@ dependencies: - pendulum=2.1.2=pyhd8ed1ab_1 - phonenumbers=8.13.23=pyhd8ed1ab_0 - pip=23.3=py311haa95532_0 + - pipreqs=0.4.13=pyhd8ed1ab_0 
+ - psycopg2=2.9.3=py311h2bbff1b_1 - pycparser=2.21=pyhd8ed1ab_0 - pydata-sphinx-theme=0.14.3=pyhd8ed1ab_0 - pygments=2.16.1=pyhd8ed1ab_0 @@ -49,6 +54,7 @@ dependencies: - python_abi=3.11=2_cp311 - pytz=2023.3.post1=pyhd8ed1ab_0 - pytzdata=2020.1=pyh9f0ad1d_0 + - pyyaml=6.0.1=py311h2bbff1b_0 - pyzmq=25.1.0=py311hd77b12b_0 - requests=2.31.0=pyhd8ed1ab_0 - serpent=1.41=pyhd8ed1ab_0 @@ -90,6 +96,33 @@ dependencies: - wheel=0.41.2=py311haa95532_0 - win_inet_pton=1.1.0=pyhd8ed1ab_6 - xz=5.4.2=h8cc25b3_0 + - yaml=0.2.5=he774522_0 + - yarg=0.1.9=py_1 - zeromq=4.3.4=h0e60522_1 - zipp=3.17.0=pyhd8ed1ab_0 - zlib=1.2.13=h8cc25b3_0 + - pip: + - apeye==1.4.1 + - apeye-core==1.1.4 + - autodocsumm==0.2.12 + - cachecontrol==0.13.1 + - cssutils==2.9.0 + - dict2css==0.3.0.post1 + - domdf-python-tools==3.8.0.post2 + - filelock==3.13.1 + - html5lib==1.1 + - msgpack==1.0.7 + - msgspec==0.18.6 + - natsort==8.4.0 + - numpydoc==1.6.0 + - platformdirs==4.1.0 + - ruamel-yaml==0.18.5 + - ruamel-yaml-clib==0.2.8 + - sphinx-autodoc-typehints==1.25.3 + - sphinx-jinja2-compat==0.2.0.post1 + - sphinx-prompt==1.8.0 + - sphinx-tabs==3.4.5 + - sphinx-toolbox==3.5.0 + - tabulate==0.9.0 + - webencodings==0.5.1 +prefix: C:\Users\vvign\.conda\envs\hololinked diff --git a/hololinked/__init__.py b/hololinked/__init__.py new file mode 100644 index 0000000..b3f4756 --- /dev/null +++ b/hololinked/__init__.py @@ -0,0 +1 @@ +__version__ = "0.1.2" diff --git a/hololinked/client/__init__.py b/hololinked/client/__init__.py new file mode 100644 index 0000000..7d26a6b --- /dev/null +++ b/hololinked/client/__init__.py @@ -0,0 +1 @@ +from .proxy import ObjectProxy \ No newline at end of file diff --git a/hololinked/client/proxy.py b/hololinked/client/proxy.py new file mode 100644 index 0000000..7e5af58 --- /dev/null +++ b/hololinked/client/proxy.py @@ -0,0 +1,832 @@ +import threading +import warnings +import typing +import logging +import uuid +import zmq + +from ..server.constants import JSON, CommonRPC, 
ServerMessage, ResourceTypes, ZMQ_PROTOCOLS +from ..server.serializers import BaseSerializer +from ..server.data_classes import RPCResource, ServerSentEvent +from ..server.zmq_message_brokers import AsyncZMQClient, SyncZMQClient, EventConsumer, PROXY + + + +class ObjectProxy: + """ + Procedural client for ``RemoteObject``. Once connected to a server, parameters, methods and events are + dynamically populated. Any of the ZMQ protocols of the server is supported. + + Parameters + ---------- + instance_name: str + instance name of the server to connect. + invokation_timeout: float, int + timeout to schedule a method call or parameter read/write in server. execution time wait is controlled by + ``execution_timeout``. When invokation timeout expires, the method is not executed. + execution_timeout: float, int + timeout to return without a reply after scheduling a method call or parameter read/write. This timer starts + ticking only after the method has started to execute. Returning a call before end of execution can lead to + change of state in the server. + load_remote_object: bool, default True + when True, remote object is located and its resources are loaded. Otherwise, only the client is initialised. + protocol: str + ZMQ protocol used to connect to server. Unlike the server, only one can be specified. + **kwargs: + async_mixin: bool, default False + whether to use both synchronous and asynchronous clients. + serializer: BaseSerializer + use a custom serializer, must be same as the serializer supplied to the server. + allow_foreign_attributes: bool, default False + allows local attributes for proxy apart from parameters fetched from the server. + logger: logging.Logger + logger instance + log_level: int + log level corresponding to logging.Logger when internally created + handshake_timeout: int + time in milliseconds to search & handshake server remote object. 
raises Timeout when expired + """ + + _own_attrs = frozenset([ + '__annotations__', + '_zmq_client', '_async_zmq_client', '_allow_foreign_attributes', + 'identity', 'instance_name', 'logger', 'execution_timeout', 'invokation_timeout', + '_execution_timeout', '_invokation_timeout', '_events', '_noblock_messages' + ]) + + def __init__(self, instance_name : str, protocol : str = ZMQ_PROTOCOLS.IPC, invokation_timeout : float = 5, + load_remote_object = True, **kwargs) -> None: + self._allow_foreign_attributes = kwargs.get('allow_foreign_attributes', False) + self.instance_name = instance_name + self.invokation_timeout = invokation_timeout + self.execution_timeout = kwargs.get("execution_timeout", None) + self.identity = f"{instance_name}|{uuid.uuid4()}" + self.logger = kwargs.pop('logger', logging.Logger(self.identity, level=kwargs.get('log_level', logging.INFO))) + self._noblock_messages = dict() + # compose ZMQ client in Proxy client so that all sending and receiving is + # done by the ZMQ client and not by the Proxy client directly. 
Proxy client only + # bothers mainly about __setattr__ and _getattr__ + self._async_zmq_client = None + self._zmq_client = SyncZMQClient(instance_name, self.identity, client_type=PROXY, protocol=protocol, + rpc_serializer=kwargs.get('serializer', None), handshake=load_remote_object, + logger=self.logger, **kwargs) + if kwargs.get("async_mixin", False): + self._async_zmq_client = AsyncZMQClient(instance_name, self.identity + '|async', client_type=PROXY, protocol=protocol, + rpc_serializer=kwargs.get('serializer', None), handshake=load_remote_object, + logger=self.logger, **kwargs) + if load_remote_object: + self.load_remote_object() + + def __getattribute__(self, __name: str) -> typing.Any: + obj = super().__getattribute__(__name) + if isinstance(obj, _RemoteParameter): + return obj.get() + return obj + + def __setattr__(self, __name : str, __value : typing.Any) -> None: + if (__name in ObjectProxy._own_attrs or (__name not in self.__dict__ and + isinstance(__value, __allowed_attribute_types__)) or self._allow_foreign_attributes): + # allowed attribute types are _RemoteParameter and _RemoteMethod defined after this class + return super(ObjectProxy, self).__setattr__(__name, __value) + elif __name in self.__dict__: + obj = self.__dict__[__name] + if isinstance(obj, _RemoteParameter): + obj.set(value=__value) + return + raise AttributeError(f"Cannot set attribute {__name} again to ObjectProxy for {self.instance_name}.") + raise AttributeError(f"Cannot set foreign attribute {__name} to ObjectProxy for {self.instance_name}. 
Given attribute not found in server object.") + + def __repr__(self) -> str: + return f'ObjectProxy {self.identity}' + + def __enter__(self): + return self + + def __exit__(self, exc_type, exc_value, traceback): + del self + + def __bool__(self) -> bool: + try: + self._zmq_client.handshake(num_of_tries=10) + return True + except RuntimeError: + return False + + def __eq__(self, other) -> bool: + if other is self: + return True + return (isinstance(other, ObjectProxy) and other.instance_name == self.instance_name and + other._zmq_client.protocol == self._zmq_client.protocol) + + def __ne__(self, other) -> bool: + if other and isinstance(other, ObjectProxy): + return (other.instance_name != self.instance_name or + other._zmq_client.protocol != self._zmq_client.protocol) + return True + + def __hash__(self) -> int: + return hash(self.identity) + + def get_invokation_timeout(self) -> typing.Union[float, int]: + return self._invokation_timeout + + def set_invokation_timeout(self, value : typing.Union[float, int]) -> None: + if not isinstance(value, (float, int, type(None))): + raise TypeError(f"Timeout can only be float or int greater than 0, or None. Given type {type(value)}.") + elif value is not None and value < 0: + raise ValueError("Timeout must be at least 0 or None, not negative.") + self._invokation_timeout = value + + invokation_timeout = property(fget=get_invokation_timeout, fset=set_invokation_timeout, + doc="Timeout in seconds on server side for invoking a method or read/write parameter. \ + Defaults to 5 seconds and network times not considered." + ) + + def get_execution_timeout(self) -> typing.Union[float, int]: + return self._execution_timeout + + def set_execution_timeout(self, value : typing.Union[float, int]) -> None: + if not isinstance(value, (float, int, type(None))): + raise TypeError(f"Timeout can only be float or int greater than 0, or None. 
Given type {type(value)}.") + elif value is not None and value < 0: + raise ValueError("Timeout must be at least 0 or None, not negative.") + self._execution_timeout = value + + execution_timeout = property(fget=get_execution_timeout, fset=set_execution_timeout, + doc="Timeout in seconds on server side for execution of method or read/write parameter." + + "Starts ticking after invokation timeout completes." + + "Defaults to None (i.e. waits indefinitely until return) and network times not considered." + ) + + + def invoke(self, method : str, oneway : bool = False, noblock : bool = False, + *args, **kwargs) -> typing.Any: + """ + call a method specified by name on the server with positional/keyword arguments + + Parameters + ---------- + method: str + name of the method + oneway: bool, default False + only send an instruction to invoke the method but do not fetch the reply. + noblock: bool, default False + request a method call but collect the reply later using a reply id + *args: Any + arguments for the method + **kwargs: Dict[str, Any] + keyword arguments for the method + + Returns + ------- + Any + return value of the method call or an id if noblock is True + + Raises + ------ + AttributeError: + if no method with specified name found on the server + Exception: + server raised exception are propagated + """ + method = getattr(self, method, None) # type: _RemoteMethod + if not isinstance(method, _RemoteMethod): + raise AttributeError(f"No remote method named {method}") + if oneway: + method.oneway(*args, **kwargs) + elif noblock: + msg_id = method.noblock(*args, **kwargs) + self._noblock_messages[msg_id] = method + return msg_id + else: + return method(*args, **kwargs) + + + async def async_invoke(self, method : str, *args, **kwargs) -> typing.Any: + """ + async(io) call a method specified by name on the server with positional/keyword + arguments. noblock and oneway not supported for async calls. 
+ + Parameters + ---------- + method: str + name of the method + *args: Any + arguments for the method + **kwargs: Dict[str, Any] + keyword arguments for the method + + Returns + ------- + Any + return value of the method call + + Raises + ------ + AttributeError: + if no method with specified name found on the server + RuntimeError: + if async_mixin was False at ``__init__()`` - no asynchronous client was created + Exception: + server raised exception are propagated + """ + method = getattr(self, method, None) # type: _RemoteMethod + if not isinstance(method, _RemoteMethod): + raise AttributeError(f"No remote method named {method}") + return await method.async_call(*args, **kwargs) + + + def get_parameter(self, name : str, noblock : bool = False) -> typing.Any: + """ + get parameter specified by name on server. + + Parameters + ---------- + name: str + name of the parameter + noblock: bool, default False + request the parameter get but collect the reply/value later using a reply id + + Raises + ------ + AttributeError: + if no method with specified name found on the server + Exception: + server raised exception are propagated + """ + parameter = self.__dict__.get(name, None) # type: _RemoteParameter + if not isinstance(parameter, _RemoteParameter): + raise AttributeError(f"No remote parameter named {parameter}") + if noblock: + msg_id = parameter.noblock_get() + self._noblock_messages[msg_id] = parameter + return msg_id + else: + return parameter.get() + + + def set_parameter(self, name : str, value : typing.Any, oneway : bool = False, + noblock : bool = False) -> None: + """ + set parameter specified by name on server with specified value. + + Parameters + ---------- + name: str + name of the parameter + value: Any + value of parameter to be set + oneway: bool, default False + only send an instruction to set the parameter but do not fetch the reply. 
+ (irrespective of whether set was successful or not) + noblock: bool, default False + request the set parameter but collect the reply later using a reply id + + Raises + ------ + AttributeError: + if no method with specified name found on the server + Exception: + server raised exception are propagated + """ + parameter = self.__dict__.get(name, None) # type: _RemoteParameter + if not isinstance(parameter, _RemoteParameter): + raise AttributeError(f"No remote parameter named {parameter}") + if oneway: + parameter.oneway_set(value) + elif noblock: + msg_id = parameter.noblock_set(value) + self._noblock_messages[msg_id] = parameter + return msg_id + else: + parameter.set(value) + + + async def async_get_parameter(self, name : str) -> None: + """ + async(io) get parameter specified by name on server. + + Parameters + ---------- + name: Any + name of the parameter to fetch + + Raises + ------ + AttributeError: + if no method with specified name found on the server + Exception: + server raised exception are propagated + """ + parameter = self.__dict__.get(name, None) # type: _RemoteParameter + if not isinstance(parameter, _RemoteParameter): + raise AttributeError(f"No remote parameter named {parameter}") + return await parameter.async_get() + + + async def async_set_parameter(self, name : str, value : typing.Any) -> None: + """ + async(io) set parameter specified by name on server with specified value. + noblock and oneway not supported for async calls. 
+ + Parameters + ---------- + name: str + name of the parameter + value: Any + value of parameter to be set + + Raises + ------ + AttributeError: + if no method with specified name found on the server + Exception: + server raised exception are propagated + """ + parameter = self.__dict__.get(name, None) # type: _RemoteParameter + if not isinstance(parameter, _RemoteParameter): + raise AttributeError(f"No remote parameter named {parameter}") + await parameter.async_set(value) + + + def get_parameters(self, names : typing.List[str], noblock : bool = False) -> typing.Any: + """ + get parameters specified by list of names. + + Parameters + ---------- + names: List[str] + names of parameters to be fetched + noblock: bool, default False + request the fetch but collect the reply later using a reply id + + Returns + ------- + Dict[str, Any]: + dictionary with names as keys and values corresponding to those keys + """ + method = getattr(self, '_get_parameters', None) # type: _RemoteMethod + if not method: + raise RuntimeError("Client did not load server resources correctly. Report issue at github.") + if noblock: + msg_id = method.noblock(names=names) + self._noblock_messages[msg_id] = method + return msg_id + else: + return method(names=names) + + + def set_parameters(self, values : typing.Dict[str, typing.Any], oneway : bool = False, + noblock : bool = False) -> None: + """ + set parameters whose name is specified by keys of a dictionary + + Parameters + ---------- + values: Dict[str, Any] + name and value of parameters to be set + oneway: bool, default False + only send an instruction to set the parameter but do not fetch the reply. 
+ (irrespective of whether set was successful or not) + noblock: bool, default False + request the set parameter but collect the reply later using a reply id + + Raises + ------ + AttributeError: + if no method with specified name found on the server + Exception: + server raised exception are propagated + """ + if not isinstance(values, dict): + raise ValueError("set_parameters values must be dictionary with parameter names as key") + method = getattr(self, '_set_parameters', None) # type: _RemoteMethod + if not method: + raise RuntimeError("Client did not load server resources correctly. Report issue at github.") + if oneway: + method.oneway(values=values) + elif noblock: + msg_id = method.noblock(values=values) + self._noblock_messages[msg_id] = method + return msg_id + else: + return method(values=values) + + + async def async_get_parameters(self, names) -> None: + """ + async(io) get parameters specified by list of names. no block gets are not supported for asyncio. + + Parameters + ---------- + names: List[str] + names of parameters to be fetched + + Returns + ------- + Dict[str, Any]: + dictionary with parameter names as keys and values corresponding to those keys + """ + method = getattr(self, '_get_parameters', None) # type: _RemoteMethod + if not method: + raise RuntimeError("Client did not load server resources correctly. Report issue at github.") + return await method.async_call(names=names) + + + async def async_set_parameters(self, **parameters) -> None: + """ + async(io) set parameters whose name is specified by keys of a dictionary + + Parameters + ---------- + values: Dict[str, Any] + name and value of parameters to be set + + Raises + ------ + AttributeError: + if no method with specified name found on the server + Exception: + server raised exception are propagated + """ + method = getattr(self, '_set_parameters', None) # type: _RemoteMethod + if not method: + raise RuntimeError("Client did not load server resources correctly. 
Report issue at github.") + await method.async_call(**parameters) + + + def subscribe_event(self, name : str, callbacks : typing.Union[typing.List[typing.Callable], typing.Callable], + thread_callbacks : bool = False) -> None: + """ + Subscribe to event specified by name. Events are listened in separate threads and supplied callbacks are + are also called in those threads. + + Parameters + ---------- + name: str + name of the event, either the object name used in the server or the name specified in the name argument of + the Event object + callbacks: Callable | List[Callable] + one or more callbacks that will be executed when this event is received + thread_callbacks: bool + thread the callbacks otherwise the callbacks will be executed serially + + Raises + ------ + AttributeError: + if no event with specified name is found + """ + event = getattr(self, name, None) # type: _Event + if not isinstance(event, _Event): + raise AttributeError(f"No event named {name}") + if event._subscribed: + event.add_callbacks(callbacks) + else: + event.subscribe(callbacks, thread_callbacks) + + + def unsubscribe_event(self, name : str): + """ + Unsubscribe to event specified by name. + + Parameters + ---------- + name: str + name of the event + callbacks: Callable | List[Callable] + one or more callbacks that will be executed when this event is received + thread_callbacks: bool + thread the callbacks otherwise the callbacks will be executed serially + + Raises + ------ + AttributeError: + if no event with specified name is found + """ + event = getattr(self, name, None) # type: _Event + if not isinstance(event, _Event): + raise AttributeError(f"No event named {name}") + event.unsubscribe() + + + def load_remote_object(self): + """ + Get exposed resources from server (methods, parameters, events) and remember them as attributes of the proxy. 
+ """ + fetch = _RemoteMethod(self._zmq_client, CommonRPC.rpc_resource_read(instance_name=self.instance_name), + invokation_timeout=self._invokation_timeout) # type: _RemoteMethod + reply = fetch() # type: typing.Dict[str, typing.Dict[str, typing.Any]] + + for name, data in reply.items(): + if isinstance(data, dict): + try: + if data["what"] == ResourceTypes.EVENT: + data = ServerSentEvent(**data) + else: + data = RPCResource(**data) + except Exception as ex: + ex.add_note("Did you correctly configure your serializer? " + + "This exception occurs when given serializer does not work the same way as server serializer") + raise ex from None + elif not isinstance(data, (RPCResource, ServerSentEvent)): + raise RuntimeError("Logic error - deserialized info about server not instance of hololinked.server.data_classes.RPCResource") + if data.what == ResourceTypes.CALLABLE: + _add_method(self, _RemoteMethod(self._zmq_client, data.instruction, self.invokation_timeout, + self.execution_timeout, data.argument_schema, self._async_zmq_client), data) + elif data.what == ResourceTypes.PARAMETER: + _add_parameter(self, _RemoteParameter(self._zmq_client, data.instruction, self.invokation_timeout, + self.execution_timeout, self._async_zmq_client), data) + elif data.what == ResourceTypes.EVENT: + assert isinstance(data, ServerSentEvent) + event = _Event(self._zmq_client, data.name, data.obj_name, data.unique_identifier, data.socket_address, + serializer=self._zmq_client.rpc_serializer, logger=self.logger) + _add_event(self, event, data) + self.__dict__[data.name] = event + + + def read_reply(self, message_id : bytes, timeout : typing.Optional[float] = 5000) -> typing.Any: + obj = self._noblock_messages.get(message_id, None) + if not obj: + raise ValueError('given message id not a one way call or invalid.') + reply = self._zmq_client._reply_cache.get(message_id, None) + if not reply: + reply = self._zmq_client.recv_reply(message_id=message_id, timeout=timeout, + 
raise_client_side_exception=True) + if not reply: + raise ReplyNotArrivedError(f"could not fetch reply within timeout for message id '{message_id}'") + if isinstance(obj, _RemoteMethod): + obj._last_return_value = reply + return obj.last_return_value # note the missing underscore + elif isinstance(obj, _RemoteParameter): + obj._last_value = reply + return obj.last_read_value + + +# SM = Server Message +SM_INDEX_ADDRESS = ServerMessage.ADDRESS.value +SM_INDEX_SERVER_TYPE = ServerMessage.SERVER_TYPE.value +SM_INDEX_MESSAGE_TYPE = ServerMessage.MESSAGE_TYPE.value +SM_INDEX_MESSAGE_ID = ServerMessage.MESSAGE_ID.value +SM_INDEX_DATA = ServerMessage.DATA.value +SM_INDEX_ENCODED_DATA = ServerMessage.ENCODED_DATA.value + +class _RemoteMethod: + + __slots__ = ['_zmq_client', '_async_zmq_client', '_instruction', '_invokation_timeout', '_execution_timeout', + '_schema', '_last_return_value', '__name__', '__qualname__', '__doc__'] + # method call abstraction + # Dont add doc otherwise __doc__ in slots will conflict with class variable + + def __init__(self, sync_client : SyncZMQClient, instruction : str, invokation_timeout : typing.Optional[float] = 5, + execution_timeout : typing.Optional[float] = None, argument_schema : typing.Optional[JSON] = None, + async_client : typing.Optional[AsyncZMQClient] = None) -> None: + """ + Parameters + ---------- + sync_client: SyncZMQClient + synchronous ZMQ client + async_zmq_client: AsyncZMQClient + asynchronous ZMQ client for async calls + instruction: str + The instruction needed to call the method + """ + self._zmq_client = sync_client + self._async_zmq_client = async_client + self._instruction = instruction + self._invokation_timeout = invokation_timeout + self._execution_timeout = execution_timeout + self._schema = argument_schema + + @property # i.e. 
cannot have setter + def last_return_value(self): + """ + cached return value of the last call to the method + """ + if len(self._last_return_value[SM_INDEX_ENCODED_DATA]) > 0: + return self._last_return_value[SM_INDEX_ENCODED_DATA] + return self._last_return_value[SM_INDEX_DATA] + + @property + def last_zmq_message(self) -> typing.List: + return self._last_return_value + + def __call__(self, *args, **kwargs) -> typing.Any: + """ + execute method on server + """ + if len(args) > 0: + kwargs["__args__"] = args + self._last_return_value = self._zmq_client.execute(instruction=self._instruction, arguments=kwargs, + invokation_timeout=self._invokation_timeout, execution_timeout=self._execution_timeout, + raise_client_side_exception=True, argument_schema=self._schema) + return self.last_return_value # note the missing underscore + + def oneway(self, *args, **kwargs) -> None: + """ + only issues the method call to the server and does not wait for reply, + neither does the server reply to this call. 
+ """ + if len(args) > 0: + kwargs["__args__"] = args + self._zmq_client.send_instruction(instruction=self._instruction, arguments=kwargs, + invokation_timeout=self._invokation_timeout, execution_timeout=None, + context=dict(oneway=True), argument_schema=self._schema) + + def noblock(self, *args, **kwargs) -> None: + if len(args) > 0: + kwargs["__args__"] = args + return self._zmq_client.send_instruction(instruction=self._instruction, arguments=kwargs, + invokation_timeout=self._invokation_timeout, execution_timeout=self._execution_timeout, + argument_schema=self._schema) + + async def async_call(self, *args, **kwargs): + """ + async execute method on server + """ + if not self._async_zmq_client: + raise RuntimeError("async calls not possible as async_mixin was not set at __init__()") + if len(args) > 0: + kwargs["__args__"] = args + self._last_return_value = await self._async_zmq_client.async_execute(instruction=self._instruction, + arguments=kwargs, invokation_timeout=self._invokation_timeout, raise_client_side_exception=True, + argument_schema=self._schema) + return self.last_return_value # note the missing underscore + + +class _RemoteParameter: + + __slots__ = ['_zmq_client', '_async_zmq_client', '_read_instruction', '_write_instruction', + '_invokation_timeout', '_execution_timeout', '_last_value', '__name__', '__doc__'] + # parameter get set abstraction + # Dont add doc otherwise __doc__ in slots will conflict with class variable + + def __init__(self, client : SyncZMQClient, instruction : str, invokation_timeout : typing.Optional[float] = 5, + execution_timeout : typing.Optional[float] = None, async_client : typing.Optional[AsyncZMQClient] = None) -> None: + self._zmq_client = client + self._async_zmq_client = async_client + self._invokation_timeout = invokation_timeout + self._execution_timeout = execution_timeout + self._read_instruction = instruction + '/read' + self._write_instruction = instruction + '/write' + + @property # i.e. 
cannot have setter + def last_read_value(self) -> typing.Any: + """ + cache of last read value + """ + if len(self._last_value[SM_INDEX_ENCODED_DATA]) > 0: + return self._last_value[SM_INDEX_ENCODED_DATA] + return self._last_value[SM_INDEX_DATA] + + @property + def last_zmq_message(self) -> typing.List: + """ + cache of last message received for this parameter + """ + return self._last_value + + def set(self, value : typing.Any) -> None: + self._last_value = self._zmq_client.execute(self._write_instruction, dict(value=value), + raise_client_side_exception=True) + + def get(self) -> typing.Any: + self._last_value = self._zmq_client.execute(self._read_instruction, + invokation_timeout=self._invokation_timeout, + raise_client_side_exception=True) + return self.last_read_value + + async def async_set(self, value : typing.Any) -> None: + if not self._async_zmq_client: + raise RuntimeError("async calls not possible as async_mixin was not set at __init__()") + self._last_value = await self._async_zmq_client.async_execute(self._write_instruction, dict(value=value), + invokation_timeout=self._invokation_timeout, + execution_timeout=self._execution_timeout, + raise_client_side_exception=True) + + async def async_get(self) -> typing.Any: + if not self._async_zmq_client: + raise RuntimeError("async calls not possible as async_mixin was not set at __init__()") + self._last_value = await self._async_zmq_client.async_execute(self._read_instruction, + invokation_timeout=self._invokation_timeout, + execution_timeout=self._execution_timeout, + raise_client_side_exception=True) + return self.last_read_value + + def noblock_get(self) -> None: + return self._zmq_client.send_instruction(self._read_instruction, + invokation_timeout=self._invokation_timeout, + execution_timeout=self._execution_timeout) + + def noblock_set(self, value : typing.Any) -> None: + return self._zmq_client.send_instruction(self._write_instruction, dict(value=value), + invokation_timeout=self._invokation_timeout, 
execution_timeout=self._execution_timeout)
+
+ def oneway_set(self, value : typing.Any) -> None:
+ self._zmq_client.send_instruction(self._write_instruction, dict(value=value),
+ invokation_timeout=self._invokation_timeout,
+ execution_timeout=self._execution_timeout)
+
+
+
+class _Event:
+
+ __slots__ = ['_zmq_client', '_name', '_obj_name', '_unique_identifier', '_socket_address', '_callbacks',
+ '_serializer', '_subscribed', '_thread', '_thread_callbacks', '_event_consumer', '_logger']
+ # event subscription
+ # Don't add class doc otherwise __doc__ in slots will conflict with class variable
+
+ def __init__(self, client : SyncZMQClient, name : str, obj_name : str, unique_identifier : str, socket : str,
+ serializer : typing.Optional[BaseSerializer] = None, logger : typing.Optional[logging.Logger] = None) -> None:
+ self._zmq_client = client
+ self._name = name
+ self._obj_name = obj_name
+ self._unique_identifier = unique_identifier
+ self._socket_address = socket
+ self._callbacks = None
+ self._serializer = serializer
+ self._logger = logger
+ self._subscribed = False # initialise the slot so subscribe_event can check this flag before first subscription
+
+ def add_callbacks(self, callbacks : typing.Union[typing.List[typing.Callable], typing.Callable]) -> None:
+ if not self._callbacks:
+ self._callbacks = []
+ if isinstance(callbacks, list):
+ self._callbacks.extend(callbacks)
+ else:
+ self._callbacks.append(callbacks)
+
+ def subscribe(self, callbacks : typing.Union[typing.List[typing.Callable], typing.Callable],
+ thread_callbacks : bool = False):
+ self._event_consumer = EventConsumer(self._unique_identifier, self._socket_address,
+ f"{self._name}|RPCEvent|{uuid.uuid4()}", b'PROXY',
+ rpc_serializer=self._serializer, logger=self._logger)
+ self.add_callbacks(callbacks)
+ self._subscribed = True
+ self._thread_callbacks = thread_callbacks
+ self._thread = threading.Thread(target=self.listen)
+ self._thread.start()
+
+ def listen(self):
+ while self._subscribed:
+ try:
+ data = self._event_consumer.receive()
+ if data == 'INTERRUPT':
+ break
+ for cb in self._callbacks:
+ if not self._thread_callbacks:
+ cb(data)
+ else:
+ 
threading.Thread(target=cb, args=(data,)).start()
+ except Exception as ex:
+ warnings.warn(f"Uncaught exception from {self._name} event - {str(ex)}",
+ category=RuntimeWarning)
+ try:
+ self._event_consumer.exit()
+ except Exception:
+ pass
+
+
+ def unsubscribe(self, join_thread : bool = True):
+ self._subscribed = False
+ self._event_consumer.interrupt()
+ if join_thread:
+ self._thread.join()
+
+
+
+
+
+__allowed_attribute_types__ = (_RemoteParameter, _RemoteMethod, _Event)
+__WRAPPER_ASSIGNMENTS__ = ('__name__', '__qualname__', '__doc__')
+
+def _add_method(client_obj : ObjectProxy, method : _RemoteMethod, func_info : RPCResource) -> None:
+ if not func_info.top_owner:
+ return
+ for dunder in __WRAPPER_ASSIGNMENTS__:
+ if dunder == '__qualname__':
+ info = '{}.{}'.format(client_obj.__class__.__name__, func_info.get_dunder_attr(dunder).split('.')[1])
+ else:
+ info = func_info.get_dunder_attr(dunder)
+ setattr(method, dunder, info)
+ client_obj.__setattr__(func_info.obj_name, method)
+
+def _add_parameter(client_obj : ObjectProxy, parameter : _RemoteParameter, parameter_info : RPCResource) -> None:
+ if not parameter_info.top_owner:
+ return
+ for attr in ['__doc__', '__name__']:
+ # just to imitate _add_method logic
+ setattr(parameter, attr, parameter_info.get_dunder_attr(attr))
+ client_obj.__setattr__(parameter_info.obj_name, parameter)
+
+def _add_event(client_obj : ObjectProxy, event : _Event, event_info : ServerSentEvent) -> None:
+ setattr(client_obj, event_info.obj_name, event)
+
+
+
+class ReplyNotArrivedError(Exception):
+ pass
+
+
+__all__ = ['ObjectProxy']
+
diff --git a/hololinked/param/__init__.py b/hololinked/param/__init__.py
index 1fff831..ea43c62 100644
--- a/hololinked/param/__init__.py
+++ b/hololinked/param/__init__.py
@@ -53,7 +53,7 @@
"""
from .
import exceptions from .parameterized import (Parameterized, ParameterizedFunction, ParamOverrides, Parameter, - depends_on, instance_descriptor, discard_events, edit_constant, ) + depends_on, instance_descriptor, discard_events, edit_constant) from .logger import get_logger, logging_level, VERBOSE diff --git a/hololinked/param/copy_parameters.py b/hololinked/param/copy_parameters.py new file mode 100644 index 0000000..ab83adf --- /dev/null +++ b/hololinked/param/copy_parameters.py @@ -0,0 +1,112 @@ + + + + + +def copy_parameters(src : str = 'D:/onedrive/desktop/dashboard/scada/scadapy/scadapy/param/parameters.py', + dst : str = 'D:/onedrive/desktop/dashboard/scada/scadapy/scadapy/server/remote_parameters.py') -> None: + + skip_classes = ['Infinity', 'resolve_path', 'normalize_path', 'BaseConstrainedList', 'TypeConstrainedList', + 'TypeConstrainedDict', 'TypedKeyMappingsConstrainedDict', 'Event'] + end_line = 'def hashable' + additional_imports = ['from ..param.parameters import (TypeConstrainedList, TypeConstrainedDict, abbreviate_paths,\n', + ' TypedKeyMappingsConstrainedDict, resolve_path, concrete_descendents, named_objs)\n' + 'from .remote_parameter import RemoteParameter\n', + 'from .constants import HTTP, PROXY, USE_OBJECT_NAME, GET, PUT'] + + def fetch_line() -> typing.Generator[str]: + with open(src, 'r') as file: + oldlines = file.readlines() + for line in oldlines: + yield line + + remote_init_kwargs = [ + '\t\t\tURL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT),\n', + '\t\t\tstate : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None,\n', + '\t\t\tdb_persist : bool = False, db_init : bool = False, db_commit : bool = False,\n' + '\t\t\taccess_type : str = (HTTP, PROXY),\n', + ] + + remote_super_init = [ + '\t\t\tURL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist,\n', + '\t\t\tdb_init=db_init, db_commit=db_commit, access_type=access_type)\n' + ] + + common_linegen = 
fetch_line() + newlines = [] + + def skip_to_init_doc(): + for line in common_linegen: + if 'doc : typing.Optional[str] = None' in line: + return line + else: + newlines.append(line) + + def skip_to_super_init_end(): + for line in common_linegen: + if 'precedence=precedence)' in line: + return line + else: + newlines.append(line) + + def is_function(line : str) -> bool: + if 'def ' in line and 'self' not in line and 'cls' not in line and 'obj' not in line: + return True + return False + + def next_line_after_skip_class_or_function() -> str: + for line_ in common_linegen: + if ('class ' in line_ and ':' in line_) or is_function(line_): + return line_ + + def process_current_line(line : str): + newline = line + if 'import ' in line and 'parameterized ' in line: + newlines_ = [line.replace('from .parameterized', 'from ..param.parameterized').replace('ParamOverrides,', ''), + next(common_linegen).replace('ParameterizedFunction, descendents,', ''), + *additional_imports] + newlines.extend(newlines_) + return + elif 'from collections import OrderedDict' in line: + newlines.append('from enum import Enum\n') + elif 'from .utils' in line or 'from .exceptions' in line: + newline = line.replace('from .', 'from ..param.') + elif 'class ' in line and ':' in line and line.startswith('class'): + if '(Parameter):' in line: + newline = line.replace('(Parameter):', '(RemoteParameter):') + newlines.append(newline) + else: + classname_with_inheritance = line.split(' ', 1)[1][:-2] # [:-2] for removing colon + classname_without_inheritance = classname_with_inheritance.split('(', 1)[0] + if classname_without_inheritance in skip_classes: + newline = next_line_after_skip_class_or_function() + process_current_line(newline) + return + else: + newlines.append(line) + newline = skip_to_init_doc() + newlines.append(newline) + newlines.extend(remote_init_kwargs) + newline = skip_to_super_init_end() + if newline: + newline = newline.replace('precedence=precedence)', 'precedence=precedence,') + 
newlines.append(newline) + newlines.extend(remote_super_init) + return + elif 'Parameter.__init__' in line: + newline = line.replace('Parameter.__init__', 'RemoteParameter.__init__') + elif is_function(line): + newline = next_line_after_skip_class_or_function() + process_current_line(newline) + return + newlines.append(newline) + + + for line in common_linegen: + process_current_line(line) + if end_line in line: + newlines.pop() + break + + with open(dst, 'w') as file: + file.writelines(newlines) \ No newline at end of file diff --git a/hololinked/param/parameterized.py b/hololinked/param/parameterized.py index 974a8ed..e8f83e0 100644 --- a/hololinked/param/parameterized.py +++ b/hololinked/param/parameterized.py @@ -204,7 +204,7 @@ class Foo(Bar): __slots__ = ['default', 'doc', 'constant', 'readonly', 'allow_None', 'per_instance_descriptor', 'deepcopy_default', 'class_member', 'precedence', - 'owner', 'name', '_internal_name', 'watchers', 'overloads', + 'owner', 'name', '_internal_name', 'watchers', 'fget', 'fset', 'fdel', '_disable_post_slot_set'] # Note: When initially created, a Parameter does not know which @@ -297,9 +297,10 @@ class hierarchy (see ParameterizedMetaclass). 
self.deepcopy_default = deepcopy_default self.class_member = class_member self.precedence = precedence - self.watchers : typing.Dict[str, typing.List] = {} - self.overloads : typing.Dict[str, typing.Union[typing.Callable, None]] = dict(fget=fget, - fset=fset, fdel=fdel) + self.watchers = {} # typing.Dict[str, typing.List] + self.fget = fget # type: typing.Union[typing.Callable, None] + self.fset = fset # type: typing.Union[typing.Callable, None] + self.fdel = fdel # type: typing.Union[typing.Callable, None] def __set_name__(self, owner : typing.Any, attrib_name : str) -> None: self._internal_name = f"_{attrib_name}_param_value" @@ -347,7 +348,7 @@ def _post_slot_set(self, slot : str, old : typing.Any, value : typing.Any) -> No """ # __parent_slots__ attribute is needed for entry into this function correctly otherwise # slot_attribute in __setattr__ will have wrong boolean flag - if slot == 'owner' and self.owner is not None: + if slot == 'owner' and self.owner is not None and self.fget is None: with disable_post_slot_set(self): self.default = self.validate_and_adapt(self.default) @@ -366,9 +367,8 @@ def __get__(self, obj : typing.Union['Parameterized', typing.Any], """ if obj is None: return self - fget = self.overloads['fget'] - if fget is not None: - return fget(obj) + if self.fget is not None: + return self.fget(obj) return obj.__dict__.get(self._internal_name, self.default) @instance_descriptor @@ -414,9 +414,8 @@ def __set__(self, obj : typing.Union['Parameterized', typing.Any], value : typin old = obj.__dict__.get(self._internal_name, self.default) # The following needs to be optimised, probably through lambda functions? 
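The hunk above moves the getter/setter overloads out of a dict and into dedicated `fget`/`fset` slots, with `__call__` registering a getter the way `property` does. A minimal self-contained sketch of that descriptor pattern (the `Slot`/`Device` names here are hypothetical illustrations, not the real `Parameter` API):

```python
import typing


class Slot:
    """Stripped-down descriptor sketch: fget/fset live in slots instead of
    an 'overloads' dict, mirroring the refactor in the hunk above."""
    __slots__ = ['default', 'fget', 'fset', '_internal_name']

    def __init__(self, default: typing.Any = None) -> None:
        self.default = default
        self.fget = None
        self.fset = None

    def __set_name__(self, owner: type, name: str) -> None:
        self._internal_name = f"_{name}_value"

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self            # accessed on the class, return the descriptor
        if self.fget is not None:
            return self.fget(obj)  # custom getter takes precedence
        return obj.__dict__.get(self._internal_name, self.default)

    def __set__(self, obj, value) -> None:
        if self.fset is not None:
            self.fset(obj, value)  # custom setter takes precedence
        else:
            obj.__dict__[self._internal_name] = value

    def getter(self, func):
        self.fget = func
        return func

    def setter(self, func):
        self.fset = func
        return func

    def __call__(self, func):
        # decorating with the descriptor itself registers the getter,
        # analogous to the __call__ added in the hunk above
        return self.getter(func)


class Device:
    voltage = Slot(default=0.0)

    @voltage.setter
    def _store_voltage(self, value) -> None:
        # coerce to float before storing, roughly what validate_and_adapt does
        self.__dict__['_voltage_value'] = float(value)


d = Device()
d.voltage = "5"      # routed through the registered setter
print(d.voltage)     # -> 5.0
```

Storing `fget`/`fset` directly in `__slots__` avoids a dict lookup on every attribute access, which matters for descriptors that sit on hot paths.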
- fset = self.overloads['fset'] - if fset is not None: - fset(obj, value) + if self.fset is not None: + self.fset(obj, value) else: obj.__dict__[self._internal_name] = value @@ -451,10 +450,9 @@ def __set__(self, obj : typing.Union['Parameterized', typing.Any], value : typin def validate_and_adapt(self, value : typing.Any) -> typing.Any: """ - modify the given value if a proper logical reasoning can be given. - returns modified value. Should not be mostly used unless the data stored is quite complex by structure. + Validate the given value and adapt it if a proper logical reasoning can be given, for example, cropping a number + to its bounds. Returns modified value. """ - # raise NotImplementedError("overload this function in child class to validate your value and adapt it if necessary.") return value def _post_value_set(self, obj : typing.Union['Parameterized', typing.Any], value : typing.Any) -> None: @@ -492,17 +490,29 @@ def __setstate__(self, state : typing.Dict[str, typing.Any]): setattr(self,k,v) def getter(self, func : typing.Callable) -> typing.Callable: - self.overloads['fget'] = func + """ + Register a getter method by using this as a decorator. + """ + self.fget = func return func def setter(self, func : typing.Callable) -> typing.Callable: - self.overloads['fset'] = func + """ + Register a setter method by using this as a decorator. Getters are mandatory if setter is defined. + """ + self.fset = func return func def deleter(self, func : typing.Callable) -> typing.Callable: - self.overloads['fdel'] = func + """ + Register a deleter method by using this as a decorator. Getters & Setters are mandatory if deleter is defined. 
+ """ + self.fdel = func return func + def __call__(self, func: typing.Callable) -> "Parameter": + return self.getter(func) + @classmethod def serialize(cls, value : typing.Any) -> typing.Any: "Given the parameter value, return a Python value suitable for serialization" @@ -1716,6 +1726,10 @@ def __init__(mcs, name : str, bases : typing.Tuple, dict_ : dict) -> None: mcs._update_docstring_signature(dict_.get('parameterized_docstring_signature', False)) def _create_param_container(mcs, mcs_members : dict): + """ + overridable in ``Parameterized`` child if a subclass of ``Parameters`` + was created by user. + """ mcs._param_container = ClassParameters(mcs, mcs_members) # value return when accessing cls/self.param @property diff --git a/hololinked/rpc/__init__.py b/hololinked/rpc/__init__.py new file mode 100644 index 0000000..b68297c --- /dev/null +++ b/hololinked/rpc/__init__.py @@ -0,0 +1,16 @@ +""" +namespace for pure RPC applications to auto suggest the user. The following were the original names +given until the WoT schema was being generated. The docs then got totally mixed up with two types +of names and therefore the WoT naming was taken as final. 
+""" +import typing +from enum import Enum +from ..server import Thing as RemoteObject, action +from ..server.constants import USE_OBJECT_NAME, HTTP_METHODS, JSON + + +def remote_method(URL_path : str = USE_OBJECT_NAME, http_method : str = HTTP_METHODS.POST, + state : typing.Optional[typing.Union[str, Enum]] = None, argument_schema : typing.Optional[JSON] = None, + return_value_schema : typing.Optional[JSON] = None) -> typing.Callable: + return action(URL_path=URL_path, http_method=http_method, state=state, argument_schema=argument_schema, + return_value_schema=return_value_schema) \ No newline at end of file diff --git a/hololinked/server/HTTPServer.py b/hololinked/server/HTTPServer.py index 45c6c06..c32f621 100644 --- a/hololinked/server/HTTPServer.py +++ b/hololinked/server/HTTPServer.py @@ -1,279 +1,311 @@ +import asyncio +import zmq +import zmq.asyncio import logging +import socket import ssl -import asyncio -from typing import Dict, List, Callable, Union, Any -from multiprocessing import Process +import typing +from tornado import ioloop from tornado.web import Application -from tornado.routing import Router from tornado.httpserver import HTTPServer as TornadoHTTP1Server +from tornado.httpclient import AsyncHTTPClient, HTTPRequest # from tornado_http2.server import Server as TornadoHTTP2Server -from tornado import ioloop -from tornado.httputil import HTTPServerRequest -from time import perf_counter from ..param import Parameterized -from ..param.parameters import Integer, IPAddress, ClassSelector, Selector, TypedList, Boolean, String - - -from .utils import create_default_logger -from .decorators import get, put, post, delete, remote_method -from .data_classes import HTTPServerResourceData +from ..param.parameters import (Integer, IPAddress, ClassSelector, Selector, TypedList, String) +from .constants import CommonRPC, HTTPServerTypes, ResourceTypes, ServerMessage +from .utils import get_IP_from_interface +from .data_classes import HTTPResource, 
ServerSentEvent +from .utils import get_default_logger, run_coro_sync from .serializers import JSONSerializer -from .constants import GET, PUT, POST, OPTIONS, DELETE, USE_OBJECT_NAME, CALLABLE -from .webserver_utils import log_resources, log_request, update_resources -from .zmq_message_brokers import MessageMappedZMQClientPool -from .handlers import (BaseRequestHandler, GetResource, PutResource, OptionsResource, - PostResource, DeleteResource, FileHandlerResource) -from .remote_object import RemoteObject, RemoteObjectDB -from .eventloop import Consumer -from .host_utilities import HTTPServerUtilities, SERVER_INSTANCE_NAME - - -asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy()) - - -class CustomRouter(Router): - - def __init__(self, app : Application, logger : logging.Logger, IP : str, - file_server_paths) -> None: - self.app = app - self.logger = logger - self.IP = IP - self.logger.info('started webserver at {}, ready to receive requests.'.format(self.IP)) - self.file_server_paths = file_server_paths - - def find_handler(self, request : HTTPServerRequest): - - start_time = perf_counter() - log_request(request, self.logger) - - if (request.method == GET): - for path in self.file_server_paths["STATIC_ROUTES"].keys(): - if request.path.startswith(path): - print("static handler") - return self.app.get_handler_delegate(request, FileHandlerResource, - target_kwargs=dict( - path=self.file_server_paths["STATIC_ROUTES"][path].directory - ), - path_args=[bytes(request.path.split('/')[-1], encoding='utf-8')]) - handler = GetResource - elif (request.method == POST): - handler = PostResource - elif (request.method == PUT): - handler = PutResource - elif (request.method == DELETE): - handler = DeleteResource - elif (request.method == OPTIONS): - handler = OptionsResource - else: - handler = OptionsResource - - return self.app.get_handler_delegate(request, handler, - target_kwargs=dict( - client_address='*', - start_time=start_time - )) - +from .database 
import ThingInformation +from .zmq_message_brokers import AsyncZMQClient, MessageMappedZMQClientPool +from .handlers import RPCHandler, BaseHandler, EventHandler, ThingsHandler -__http_methods__ = { - GET : get, - POST : post, - PUT : put, - DELETE : delete -} class HTTPServer(Parameterized): - - address = IPAddress( default = '0.0.0.0', - doc = "set custom IP address, default is localhost (0.0.0.0)" ) - port = Integer ( default = 8080, bounds = (0, 65535), - doc = "the port at which the server should be run (unique)" ) - protocol_version = Selector ( objects = [1.1, 2], default = 2, - doc = """for HTTP 2, SSL is mandatory. HTTP2 is recommended. - When no SSL configurations are provided, defaults to 1.1""" ) - logger = ClassSelector ( class_ = logging.Logger, default = None, allow_None = True, - doc = "Supply a custom logger here or set log_level parameter to a valid value" ) - log_level = Selector ( objects = [logging.DEBUG, logging.INFO, logging.ERROR, logging.CRITICAL, logging.ERROR], - default = logging.INFO, - doc = "Alternative to logger, this creates an internal logger with the specified log level" ) - consumers = TypedList ( item_type = (RemoteObject, Consumer, str), default = None, allow_None = True, - doc = "Remote Objects to be served by the HTTP server" ) - subscription = String ( default = None, allow_None = True, - doc = "Host Server to subscribe to coordinate starting sequence of remote objects & web GUI" ) - json_serializer = ClassSelector ( class_ = JSONSerializer, default = None, allow_None = True, - doc = "optionally, supply your own JSON serializer for custom types" ) - ssl_context = ClassSelector ( class_ = ssl.SSLContext , default = None, allow_None = True, - doc = "use it for highly customized SSL context to provide encrypted communication" ) - certfile = String ( default = None, allow_None = True, - doc = """alternative to SSL context, provide certificate file & key file to allow the server - to create a SSL connection on its own""" ) - 
keyfile = String ( default = None, allow_None = True, - doc = """alternative to SSL context, provide certificate file & key file to allow the server - to create a SSL connection on its own""" ) - server_network_interface = String ( default = 'Ethernet', - doc = """Currently there is no logic to detect the IP addresss (as externally visible) correctly, therefore - please send the network interface name to retrieve the IP. If a DNS server is present or , you may leave - this field""" ) - - def __init__(self, consumers = None, port = 8080, address = '0.0.0.0', subscription = None, logger = None, - log_level = logging.INFO, certfile = None, keyfile = None, json_serializer = None, ssl_context = None, - protocol_version = 2 ) -> None: + """ + HTTP(s) server to route requests to ``Thing``. + """ + + things = TypedList(item_type=str, default=None, allow_None=True, + doc="instance name of the things to be served by the HTTP server." ) # type: typing.List[str] + port = Integer(default=8080, bounds=(1, 65535), + doc="the port at which the server should be run" ) # type: int + address = IPAddress(default='0.0.0.0', + doc="IP address") # type: str + # protocol_version = Selector(objects=[1, 1.1, 2], default=2, + # doc="for HTTP 2, SSL is mandatory. HTTP2 is recommended. 
\ + # When no SSL configurations are provided, defaults to 1.1" ) # type: float + logger = ClassSelector(class_=logging.Logger, default=None, allow_None=True, + doc="logging.Logger" ) # type: logging.Logger + log_level = Selector(objects=[logging.DEBUG, logging.INFO, logging.ERROR, logging.CRITICAL, logging.ERROR], + default=logging.INFO, + doc="""alternative to logger, this creates an internal logger with the specified log level + along with a IO stream handler.""" ) # type: int + serializer = ClassSelector(class_=JSONSerializer, default=None, allow_None=True, + doc="""json serializer used by the server""" ) # type: JSONSerializer + ssl_context = ClassSelector(class_=ssl.SSLContext, default=None, allow_None=True, + doc="SSL context to provide encrypted communication") # type: typing.Optional[ssl.SSLContext] + certfile = String(default=None, allow_None=True, + doc="""alternative to SSL context, provide certificate file & key file to allow the server to + create a SSL context""") # type: str + keyfile = String(default=None, allow_None=True, + doc="""alternative to SSL context, provide certificate file & key file to allow the server to + create a SSL context""") # type: str + allowed_clients = TypedList(item_type=str, + doc="""Serves request and sets CORS only from these clients, other clients are rejected with 403. + Unlike pure CORS, the server resource is not even executed if the client is not + an allowed client. if None any client is served.""") + host = String(default=None, allow_None=True, + doc="Host Server to subscribe to coordinate starting sequence of remote objects & web GUI" ) # type: str + # network_interface = String(default='Ethernet', + # doc="Currently there is no logic to detect the IP addresss (as externally visible) correctly, \ + # therefore please send the network interface name to retrieve the IP. 
If a DNS server is present, \ + # you may leave this field" ) # type: str + request_handler = ClassSelector(default=RPCHandler, class_=RPCHandler, isinstance=False, + doc="custom web request handler of your choice for property read-write & action execution" ) # type: typing.Union[BaseHandler, RPCHandler] + event_handler = ClassSelector(default=EventHandler, class_=(EventHandler, BaseHandler), isinstance=False, + doc="custom event handler of your choice for handling events") # type: typing.Union[BaseHandler, EventHandler] + + def __init__(self, things : typing.List[str], *, port : int = 8080, address : str = '0.0.0.0', + host : typing.Optional[str] = None, logger : typing.Optional[logging.Logger] = None, log_level : int = logging.INFO, + serializer : typing.Optional[JSONSerializer] = None, ssl_context : typing.Optional[ssl.SSLContext] = None, + certfile : str = None, keyfile : str = None, # protocol_version : int = 1, network_interface : str = 'Ethernet', + allowed_clients : typing.Optional[typing.Union[str, typing.Iterable[str]]] = None, + **kwargs) -> None: + """ + Parameters + ---------- + things: List[str] + instance name of the things to be served as a list. + port: int, default 8080 + the port at which the server should be run + address: str, default 0.0.0.0 + IP address + logger: logging.Logger, optional + logging.Logger instance + log_level: int + alternative to logger, this creates an internal logger with the specified log level along with a IO stream handler. 
+ serializer: JSONSerializer, optional + json serializer used by the server + ssl_context: ssl.SSLContext + SSL context to provide encrypted communication + certfile: str + alternative to SSL context, provide certificate file & key file to allow the server to create a SSL context + keyfile: str + alternative to SSL context, provide certificate file & key file to allow the server to create a SSL context + allowed_clients: List[str] + serves request and sets CORS only from these clients, other clients are reject with 403. Unlike pure CORS + feature, the server resource is not even executed if the client is not an allowed client. + **kwargs: + rpc_handler: RPCHandler | BaseHandler, optional + custom web request handler of your choice for property read-write & action execution + event_handler: EventHandler | BaseHandler, optional + custom event handler of your choice for handling events + """ super().__init__( - consumers = consumers, - port = port, - address = address, - subscription = subscription, - logger = logger, - log_level = log_level, - json_serializer = json_serializer, - protocol_version = protocol_version, - certfile = certfile, - keyfile = keyfile, - ssl_context = ssl_context - ) - # functions that the server directly serves - self.server_process = None - self.resources = dict( - FILE_SERVER = dict(STATIC_ROUTES = dict(), DYNAMIC_ROUTES = dict()), - GET = dict(STATIC_ROUTES = dict(), DYNAMIC_ROUTES = dict()), - POST = dict(STATIC_ROUTES = dict(), DYNAMIC_ROUTES = dict()), - PUT = dict(STATIC_ROUTES = dict(), DYNAMIC_ROUTES = dict()), - DELETE = dict(STATIC_ROUTES = dict(), DYNAMIC_ROUTES = dict()), - OPTIONS = dict(STATIC_ROUTES = dict(), DYNAMIC_ROUTES = dict()) + things=things, + port=port, + address=address, + host=host, + logger=logger, + log_level=log_level, + serializer=serializer or JSONSerializer(), + # protocol_version=1, + certfile=certfile, + keyfile=keyfile, + ssl_context=ssl_context, + # network_interface='Ethernet',# network_interface, + 
request_handler=kwargs.get('request_handler', RPCHandler), + event_handler=kwargs.get('event_handler', EventHandler), + allowed_clients=allowed_clients if allowed_clients is not None else [] ) - + self._type = HTTPServerTypes.THING_SERVER + self._lost_things = dict() # see update_router_with_thing + # self._zmq_protocol = zmq_protocol + # self._zmq_socket_context = context + # self._zmq_event_context = context + @property def all_ok(self) -> bool: - IP = "{}:{}".format(self.address, self.port) + self._IP = f"{self.address}:{self.port}" if self.logger is None: - self.logger = create_default_logger('{}|{}'.format(self.__class__.__name__, IP), self.log_level) - UtilitiesConsumer = Consumer(HTTPServerUtilities, logger = self.logger, db_config_file = None, - zmq_client_pool = None, instance_name = SERVER_INSTANCE_NAME, - remote_object_info = None) - if self.consumers is None: - self.consumers = [UtilitiesConsumer] - else: - self.consumers.append(UtilitiesConsumer) + self.logger = get_default_logger('{}|{}'.format(self.__class__.__name__, + f"{self.address}:{self.port}"), + self.log_level) + + self.app = Application(handlers=[ + (r'/remote-objects', ThingsHandler, dict(request_handler=self.request_handler, + event_handler=self.event_handler)) + ]) + + self.zmq_client_pool = MessageMappedZMQClientPool(self.things, identity=self._IP, + deserialize_server_messages=False, handshake=False, + json_serializer=self.serializer, context=self._zmq_socket_context, + protocol=self._zmq_protocol) + + event_loop = asyncio.get_event_loop() + event_loop.call_soon(lambda : asyncio.create_task(self.update_router_with_things())) + event_loop.call_soon(lambda : asyncio.create_task(self.subscribe_to_host())) + event_loop.call_soon(lambda : asyncio.create_task(self.zmq_client_pool.poll()) ) + for client in self.zmq_client_pool: + event_loop.call_soon(lambda : asyncio.create_task(client._handshake(timeout=60000))) + + # if self.protocol_version == 2: + # raise NotImplementedError("Current HTTP2 
is not implemented.") + # self.tornado_instance = TornadoHTTP2Server(self.app, ssl_options=self.ssl_context) + # else: + self.tornado_instance = TornadoHTTP1Server(self.app, ssl_options=self.ssl_context) return True + - def start(self) -> None: - assert self.all_ok, 'HTTPServer all is not ok before starting' # Will always be True or cause some other exception - block : bool = True # currently block feature is not working with SSLContext due to pickling limitation of SSLContext - if block: - start_server(self.address, self.port, self.logger, self.subscription, self.consumers, self.resources, #type: ignore - self.ssl_context, self.json_serializer) - else: - self.server_process = Process(target = start_server, args = (self.address, self.port, self.logger, - self.subscription, self.consumers, self.resources, self.ssl_context, self.json_serializer)) - self.server_process.start() - - def stop(self) -> None: - if self.server_process: + async def subscribe_to_host(self): + if self.host is None: + return + client = AsyncHTTPClient() + for i in range(300): # try for five minutes try: - self.server_process.close() - except ValueError: - self.server_process.kill() - self.server_process = None + res = await client.fetch(HTTPRequest( + url=f"{self.host}/subscribers", + method='POST', + body=JSONSerializer.dumps(dict( + hostname=socket.gethostname(), + IPAddress=get_IP_from_interface(self.network_interface), + port=self.port, + type=self._type, + https=self.ssl_context is not None + )), + validate_cert=False, + headers={"content-type" : "application/json"} + )) + except Exception as ex: + self.logger.error(f"Could not subscribe to host {self.host}. error : {str(ex)}, error type : {type(ex)}.") + if i >= 299: + raise ex from None + else: + if res.code in [200, 201]: + self.logger.info(f"subsribed successfully to host {self.host}") + break + elif i >= 299: + raise RuntimeError(f"could not subsribe to host {self.host}. 
response {JSONSerializer.loads(res.body)}") + await asyncio.sleep(1) + # we lose the client anyway so we close it. if we decide to reuse the client, changes needed + client.close() - def http_method_decorator(self, http_method : str, URL_path = USE_OBJECT_NAME): - def decorator(given_func): - func = remote_method(URL_path=decorator.URL_path, - http_method=decorator.http_method)(given_func) - self.resources[http_method]["STATIC_ROUTES"][decorator.URL_path] = HTTPServerResourceData ( - what=CALLABLE, - instance_name='', - instruction=func, - fullpath=decorator.URL_path, - ) - - return func - decorator.http_method = http_method - decorator.URL_path = URL_path - return decorator - def get(self, URL_path = USE_OBJECT_NAME): - return self.http_method_decorator(GET, URL_path) - - def post(self, URL_path = USE_OBJECT_NAME): - return self.http_method_decorator(POST, URL_path) - - def put(self, URL_path = USE_OBJECT_NAME): - return self.http_method_decorator(PUT, URL_path) + def listen(self) -> None: + """ + Start HTTP server. This method is blocking, async event loops intending to schedule the HTTP server should instead use + the inner tornado instance's (``HTTPServer.tornado_instance``) listen() method. 
+ """ + assert self.all_ok, 'HTTPServer all is not ok before starting' # Will always be True or cause some other exception + self.event_loop = ioloop.IOLoop.current() + self.tornado_instance.listen(port=self.port, address=self.address) + self.logger.info(f'started webserver at {self._IP}, ready to receive requests.') + self.event_loop.start() - def delete(self, URL_path = USE_OBJECT_NAME): - return self.http_method_decorator(DELETE, URL_path) - + def stop(self) -> None: + self.tornado_instance.stop() + run_coro_sync(self.tornado_instance.close_all_connections()) + self.event_loop.close() -def start_server(address : str, port : int, logger : logging.Logger, subscription : str, - consumers : List[Union[Consumer, RemoteObject, str]], resources : Dict[str, Any], ssl_context : ssl.SSLContext, - json_serializer : JSONSerializer) -> None: - """ - A separate function exists to start the server to be able to fork from current process - """ - event_loop = ioloop.IOLoop.current() - event_loop.run_sync(lambda : _setup_server(address, port, logger, subscription, consumers, resources, ssl_context, - json_serializer)) - # written as async function because zmq client is async, therefore run_sync for current use - if BaseRequestHandler.zmq_client_pool is not None: - event_loop.add_callback(BaseRequestHandler.zmq_client_pool.poll) - event_loop.start() + async def update_router_with_things(self) -> None: + """ + updates HTTP router with paths from ``Thing`` (s) + """ + await asyncio.gather(*[self.update_router_with_thing(client) for client in self.zmq_client_pool]) + + async def update_router_with_thing(self, client : AsyncZMQClient): + if client.instance_name in self._lost_things: + # Just to avoid duplication of this call as we proceed at single client level and not message mapped level + return + self._lost_things[client.instance_name] = client + self.logger.info(f"attempting to update router with remote object {client.instance_name}.") + while True: + try: + await 
client.handshake_complete() + resources = dict() # type: typing.Dict[str, HTTPResource] + reply = (await client.async_execute( + instruction=CommonRPC.http_resource_read(client.instance_name), + raise_client_side_exception=True + ))[ServerMessage.DATA] + resources.update(reply) -async def _setup_server(address : str, port : int, logger : logging.Logger, subscription : str, - consumers : List[Union[Consumer, RemoteObject, str]], resources : Dict[str, Dict[str, Any]], - ssl_context : ssl.SSLContext, json_serializer : JSONSerializer, version : float = 2) -> None: - IP = "{}:{}".format(address, port) - instance_names = [] - server_remote_objects = {} - remote_object_info = [] - if consumers is not None: - for consumer in consumers: - if isinstance(consumer, RemoteObject): - server_remote_objects[consumer.instance_name] = consumer - update_resources(resources, consumer.httpserver_resources) - remote_object_info.append(consumer.object_info) - elif isinstance(consumer, Consumer): - instance = consumer.consumer(*consumer.args, **consumer.kwargs) - server_remote_objects[instance.instance_name] = instance - update_resources(resources, instance.httpserver_resources) - remote_object_info.append(instance.object_info) - else: - instance_names.append(consumer) - - zmq_client_pool = MessageMappedZMQClientPool(instance_names, IP, json_serializer = json_serializer) - for client in zmq_client_pool: - await client.handshake_complete() - _, _, _, _, _, reply = await client.read_attribute('/'+client.server_instance_name + '/resources/http', raise_client_side_exception = True) - update_resources(resources, reply["returnValue"]) # type: ignore - _, _, _, _, _, reply = await client.read_attribute('/'+client.server_instance_name + '/object-info', raise_client_side_exception = True) - remote_object_info.append(RemoteObjectDB.RemoteObjectInfo(**reply["returnValue"])) # Should raise an exception if returnValue key is not found for some reason. 
- - for RO in server_remote_objects.values(): - if isinstance(RO, HTTPServerUtilities): - RO.zmq_client_pool = zmq_client_pool - RO.remote_object_info = remote_object_info - RO._httpserver_resources = resources - if subscription: - await RO.subscribe_to_host(subscription, port) - break - - BaseRequestHandler.zmq_client_pool = zmq_client_pool - BaseRequestHandler.json_serializer = zmq_client_pool.json_serializer - BaseRequestHandler.local_objects = server_remote_objects - GetResource.resources = resources.get(GET, dict()) - PostResource.resources = resources.get(POST, dict()) - PutResource.resources = resources.get(PUT, dict()) - DeleteResource.resources = resources.get(DELETE, dict()) - OptionsResource.resources = resources.get(OPTIONS, dict()) - # log_resources(logger, resources) - Router = CustomRouter(Application(), logger, IP, resources.get('FILE_SERVER')) - # if version == 2: - # S = TornadoHTTP2Server(Router, ssl_options=ssl_context) - # else: - S = TornadoHTTP1Server(Router, ssl_options=ssl_context) - S.listen(port = port, address = address) + handlers = [] + for instruction, http_resource in resources.items(): + if http_resource["what"] in [ResourceTypes.PROPERTY, ResourceTypes.ACTION] : + resource = HTTPResource(**http_resource) + handlers.append((resource.fullpath, self.request_handler, dict( + resource=resource, + owner=self + ))) + elif http_resource["what"] == ResourceTypes.EVENT: + resource = ServerSentEvent(**http_resource) + handlers.append((instruction, self.event_handler, dict( + resource=resource, + owner=self + ))) + """ + for handler based tornado rule matcher, the Rule object has following + signature + + def __init__( + self, + matcher: "Matcher", + target: Any, + target_kwargs: Optional[Dict[str, Any]] = None, + name: Optional[str] = None, + ) -> None: + matcher - based on route + target - handler + target_kwargs - given to handler's initialize + name - ... 
-__all__ = ['HTTPServer'] \ No newline at end of file + len == 2 tuple is route + handler + len == 3 tuple is route + handler + target kwargs + + so we give (path, RPCHandler, {'resource' : HTTPResource}) + + path is extracted from remote_method(URL_path='....') + RPCHandler is the base handler of this package for RPC purposes + resource goes into target kwargs as the HTTPResource generated by + remote_method and RemoteParameter, which contains all the info needed + to make RPCHandler work + """ + self.app.wildcard_router.add_rules(handlers) + self.logger.info(f"updated router with remote object {client.instance_name}.") + break + except Exception as ex: + self.logger.error(f"error while trying to update router with remote object - {str(ex)}. " + + "Trying again in 5 seconds") + await asyncio.sleep(5) + + try: + reply = (await client.async_execute( + instruction=CommonRPC.object_info_read(client.instance_name), + raise_client_side_exception=True + ))[ServerMessage.DATA] + object_info = ThingInformation(**reply) + object_info.http_server ="{}://{}:{}".format("https" if self.ssl_context is not None else "http", + socket.gethostname(), self.port) + + await client.async_execute( + instruction=CommonRPC.object_info_write(client.instance_name), + arguments=dict(value=object_info), + raise_client_side_exception=True + ) + except Exception as ex: + self.logger.error(f"error while trying to update remote object with HTTP server details - {str(ex)}. 
" + + "Trying again in 5 seconds") + self.zmq_client_pool.poller.register(client.socket, zmq.POLLIN) + self._lost_things.pop(client.instance_name) + + +__all__ = [ + HTTPServer.__name__ +] \ No newline at end of file diff --git a/hololinked/server/__init__.py b/hololinked/server/__init__.py index 414ba26..08e5508 100644 --- a/hololinked/server/__init__.py +++ b/hololinked/server/__init__.py @@ -1,15 +1,14 @@ -__version__ = "0.1" - # Order of import is reflected in this file to avoid circular imports -from .config import * +from .constants import * from .serializers import * +from .config import * from .zmq_message_brokers import * +from .events import * +from .action import * +from .property import * from .database import * -from .decorators import * -from .remote_parameter import * -from .remote_object import * +from .thing import * from .eventloop import * -from .proxy_client import * from .HTTPServer import * -from .host_utilities import * -from .host_server import * + + diff --git a/hololinked/server/action.py b/hololinked/server/action.py new file mode 100644 index 0000000..1e95069 --- /dev/null +++ b/hololinked/server/action.py @@ -0,0 +1,103 @@ +import typing +from enum import Enum +from types import FunctionType +from inspect import iscoroutinefunction, getfullargspec + +from .data_classes import RemoteResourceInfoValidator +from .constants import USE_OBJECT_NAME, UNSPECIFIED, HTTP_METHODS, JSON + + + +def action(URL_path : str = USE_OBJECT_NAME, http_method : str = HTTP_METHODS.POST, + state : typing.Optional[typing.Union[str, Enum]] = None, input_schema : typing.Optional[JSON] = None, + output_schema : typing.Optional[JSON] = None, create_task : bool = False) -> typing.Callable: + """ + Use this function as a decorate on your methods to make them accessible remotely. For WoT, an action affordance schema + for the method is generated. + + Parameters + ---------- + URL_path: str, optional + The path of URL under which the object is accessible. 
defaults to name of the object. + http_method: str, optional + HTTP method (GET, POST, PUT etc.). defaults to POST. + state: str | Tuple[str], optional + state machine state under which the object can executed. When not provided, + the action can be executed under any state. + input_schema: JSON + schema for arguments to validate them. + output_schema : JSON + schema for return value, currently only used to inform clients which is supposed to validate on its won. + + Returns + ------- + Callable + returns the callable object as it is + """ + + def inner(obj): + original = obj + if isinstance(obj, classmethod): + obj = obj.__func__ + if obj.__name__.startswith('__'): + raise ValueError(f"dunder objects cannot become remote : {obj.__name__}") + if callable(obj): + if hasattr(obj, '_remote_info') and not isinstance(obj._remote_info, RemoteResourceInfoValidator): + raise NameError( + "variable name '_remote_info' reserved for hololinked package. ", + "Please do not assign this variable to any other object except hololinked.server.data_classes.RemoteResourceInfoValidator." + ) + else: + obj._remote_info = RemoteResourceInfoValidator() + obj_name = obj.__qualname__.split('.') + if len(obj_name) > 1: # i.e. 
its a bound method, used by Thing + if URL_path == USE_OBJECT_NAME: + obj._remote_info.URL_path = f'/{obj_name[1]}' + else: + if not URL_path.startswith('/'): + raise ValueError(f"URL_path should start with '/', please add '/' before '{URL_path}'") + obj._remote_info.URL_path = URL_path + obj._remote_info.obj_name = obj_name[1] + elif len(obj_name) == 1 and isinstance(obj, FunctionType): # normal unbound function - used by HTTPServer instance + if URL_path is USE_OBJECT_NAME: + obj._remote_info.URL_path = '/{}'.format(obj_name[0]) + else: + if not URL_path.startswith('/'): + raise ValueError(f"URL_path should start with '/', please add '/' before '{URL_path}'") + obj._remote_info.URL_path = URL_path + obj._remote_info.obj_name = obj_name[0] + else: + raise RuntimeError(f"Undealt option for decorating {obj} or decorators wrongly used") + if http_method is not UNSPECIFIED: + if isinstance(http_method, str): + obj._remote_info.http_method = (http_method,) + else: + obj._remote_info.http_method = http_method + if state is not None: + if isinstance(state, (Enum, str)): + obj._remote_info.state = (state,) + else: + obj._remote_info.state = state + if 'request' in getfullargspec(obj).kwonlyargs: + obj._remote_info.request_as_argument = True + obj._remote_info.isaction = True + obj._remote_info.iscoroutine = iscoroutinefunction(obj) + obj._remote_info.argument_schema = input_schema + obj._remote_info.return_value_schema = output_schema + obj._remote_info.obj = original + return original + else: + raise TypeError( + "target for action or is not a function/method. 
" + + f"Given type {type(obj)}" + ) + + return inner + + + +__all__ = [ + action.__name__ +] + + diff --git a/hololinked/server/api_platform_utils.py b/hololinked/server/api_platform_utils.py index 707a692..9bc14db 100644 --- a/hololinked/server/api_platform_utils.py +++ b/hololinked/server/api_platform_utils.py @@ -1,7 +1,7 @@ from typing import Dict, List, Any, Union from dataclasses import dataclass, field, asdict -from .constants import POST +from .constants import HTTP_METHODS from .serializers import JSONSerializer @@ -21,9 +21,60 @@ def add_item(self, item : Union["postman_item", "postman_itemgroup"]) -> None: def json(self): return asdict(self) - def json_file(self, filename = 'collection.json'): + def save_json_file(self, filename = 'collection.json'): with open(filename, 'w') as file: - JSONSerializer.general_dump(self.json(), file) + JSONSerializer.generic_dump(self.json(), file) + + @classmethod + def build(cls, instance, domain_prefix : str) -> Dict[str, Any]: + from .thing import Thing + from .data_classes import HTTPResource, RemoteResource + assert isinstance(instance, Thing) # type definition + try: + return instance._postman_collection + except AttributeError: + pass + properties_folder = postman_itemgroup(name='properties') + methods_folder = postman_itemgroup(name='methods') + events_folder = postman_itemgroup(name='events') + + collection = postman_collection( + info = postman_collection_info( + name = instance.__class__.__name__, + description = "API endpoints available for Thing", + ), + item = [ + properties_folder, + methods_folder + ] + ) + + for http_method, resource in instance.httpserver_resources.items(): + # i.e. this information is generated only on the httpserver accessible resrouces... 
+ for URL_path, httpserver_data in resource.items(): + if isinstance(httpserver_data, HTTPResource): + scada_info : RemoteResource + try: + scada_info = instance.instance_resources[httpserver_data.instruction] + except KeyError: + property_path_without_RW = httpserver_data.instruction.rsplit('/', 1)[0] + scada_info = instance.instance_resources[property_path_without_RW] + item = postman_item( + name = scada_info.obj_name, + request = postman_http_request( + description=scada_info.obj.__doc__, + url=domain_prefix + URL_path, + method=http_method, + ) + ) + if scada_info.isproperty: + properties_folder.add_item(item) + elif scada_info.iscallable: + methods_folder.add_item(item) + + instance._postman_collection = collection + return collection + @dataclass class postman_collection_info: @@ -58,7 +109,7 @@ class postman_http_request: url : str header : Union[List[Dict[str, Any]], None] = field(default=None) body : Union[Dict[str, Any], None] = field(default=None) - method : str = field(default=POST) + method : str = field(default=HTTP_METHODS.POST) description : Union[str, None] = field(default=None) def json(self): diff --git a/hololinked/server/config.py b/hololinked/server/config.py index 1c3f964..31bd37d 100644 --- a/hololinked/server/config.py +++ b/hololinked/server/config.py @@ -1,5 +1,4 @@ # adapted from pyro - https://github.com/irmen/Pyro5 - see following license -# currently not used correctly because its not correctly integrated to the package """ MIT License @@ -23,88 +22,118 @@ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ - import tempfile -import os -import platform -from . import __version__ +import os +import typing +import warnings + +from .. import __version__ +from .serializers import PythonBuiltinJSONSerializer class Configuration: + """ + Allows to auto apply common settings used throughout the package, + instead of passing these settings as arguments. 
Import the ``global_config`` variable + instead of instantiating this class. + + Supports loading configuration from a JSON file whose path is specified + under the environment variable HOLOLINKED_CONFIG. + + Values are mutable at runtime and not type checked. Keys of the JSON file + must correspond to the supported value names. Supported values are - + + TEMP_DIR - system temporary directory to store temporary files like IPC sockets. + default - tempfile.gettempdir(). + + TCP_SOCKET_SEARCH_START_PORT - starting port number for automatic port searching + for TCP socket binding, used for event addresses. default 60000. + + TCP_SOCKET_SEARCH_END_PORT - ending port number for automatic port searching + for TCP socket binding, used for event addresses. default 65535. + + DB_CONFIG_FILE - file path for database configuration. default None. + + SYSTEM_HOST - system view server qualified IP or name. default None. + + PWD_HASHER_TIME_COST - system view server password authentication time cost, + default 15. Refer to argon2-cffi docs. + + PWD_HASHER_MEMORY_COST - system view server password authentication memory cost. + Refer to argon2-cffi docs. + + USE_UVLOOP - significantly faster event loop for Linux systems. Reads data from the network faster. default False. 
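The loading scheme described above — a JSON file, located via an environment variable, whose keys override attribute defaults at runtime — can be sketched roughly as follows. The variable name `MYPKG_CONFIG` and the `Config` class are illustrative stand-ins, not hololinked's actual names:

```python
# Hypothetical sketch: read a JSON config file whose path comes from an
# environment variable, and let its keys override attribute defaults.
import json
import os
import tempfile


class Config:
    def __init__(self) -> None:
        self.TCP_SOCKET_SEARCH_START_PORT = 60000  # built-in default
        path = os.environ.get("MYPKG_CONFIG")      # illustrative variable name
        if path:
            with open(path) as f:
                for key, value in json.load(f).items():
                    setattr(self, key, value)      # untyped, mutable overrides


# simulate an environment file pointing at a JSON override
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"TCP_SOCKET_SEARCH_START_PORT": 61000}, f)
os.environ["MYPKG_CONFIG"] = f.name

cfg = Config()
```

Since the overrides are applied with plain `setattr`, nothing constrains the types of the loaded values — which matches the "not type checked" caveat in the docstring.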
+ + Parameters + ---------- + use_environment: bool + load files from JSON file specified under environment + + """ __slots__ = [ - "APPDATA_DIR", "PRIMARY_HOST", "LOCALHOST_PORT" + # folders + "TEMP_DIR", + # TCP socket + "TCP_SOCKET_SEARCH_START_PORT", "TCP_SOCKET_SEARCH_END_PORT", + # system view + "PRIMARY_HOST", "LOCALHOST_PORT", + # database + "DB_CONFIG_FILE", + # HTTP server + "COOKIE_SECRET", + # credentials + "PWD_HASHER_TIME_COST", "PWD_HASHER_MEMORY_COST", + # Eventloop + "USE_UVLOOP" ] - def __init__(self): - self.reset_variables() + def __init__(self, use_environment : bool = False): + self.load_variables(use_environment) self.reset_actions() - - def reset_variables(self, use_environment : bool = True): + def load_variables(self, use_environment : bool = False): """ - Reset to default config items. - If use_environment is False, won't read environment variables settings (useful if you can't trust your env). + set default values & use the values from environment file. + Set use_environment to False to not use environment file. 
""" - self.APPDATA_DIR = tempfile.gettempdir() + "\\hololinked" - return - # qualname is not defined - if use_environment: - # environment variables overwrite config items - prefix = __qualname__.split('.')[0] - for item, envvalue in (e for e in os.environ.items() if e[0].startswith(prefix)): - item = item[len(prefix):] - if item not in self.__slots__: - raise ValueError(f"invalid environment config variable: {prefix}{item}") - value = getattr(self, item) - valuetype = type(value) - if valuetype is set: - envvalue = {v.strip() for v in envvalue.split(",")} - elif valuetype is list: - envvalue = [v.strip() for v in envvalue.split(",")] - elif valuetype is bool: - envvalue = envvalue.lower() - if envvalue in ("0", "off", "no", "false"): - envvalue = False - elif envvalue in ("1", "yes", "on", "true"): - envvalue = True - else: - raise ValueError(f"invalid boolean value: {prefix}{item}={envvalue}") - else: - try: - envvalue = valuetype(envvalue) - except ValueError: - raise ValueError(f"invalid environment config value: {prefix}{item}={envvalue}") from None - setattr(self, item, envvalue) + self.TEMP_DIR = f"{tempfile.gettempdir()}{os.sep}hololinked" + self.TCP_SOCKET_SEARCH_START_PORT = 60000 + self.TCP_SOCKET_SEARCH_END_PORT = 65535 + self.PWD_HASHER_TIME_COST = 15 + self.USE_UVLOOP = False + + if not use_environment: + return + # environment variables overwrite config items + file = os.environ.get("HOLOLINKED_CONFIG", None) + if not file: + warnings.warn("no environment file found although asked to load from one", UserWarning) + return + with open(file, "r") as file: + config = PythonBuiltinJSONSerializer.load(file) # type: typing.Dict + for item, value in config.items(): + setattr(self, item, value) def reset_actions(self): + "actions to be done to reset configurations (not actions to be called)" try: - os.mkdir(self.APPDATA_DIR) + os.mkdir(self.TEMP_DIR) except FileExistsError: pass def copy(self): - """returns a copy of this config""" + "returns a copy of this 
config as another object" other = object.__new__(Configuration) for item in self.__slots__: setattr(other, item, getattr(self, item)) return other def asdict(self): - """returns this config as a regular dictionary""" + "returns this config as a regular dictionary" return {item: getattr(self, item) for item in self.__slots__} - - def dump(self): - """Easy config diagnostics""" - return { - "version" : __version__, - "path" : os.path.dirname(__file__), - "python" : f"{platform.python_implementation()} {platform.python_version()} ({platform.system()}, {os.name})" , - "configuration" : self.asdict() - } - - - + + global_config = Configuration() diff --git a/hololinked/server/constants.py b/hololinked/server/constants.py index d3f0ed1..d38b538 100644 --- a/hololinked/server/constants.py +++ b/hololinked/server/constants.py @@ -1,10 +1,13 @@ +import zmq import logging -import functools import typing -from enum import Enum -from types import MethodType, FunctionType +from enum import StrEnum, IntEnum, Enum +# types +JSONSerializable = typing.Union[typing.Dict[str, typing.Any], list, str, int, float, None] +JSON = typing.Dict[str, JSONSerializable] + # decorator constants # naming USE_OBJECT_NAME : str = "USE_OBJECT_NAME" @@ -12,48 +15,185 @@ ANY_STATE : str = "ANY_STATE" UNSPECIFIED : str = "UNSPECIFIED" # types -FUNC = "FUNC" -ATTRIBUTE = "ATTRIBUTE" -IMAGE_STREAM = "IMAGE_STREAM" -CALLABLE = "CALLABLE" -FILE = "FILE" -# operation -READ = "read" -WRITE = "write" - -# logic -WRAPPER_ASSIGNMENTS = functools.WRAPPER_ASSIGNMENTS + ('__kwdefaults__', '__defaults__', ) -SERIALIZABLE_WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__qualname__', '__doc__', '__kwdefaults__', '__defaults__', ) -# regex logic -states_regex : str = '[A-Za-z_]+[A-Za-z_ 0-9]*' -url_regex : str = r'[\-a-zA-Z0-9@:%._\/\+~#=]{1,256}' -instance_name_regex : str = r'[A-Za-z]+[A-Za-z_0-9\-\/]*' - -# HTTP request methods -GET : str = 'GET' -POST : str = 'POST' -PUT : str = 'PUT' -DELETE : str = 'DELETE' 
-PATCH : str = 'PATCH'
-OPTIONS : str = 'OPTIONS'
-http_methods = [GET, PUT, POST, DELETE, PATCH]
-# HTTP Handler related
-EVENT : str = 'event'
-INSTRUCTION : str = 'INSTRUCTION'
+
+
+class ResourceTypes(StrEnum):
+    """
+    Exposed resource types. This repository does not rely on Python's internal getattr, setattr and type checking
+    to expose RPC operations, but rather segregates resource types and operates on them in predefined ways. This is to allow
+    addition or removal of resource types which may demand more integration with HTTP.
+    """
+    PROPERTY = "PROPERTY"
+    ACTION = "ACTION"
+    EVENT = "EVENT"
+    IMAGE_STREAM = "IMAGE_STREAM"
+    FILE = "FILE"
+
+
+class CommonRPC(StrEnum):
+    """some common RPC and their associated instructions for quick access by lower level code"""
+
+    RPC_RESOURCES = '/resources/object-proxy'
+    HTTP_RESOURCES = '/resources/http-server'
+    OBJECT_INFO = '/object-info'
+    PING = '/ping'
+
+    @classmethod
+    def rpc_resource_read(cls, instance_name : str) -> str:
+        return f"/{instance_name}{cls.RPC_RESOURCES}/read"
+
+    @classmethod
+    def http_resource_read(cls, instance_name : str) -> str:
+        return f"/{instance_name}{cls.HTTP_RESOURCES}/read"
+
+    @classmethod
+    def object_info_read(cls, instance_name : str) -> str:
+        return f"/{instance_name}{cls.OBJECT_INFO}/read"
+
+    @classmethod
+    def object_info_write(cls, instance_name : str) -> str:
+        return f"/{instance_name}{cls.OBJECT_INFO}/write"
+
+
+class REGEX(StrEnum):
+    """common regexes"""
+
+    states = '[A-Za-z_]+[A-Za-z_ 0-9]*'
+    url = r'[\-a-zA-Z0-9@:%._\/\+~#=]{1,256}'
+
+
+class HTTP_METHODS(StrEnum):
+    """currently supported HTTP request methods"""
+
+    GET = 'GET'
+    POST = 'POST'
+    PUT = 'PUT'
+    DELETE = 'DELETE'
+    PATCH = 'PATCH'
+    # OPTIONS = 'OPTIONS'
+
+http_methods = [member for member in HTTP_METHODS._member_map_]
+
 # Logging
-log_levels = dict(
-    DEBUG = logging.DEBUG,
-    INFO = logging.INFO,
-    CRITICAL = logging.CRITICAL,
-    ERROR = logging.ERROR,
-    WARN = logging.WARN,
+class
LOGLEVEL(IntEnum):
+    """``logging.Logger`` log levels"""
+    DEBUG = logging.DEBUG
+    INFO = logging.INFO
+    CRITICAL = logging.CRITICAL
+    ERROR = logging.ERROR
+    WARN = logging.WARN
     FATAL = logging.FATAL
-)
-# types
-CallableType = (FunctionType, MethodType)
-JSONSerializable = typing.Union[typing.Dict[str, typing.Any], list, str, int, float, None]
 # ZMQ
-ZMQ_PROTOCOLS = Enum('ZMQ_PROTOCOLS', 'TCP IPC')
\ No newline at end of file
+class ZMQ_PROTOCOLS(StrEnum):
+    """
+    supported ZMQ transport protocols - TCP, IPC, INPROC
+
+    TCP - needs socket address additionally specified like tcp://0.0.0.0:{port}
+    IPC - within python, multiprocess applications, can be autogenerated
+    INPROC - within python, multithreaded applications, can be autogenerated
+    """
+    TCP = "TCP"
+    IPC = "IPC"
+    INPROC = "INPROC"
+
+
+class ClientMessage(IntEnum):
+    """
+    indexing of ZMQ client-sent messages, for accessing message parts
+    by name instead of number
+    """
+    ADDRESS = 0
+    CLIENT_TYPE = 2
+    MESSAGE_TYPE = 3
+    MESSAGE_ID = 4
+    TIMEOUT = 5
+    INSTRUCTION = 6
+    ARGUMENTS = 7
+    EXECUTION_CONTEXT = 8
+
+
+class ServerMessage(IntEnum):
+    """
+    indexing of ZMQ server-sent messages, for accessing message parts
+    by name instead of number
+    """
+    ADDRESS = 0
+    SERVER_TYPE = 2
+    MESSAGE_TYPE = 3
+    MESSAGE_ID = 4
+    DATA = 5
+    ENCODED_DATA = 6
+
+
+class ServerTypes(Enum):
+    "type of ZMQ servers"
+
+    UNKNOWN_TYPE = b'UNKNOWN'
+    EVENTLOOP = b'EVENTLOOP'
+    THING = b'THING'
+    POOL = b'POOL'
+
+
+class ClientTypes(Enum):
+    "type of ZMQ clients"
+
+    HTTP_SERVER = b'HTTP_SERVER'
+    PROXY = b'PROXY'
+    TUNNELER = b'TUNNELER' # message passer from inproc client to inproc server within RPC
+
+
+class HTTPServerTypes(StrEnum):
+    "types of HTTP server"
+
+    SYSTEM_HOST = 'SYSTEM_HOST'
+    THING_SERVER = 'THING_SERVER'
+
+
+class Serializers(StrEnum):
+    """
+    allowed serializers
+
+    - PICKLE : pickle
+    - JSON : msgspec.json
+    - SERPENT : serpent
+    - MSGPACK : msgspec.msgpack
+    """
+    PICKLE = 'pickle'
+    JSON = 'json'
+
SERPENT = 'serpent'
+    MSGPACK = 'msgpack'
+
+
+class ZMQSocketType(IntEnum):
+    PAIR = zmq.PAIR
+    PUB = zmq.PUB
+    SUB = zmq.SUB
+    REQ = zmq.REQ
+    REP = zmq.REP
+    DEALER = zmq.DEALER
+    ROUTER = zmq.ROUTER
+    PULL = zmq.PULL
+    PUSH = zmq.PUSH
+    XPUB = zmq.XPUB
+    XSUB = zmq.XSUB
+    STREAM = zmq.STREAM
+    # Add more socket types as needed
+
+
+ZMQ_EVENT_MAP = {}
+for name in dir(zmq):
+    if name.startswith('EVENT_'):
+        value = getattr(zmq, name)
+        ZMQ_EVENT_MAP[value] = name
+
+
+
+
+__all__ = [
+    Serializers.__name__,
+    HTTP_METHODS.__name__,
+    ZMQ_PROTOCOLS.__name__
+]
\ No newline at end of file
diff --git a/hololinked/server/data_classes.py b/hololinked/server/data_classes.py
index 047b0f7..7e685f4 100644
--- a/hololinked/server/data_classes.py
+++ b/hololinked/server/data_classes.py
@@ -1,96 +1,181 @@
+"""
+The following is a list of all dataclasses used to store information on the exposed
+resources on the network. These classes are generally not for consumption by the package-end-user.
+"""
 import typing
 import platform
+import inspect
 from enum import Enum
 from dataclasses import dataclass, asdict, field, fields
+from types import FunctionType, MethodType
 
-from ..param.parameters import String, Boolean, Tuple, TupleSelector
-from .constants import (USE_OBJECT_NAME, POST, states_regex, url_regex, http_methods)
-from .path_converter import compile_path
-
-
-
-
-
-class ScadaInfoValidator:
-    """
-    A validator class for saving remote access related information, this is not for
-    direct usage by the package-end-user. Both callables (functions, methods and those with
-    __call__ ) and class/instance attributes store this information as their own attribute
-    under the name `scada_info`. The information (and the variable) may be deleted later (currently not)
-    from these objects under _prepare_instance in RemoteObject class.
-
-    Args:
-        URL_path (str): the path in the URL under which the object is accesible for remote-operations.
-            Must follow url-regex requirement.
If not specified, the name of object - has to be extracted and used. - http_method (str): HTTP method under which the object is accessible. Must be any of specified in - decorator methods. - state (str): State machine state at which the callable will be executed or attribute can be - written (does not apply to read-only attributes). - obj_name (str): the name of the object which will be supplied to the ProxyClient class to populate - its own namespace. For HTTP clients, HTTP method and URL is important and for Proxy clients - (based on ZMQ), the obj_name is important. - iscoroutine (bool): whether the callable should be executed with async requirements - is_method (bool): True when the callable is a function or method and not an arbitrary object with - __call__ method. This is required to decide how the callable is bound/unbound. - is_dunder_callable (bool): Not a function or method, but a callable. Same use case as the previous attribute. - Standard definition of callable is not used in the above two attributes - """ - URL_path = String(default=USE_OBJECT_NAME) #, regex=url_regex) - http_method = TupleSelector(default=POST, objects=http_methods, accept_list=True) - state = Tuple(default=None, item_type=(Enum, str), allow_None=True, accept_list=True, accept_item=True) - obj_name = String(default=USE_OBJECT_NAME) - iscoroutine = Boolean(default=False) - iscallable = Boolean(default=False) - isparameter = Boolean(default=False) - http_request_as_argument = Boolean(default=False) +from ..param.parameters import String, Boolean, Tuple, TupleSelector, TypedDict, ClassSelector, Parameter +from .constants import JSON, USE_OBJECT_NAME, UNSPECIFIED, HTTP_METHODS, REGEX, ResourceTypes, http_methods +from .utils import get_signature, getattr_without_descriptor_read + + + +class RemoteResourceInfoValidator: + """ + A validator class for saving remote access related information on a resource. 
Currently callables (functions, + methods and those with__call__) and class/instance property store this information as their own attribute under + the variable ``_remote_info``. This is later split into information suitable for HTTP server, RPC client & ``EventLoop``. + + Attributes + ---------- + + URL_path : str, default - extracted object name + the path in the URL under which the object is accesible. + Must follow url-regex ('[\-a-zA-Z0-9@:%._\/\+~#=]{1,256}') requirement. + If not specified, the name of object will be used. Underscores will be converted to dashes + for PEP 8 names. + http_method : str, default POST + HTTP request method under which the object is accessible. GET, POST, PUT, DELETE or PATCH are supported. + state : str, default None + State machine state at which a callable will be executed or attribute/property can be + written. Does not apply to read-only attributes/properties. + obj_name : str, default - extracted object name + the name of the object which will be supplied to the ``ObjectProxy`` class to populate + its own namespace. For HTTP clients, HTTP method and URL path is important and for + object proxies clients, the obj_name is important. + iscoroutine : bool, default False + whether the callable should be awaited + isaction : bool, default False + True for a method or function or callable + isproperty : bool, default False + True for a property + request_as_argument : bool, default False + if True, http/RPC request object will be passed as an argument to the callable. + The user is warned to not use this generally. + argument_schema: JSON, default None + JSON schema validations for arguments of a callable. Assumption is therefore arguments will be JSON complaint. 
+    return_value_schema: JSON, default None
+        schema for the return value of a callable
+    """
+
+    URL_path = String(default=USE_OBJECT_NAME,
+                    doc="the path in the URL under which the object is accessible.") # type: str
+    http_method = TupleSelector(default=HTTP_METHODS.POST, objects=http_methods, accept_list=True,
+                    doc="HTTP request method under which the object is accessible. GET, POST, PUT, DELETE or PATCH are supported.") # type: typing.Tuple[str]
+    state = Tuple(default=None, item_type=(Enum, str), allow_None=True, accept_list=True, accept_item=True,
+                    doc="State machine state at which a callable will be executed or attribute/property can be written.") # type: typing.Union[Enum, str]
+    obj = ClassSelector(default=None, allow_None=True, class_=(FunctionType, classmethod, Parameter, MethodType), # Property would need a circular import so we stick to Parameter
+                    doc="the unbound object like the unbound method")
+    obj_name = String(default=USE_OBJECT_NAME,
+                    doc="the name of the object which will be supplied to the ``ObjectProxy`` class to populate its own namespace.") # type: str
+    iscoroutine = Boolean(default=False,
+                    doc="whether the callable should be awaited") # type: bool
+    isaction = Boolean(default=False,
+                    doc="True for a method or function or callable") # type: bool
+    isproperty = Boolean(default=False,
+                    doc="True for a property") # type: bool
+    request_as_argument = Boolean(default=False,
+                    doc="if True, the http/RPC request object will be passed as an argument to the callable.") # type: bool
+    argument_schema = TypedDict(default=None, allow_None=True, key_type=str,
+                    doc="JSON schema validations for arguments of a callable")
+    return_value_schema = TypedDict(default=None, allow_None=True, key_type=str,
+                    doc="schema for the return value of a callable")
 
     def __init__(self, **kwargs) -> None:
+        """
+        No full-scale checks for unknown keyword arguments as the class
+        is used by the developer, so please try to be error-proof
+        """
+        if kwargs.get('URL_path', None) is not None:
+
if not isinstance(kwargs['URL_path'], str):
+                raise TypeError(f"URL path must be a string. Given type {type(kwargs['URL_path'])}")
+            if kwargs["URL_path"] != USE_OBJECT_NAME and not kwargs["URL_path"].startswith('/'):
+                raise ValueError(f"URL path must start with '/'. Given value {kwargs['URL_path']}")
         for key, value in kwargs.items():
             setattr(self, key, value)
-        # No full-scale checks for unknown keyword arguments as the class
-        # is used by the developer, so please try to be error-proof
     
-    def create_dataclass(self, obj : typing.Optional[typing.Any] = None,
-                            bound_obj : typing.Optional[typing.Any] = None) -> "ScadaInfoData":
+    def to_dataclass(self, obj : typing.Any = None, bound_obj : typing.Any = None) -> "RemoteResource":
         """
-        For a plain, faster and uncomplicated access, a dataclass in created
+        For a plain, faster and uncomplicated access, a dataclass is created & used by the
+        event loop.
+
+        Parameters
+        ----------
+        obj : Union[Property | Callable]
+            property or method/action
+
+        bound_obj : owner instance
+            ``Thing`` instance
+
+        Returns
+        -------
+        RemoteResource
+            dataclass equivalent of this object
         """
-        return ScadaInfoData(URL_path=self.URL_path, http_method=self.http_method,
+        return RemoteResource(
                         state=tuple(self.state) if self.state is not None else None,
-                        obj_name=self.obj_name, iscallable=self.iscallable, iscoroutine=self.iscoroutine,
-                        isparameter=self.isparameter, http_request_as_argument=self.http_request_as_argument,
-                        obj=obj, bound_obj=bound_obj)
+                        obj_name=self.obj_name, isaction=self.isaction, iscoroutine=self.iscoroutine,
+                        isproperty=self.isproperty, obj=obj, bound_obj=bound_obj
+                    )
         # http method is manually always stored as a tuple
 
 
-__use_slots_for_dataclass = False
-if float('.'.join(platform.python_version().split('.')[0:2])) > 3.10:
-    __use_slots_for_dataclass = True
+class SerializableDataclass:
+    """
+    Presents uniform serialization for serializers using ``__getstate__``, ``__setstate__`` and JSON
+    serialization.
+    """
+
+    def json(self):
+        return asdict(self)
+
+    def __getstate__(self):
+        return self.json()
+
+    def __setstate__(self, values : typing.Dict):
+        for key, value in values.items():
+            setattr(self, key, value)
+
+
+__dataclass_kwargs = dict(frozen=True)
+if float('.'.join(platform.python_version().split('.')[0:2])) >= 3.11:
+    __dataclass_kwargs["slots"] = True
 
-@dataclass(frozen=True, slots=__use_slots_for_dataclass)
-class ScadaInfoData:
+@dataclass(**__dataclass_kwargs)
+class RemoteResource(SerializableDataclass):
     """
-    This container class is created by the RemoteObject instance because descriptors (used by ScadaInfoValidator)
-    are generally slower. It is used by the eventloop methods while executing the remote object and is stored under
-    RemoteObject.instance_resources dictionary.
+    This container class is used by the ``EventLoop`` methods (for example ``execute_once()``) to access resource
+    metadata instead of directly using ``RemoteResourceInfoValidator``. Instances of this dataclass are stored under
+    the ``Thing.instance_resources`` dictionary for each property & method/action. Events use a similar dataclass with
+    metadata but with much less information.
+
+    Attributes
+    ----------
+    state : str
+        State machine state at which a callable will be executed or attribute/property can be
+        written. Does not apply to read-only attributes/properties.
+    obj_name : str, default - extracted object name
+        the name of the object which will be supplied to the ``ObjectProxy`` class to populate
+        its own namespace. For HTTP clients, the HTTP method and URL path are important and for
+        object proxy clients, the obj_name is important.
+ iscoroutine : bool + whether the callable should be awaited + isaction : bool + True for a method or function or callable + isproperty : bool + True for a property + obj : Union[Property | Callable] + property or method/action + bound_obj : owner instance + ``Thing`` instance """ - URL_path : str - http_method : str state : typing.Optional[typing.Union[typing.Tuple, str]] obj_name : str - iscallable : bool + isaction : bool iscoroutine : bool - isparameter : bool - http_request_as_argument : bool + isproperty : bool obj : typing.Any - bound_obj : typing.Any - + bound_obj : typing.Any + def json(self): """ - Serilization method to access the container for HTTP clients. Set use_json_method = True - in serializers.JSONSerializer instance and pass the object to the serializer directly. + return this object as a JSON serializable dictionary """ # try: # return self._json # accessing dynamic attr from frozen object @@ -99,124 +184,399 @@ def json(self): for field in fields(self): if field.name != 'obj' and field.name != 'bound_obj': json_dict[field.name] = getattr(self, field.name) - # object.__setattr__(self, '_json', json_dict) # because object is frozen + # object.__setattr__(self, '_json', json_dict) # because object is frozen - used to work, but not now return json_dict @dataclass -class HTTPServerResourceData: +class HTTPMethodInstructions(SerializableDataclass): """ - Used by HTTPServer instance to decide where to route which instruction + contains a map of unique strings that identifies the resource operation for each HTTP method, thus acting as + instructions to be passed to the RPC server. The unique strings are generally made using the URL_path. 
+    """
+    GET : typing.Optional[str] = field(default=None)
+    POST : typing.Optional[str] = field(default=None)
+    PUT : typing.Optional[str] = field(default=None)
+    DELETE : typing.Optional[str] = field(default=None)
+    PATCH : typing.Optional[str] = field(default=None)
+
+    def __post_init__(self):
+        self.supported_methods()
+
+    def supported_methods(self): # can be a property
+        try:
+            return self._supported_methods
+        except AttributeError:
+            self._supported_methods = []
+            for method in ["GET", "POST", "PUT", "DELETE", "PATCH"]:
+                if isinstance(self.__dict__[method], str):
+                    self._supported_methods.append(method)
+            return self._supported_methods
+
+    def __contains__(self, value):
+        return value in self._supported_methods
+
+
+@dataclass
+class HTTPResource(SerializableDataclass):
+    """
+    Representation of the resource used by the HTTP server for routing and passing information to the RPC server on
+    "what to do with which resource belonging to which thing? - read, write, execute?".
+
+    Attributes
+    ----------
 
-    'what' can be an 'ATTRIBUTE' or 'CALLABLE' (based on isparameter or iscallable) and 'instruction'
-    stores the instructions to be sent to the eventloop. 'instance_name' maps the instruction to a particular
-    instance of RemoteObject
+    what : str
+        is it a property, method/action or event?
+    instance_name : str
+        The ``instance_name`` of the thing which owns the resource. Used by the HTTP server to inform
+        the message brokers to send the instruction to the correct recipient thing.
+    fullpath : str
+        full URL path used for routing
+    instructions : HTTPMethodInstructions
+        unique strings that identify the resource operation for each HTTP method, generally made using the URL_path
+        (qualified URL path {instance name}/{URL path}).
+    argument_schema : JSON
+        argument schema of the method/action for validation before passing the instruction over to the RPC server.
+    request_as_argument: bool
+        pass the request as an argument to the callable.
For HTTP server ``tornado.web.HTTPServerRequest`` will be passed. """ what : str instance_name : str - instruction : str - http_request_as_argument : bool = field( default = False ) - path_format : typing.Optional[str] = field( default=None ) - path_regex : typing.Optional[typing.Pattern] = field( default = None ) - param_convertors : typing.Optional[typing.Dict] = field( default = None ) - - def __init__(self, *, what : str, instance_name : str, fullpath : str, instruction : str, - http_request_as_argument : bool = False) -> None: + obj_name : str + fullpath : str + instructions : HTTPMethodInstructions + argument_schema : typing.Optional[JSON] + return_value_schema : typing.Optional[JSON] + request_as_argument : bool = field(default=False) + + def __init__(self, *, what : str, instance_name : str, obj_name : str, fullpath : str, + request_as_argument : bool = False, argument_schema : typing.Optional[JSON] = None, + return_value_schema : typing.Optional[JSON] = None, **instructions) -> None: self.what = what self.instance_name = instance_name + self.obj_name = obj_name self.fullpath = fullpath - self.instruction = instruction - self.http_request_as_argument = http_request_as_argument - - def compile_path(self): - path_regex, self.path_format, param_convertors = compile_path(self.fullpath) - if self.path_format == self.fullpath and len(param_convertors) == 0: - self.path_regex = None - self.param_convertors = None - elif self.path_format != self.fullpath and len(param_convertors) == 0: - raise RuntimeError(f"Unknown path format found '{self.path_format}' for path '{self.fullpath}', no path converters were created.") - else: - self.path_regex = path_regex - self.param_convertors = param_convertors - - def json(self): - return { - "what" : self.what, - "instance_name" : self.instance_name, - 'fullpath' : self.fullpath, - "instruction" : self.instruction, - "http_request_as_argument" : self.http_request_as_argument - } - + self.request_as_argument = 
request_as_argument + self.argument_schema = argument_schema + self.return_value_schema = return_value_schema + if instructions.get('instructions', None): + self.instructions = HTTPMethodInstructions(**instructions.get('instructions', None)) + else: + self.instructions = HTTPMethodInstructions(**instructions) + @dataclass -class FileServerData: - what : str - directory : str - fullpath : str - path_format : typing.Optional[str] = field( default=None ) - path_regex : typing.Optional[typing.Pattern] = field( default = None ) - param_convertors : typing.Optional[typing.Dict] = field( default = None ) +class RPCResource(SerializableDataclass): + """ + Representation of resource used by RPC clients for mapping client method/action calls, property read/writes & events + to a server resource. Used to dynamically populate the ``ObjectProxy`` - def json(self): - return asdict(self) - - def compile_path(self): - path_regex, self.path_format, param_convertors = compile_path(self.fullpath) - if self.path_format == self.fullpath and len(param_convertors) == 0: - self.path_regex = None - self.param_convertors = None - elif self.path_format != self.fullpath and len(param_convertors) == 0: - raise RuntimeError(f"Unknown path format found '{self.path_format}' for path '{self.fullpath}', no path converters were created.") - else: - self.path_regex = path_regex - self.param_convertors = param_convertors - + Attributes + ---------- -@dataclass(frozen = True) -class HTTPServerEventData: - """ - Used by the HTTPServer instance to subscribe to events published by RemoteObject at a certain address. - The events are sent to the HTTP clients using server-sent-events. + what : str + is it a property, method/action or event? + instance_name : str + The ``instance_name`` of the thing which owns the resource. Used by RPC client to inform + message brokers to send the message to the correct recipient. 
+ instruction : str + unique string that identifies the resource, generally made using the URL_path. Although URL path is a HTTP + concept, it is still used as a unique identifier. + name : str + the name of the resource (__name__) + qualname : str + the qualified name of the resource (__qualname__) + doc : str + the docstring of the resource + argument_schema : JSON + argument schema of the method/action for validation before passing over the instruction to the RPC server. """ what : str - event_name : str - socket_address : str + instance_name : str + instruction : str + obj_name : str + qualname : str + doc : typing.Optional[str] + top_owner : bool + argument_schema : typing.Optional[JSON] + return_value_schema : typing.Optional[JSON] + request_as_argument : bool = field(default=False) - def json(self): - return asdict(self) - - + def __init__(self, *, what : str, instance_name : str, instruction : str, obj_name : str, + qualname : str, doc : str, top_owner : bool, argument_schema : typing.Optional[JSON] = None, + return_value_schema : typing.Optional[JSON] = None, request_as_argument : bool = False) -> None: + self.what = what + self.instance_name = instance_name + self.instruction = instruction + self.obj_name = obj_name + self.qualname = qualname + self.doc = doc + self.top_owner = top_owner + self.argument_schema = argument_schema + self.return_value_schema = return_value_schema + self.request_as_argument = request_as_argument + + def get_dunder_attr(self, __dunder_name : str): + name = __dunder_name.strip('_') + name = 'obj_name' if name == 'name' else name + return getattr(self, name) -@dataclass(frozen = True) -class ProxyResourceData: +@dataclass +class ServerSentEvent(SerializableDataclass): """ - Used by Proxy objects to fill attributes & methods in a proxy class. + event name and socket address of events to be consumed by clients. 
+
+    Attributes
+    ----------
+    name : str
+        name of the event, must be unique
+    obj_name : str
+        name of the event variable used to populate the RPC client
+    socket_address : str
+        address of the socket
+    unique_identifier : str
+        unique ZMQ identifier used in PUB-SUB model
+    what : str, default EVENT
+        is it a property, method/action or event?
     """
-    what : str
-    instruction : str
-    module : typing.Union[str, None]
     name : str
-    qualname : str
-    doc : typing.Union[str, None]
-    kwdefaults : typing.Any
-    defaults : typing.Any
+    obj_name : str
+    unique_identifier : str
+    socket_address : str = field(default=UNSPECIFIED)
+    what : str = field(default=ResourceTypes.EVENT)
-
-    def json(self):
-        return asdict(self)
-
 @dataclass
-class GUIResources:
+class GUIResources(SerializableDataclass):
+    """
+    Encapsulation of all information required to populate the hololinked-portal GUI for a thing.
+
+    Attributes
+    ----------
     instance_name : str
-    events : typing.Dict[str, typing.Any]
+        instance name of the ``Thing``
+    inheritance : List[str]
+        inheritance tree of the ``Thing``
+    classdoc : str
+        class docstring
+    properties : nested JSON (dictionary)
+        defined properties and their metadata
+    actions : nested JSON (dictionary)
+        defined actions
+    events : nested JSON (dictionary)
+        defined events
+    documentation : Dict[str, str]
+        documentation files, name as key and path as value
+    GUI : nested JSON (dictionary)
+        generated from ``hololinked.webdashboard.ReactApp``, a GUI can be shown under the 'default GUI' tab in the portal
+    """
+    instance_name : str
+    inheritance : typing.List[str]
     classdoc : typing.Optional[typing.List[str]]
-    inheritance : typing.List
-    GUI : typing.Optional[typing.Dict] = field( default=None )
-    methods : typing.Dict[str, typing.Any] = field( default_factory=dict )
-    parameters : typing.Dict[str, typing.Any] = field( default_factory=dict )
-    documentation : typing.Optional[typing.Dict[str, typing.Any]] = field( default=None )
-
-    def json(self):
-        return asdict(self)
\ No newline at end of file
+    properties : typing.Dict[str, typing.Any] = field(default_factory=dict)
+    actions : typing.Dict[str, typing.Any] = field(default_factory=dict)
+    events : typing.Dict[str, typing.Any] = field(default_factory=dict)
+    documentation : typing.Optional[typing.Dict[str, typing.Any]] = field(default=None)
+    GUI : typing.Optional[typing.Dict] = field(default=None)
+
+    def __init__(self):
+        """
+        initialize with no arguments first, then call ``build()`` to populate the fields.
+        """
+        super(SerializableDataclass, self).__init__()
+
+    def build(self, instance):
+        from .thing import Thing
+        assert isinstance(instance, Thing), f"got invalid type {type(instance)}"
+
+        self.instance_name = instance.instance_name
+        self.inheritance = [class_.__name__ for class_ in instance.__class__.mro()]
+        self.classdoc = instance.__class__.__doc__.splitlines() if instance.__class__.__doc__ is not None else None
+        self.GUI = instance.GUI
+        self.events = {
+            event._unique_identifier.decode() : dict(
+                name=event.name,
+                instruction=event._unique_identifier.decode(),
+                owner=event.owner.__class__.__name__,
+                owner_instance_name=event.owner.instance_name,
+                address=instance.event_publisher.socket_address
+            ) for event in instance.event_publisher.events
+        }
+        self.actions = dict()
+        self.properties = dict()
+
+        for instruction, remote_info in instance.instance_resources.items():
+            if remote_info.isaction:
+                try:
+                    self.actions[instruction] = instance.rpc_resources[instruction].json()
+                    self.actions[instruction]["remote_info"] = instance.httpserver_resources[instruction].json()
+                    self.actions[instruction]["remote_info"]["http_method"] = instance.httpserver_resources[instruction].instructions.supported_methods()
+                    # to check - apparently the recursive json() call does not reach the inner depths of a dict,
+                    # therefore we call json() ourselves
+                    self.actions[instruction]["owner"] = instance.rpc_resources[instruction].qualname.split('.')[0]
+                    self.actions[instruction]["owner_instance_name"] = remote_info.bound_obj.instance_name
+                    self.actions[instruction]["type"] = 'classmethod' if isinstance(remote_info.obj, classmethod) else ''
+                    self.actions[instruction]["signature"] = get_signature(remote_info.obj)[0]
+                except KeyError:
+                    pass
+            elif remote_info.isproperty:
+                path_without_RW = instruction.rsplit('/', 1)[0]
+                if path_without_RW not in self.properties:
+                    self.properties[path_without_RW] = instance.__class__.properties.webgui_info(remote_info.obj)[remote_info.obj.name]
+                    self.properties[path_without_RW]["remote_info"] = self.properties[path_without_RW]["remote_info"].json()
+                    # the instruction part has to be cleaned up to be used as a fullpath. Setting the full path back into
+                    # remote_info is not correct because the unbound method is used by multiple instances.
+                    self.properties[path_without_RW]["instruction"] = path_without_RW
+                    self.properties[path_without_RW]["remote_info"]["http_method"] = instance.httpserver_resources[path_without_RW].instructions.supported_methods()
+                    self.properties[path_without_RW]["owner_instance_name"] = remote_info.bound_obj.instance_name
+        return self
+
+
+
+def get_organised_resources(instance):
+    """
+    organise the exposed attributes, actions and events into the dataclasses defined above
+    so that the specific servers and event loop can use them.
+    """
+    from .thing import Thing
+    from .events import Event
+    from .property import Property
+
+    assert isinstance(instance, Thing), f"got invalid type {type(instance)}"
+
+    httpserver_resources = dict()  # type: typing.Dict[str, HTTPResource]
+    # The following dict will be given to the object proxy client
+    rpc_resources = dict()  # type: typing.Dict[str, RPCResource]
+    # The following dict will be used by the event loop
+    instance_resources = dict()  # type: typing.Dict[str, RemoteResource]
+    # create URL prefix
+    if instance._owner is not None:
+        instance._full_URL_path_prefix = f'{instance._owner._full_URL_path_prefix}/{instance.instance_name}'
+    else:
+        instance._full_URL_path_prefix = f'/{instance.instance_name}'  # leading '/' was stripped at init
+
+    # First add methods and callables
+    # properties
+    for prop in instance.parameters.descriptors.values():
+        if isinstance(prop, Property) and hasattr(prop, '_remote_info') and prop._remote_info is not None:
+            if not isinstance(prop._remote_info, RemoteResourceInfoValidator):
+                raise TypeError("instance member {} has unknown sub-member '_remote_info' of type {}.".format(
+                    prop, type(prop._remote_info)))
+            # above condition is just a guard in case somebody does some unpredictable patching activities
+            remote_info = prop._remote_info
+            fullpath = f"{instance._full_URL_path_prefix}{remote_info.URL_path}"
+            read_http_method, write_http_method = remote_info.http_method
+
+            httpserver_resources[fullpath] = HTTPResource(
+                what=ResourceTypes.PROPERTY,
+                instance_name=instance._owner.instance_name if instance._owner is not None else instance.instance_name,
+                obj_name=remote_info.obj_name,
+                fullpath=fullpath,
+                request_as_argument=False,
+                argument_schema=remote_info.argument_schema,
+                return_value_schema=remote_info.return_value_schema,
+                **{
+                    read_http_method : f"{fullpath}/read",
+                    write_http_method : f"{fullpath}/write"
+                }
+            )
+            rpc_resources[fullpath] = RPCResource(
+                what=ResourceTypes.PROPERTY,
+                instance_name=instance._owner.instance_name if instance._owner is not None else instance.instance_name,
+                instruction=fullpath,
+                doc=prop.__doc__,
+                obj_name=remote_info.obj_name,
+                qualname=instance.__class__.__name__ + '.' + remote_info.obj_name,
+                # qualname is probably not correct, does not respect inheritance
+                top_owner=instance._owner is None,
+                argument_schema=remote_info.argument_schema,
+                return_value_schema=remote_info.return_value_schema
+            )
+            data_cls = remote_info.to_dataclass(obj=prop, bound_obj=instance)
+            instance_resources[f"{fullpath}/read"] = data_cls
+            instance_resources[f"{fullpath}/write"] = data_cls
+            # instance_resources[f"{fullpath}/delete"] = data_cls
+            if prop.observable:
+                event_data_cls = ServerSentEvent(
+                    name=prop._observable_event.name,
+                    obj_name='_observable_event',  # not used in client, so fill it with something
+                    what=ResourceTypes.EVENT,
+                    unique_identifier=f"{instance._full_URL_path_prefix}{prop._observable_event.URL_path}",
+                )
+                prop._observable_event._remote_info = event_data_cls
+                # key the event under its own URL path, not under the property's fullpath,
+                # which would otherwise overwrite the property's HTTP resource
+                httpserver_resources[event_data_cls.unique_identifier] = event_data_cls
+    # Methods
+    for name, resource in inspect._getmembers(instance, inspect.ismethod, getattr_without_descriptor_read):
+        if hasattr(resource, '_remote_info'):
+            if not isinstance(resource._remote_info, RemoteResourceInfoValidator):
+                raise TypeError("instance member {} has unknown sub-member '_remote_info' of type {}.".format(
+                    resource, type(resource._remote_info)))
+            remote_info = resource._remote_info
+            # methods are already bound
+            assert remote_info.isaction, ("remote info from inspect.ismethod is not a callable",
+                "logic error - visit https://github.com/VigneshVSV/hololinked/issues to report")
+            if len(remote_info.http_method) > 1:
+                raise ValueError(f"methods support only one HTTP method at the moment. Given number of methods : {len(remote_info.http_method)}.")
+            fullpath = f"{instance._full_URL_path_prefix}{remote_info.URL_path}"
+            instruction = f"{fullpath}/{remote_info.http_method[0]}"
+            httpserver_resources[instruction] = HTTPResource(
+                what=ResourceTypes.ACTION,
+                instance_name=instance._owner.instance_name if instance._owner is not None else instance.instance_name,
+                obj_name=remote_info.obj_name,
+                fullpath=fullpath,
+                request_as_argument=remote_info.request_as_argument,
+                argument_schema=remote_info.argument_schema,
+                return_value_schema=remote_info.return_value_schema,
+                **{ remote_info.http_method[0] : instruction },
+            )
+            rpc_resources[instruction] = RPCResource(
+                what=ResourceTypes.ACTION,
+                instance_name=instance._owner.instance_name if instance._owner is not None else instance.instance_name,
+                instruction=instruction,
+                obj_name=getattr(resource, '__name__'),
+                qualname=getattr(resource, '__qualname__'),
+                doc=getattr(resource, '__doc__'),
+                top_owner=instance._owner is None,
+                argument_schema=remote_info.argument_schema,
+                return_value_schema=remote_info.return_value_schema,
+                request_as_argument=remote_info.request_as_argument
+            )
+            instance_resources[instruction] = remote_info.to_dataclass(obj=resource, bound_obj=instance)
+    # Events
+    for name, resource in inspect._getmembers(instance, lambda o : isinstance(o, Event), getattr_without_descriptor_read):
+        assert isinstance(resource, Event), ("thing event query from inspect.ismethod is not an Event",
+            "logic error - visit https://github.com/VigneshVSV/hololinked/issues to report")
+        # above assertion is only a typing convenience
+        resource._owner = instance
+        fullpath = f"{instance._full_URL_path_prefix}{resource.URL_path}"
+        resource._unique_identifier = bytes(fullpath, encoding='utf-8')
+        data_cls = ServerSentEvent(
+            name=resource.name,
+            obj_name=name,
+            what=ResourceTypes.EVENT,
+            unique_identifier=f"{instance._full_URL_path_prefix}{resource.URL_path}",
+        )
+        resource._remote_info = data_cls
+        httpserver_resources[fullpath] = data_cls
+        rpc_resources[fullpath] = data_cls
+    # Other objects
+    for name, resource in inspect._getmembers(instance, lambda o : isinstance(o, Thing), getattr_without_descriptor_read):
+        assert isinstance(resource, Thing), ("thing children query from inspect.ismethod is not a Thing",
+            "logic error - visit https://github.com/VigneshVSV/hololinked/issues to report")
+        # above assertion is only a typing convenience
+        if name == '_owner' or resource._owner is not None:
+            # second condition allows sharing of Things without adding them once again to the list of exposed resources,
+            # for example a shared logger
+            continue
+        resource._owner = instance
+        httpserver_resources.update(resource.httpserver_resources)
+        # rpc_resources.update(resource.rpc_resources)
+        instance_resources.update(resource.instance_resources)
+
+    # The above for-loops can be used only once, the division is only for readability
+    # following are in _internal_fixed_attributes - allowed to set only once
+    return rpc_resources, httpserver_resources, instance_resources
\ No newline at end of file
diff --git a/hololinked/server/database.py b/hololinked/server/database.py
index ceecd8f..1d863b9 100644
--- a/hololinked/server/database.py
+++ b/hololinked/server/database.py
@@ -1,49 +1,465 @@
+import os
+import threading
 import typing
+from sqlalchemy import create_engine, select, inspect as inspect_database
 from sqlalchemy.ext import asyncio as asyncio_ext
 from sqlalchemy.orm import sessionmaker
-from sqlalchemy import create_engine
+from sqlalchemy import Integer, String, JSON, LargeBinary
+from sqlalchemy.orm import Mapped, mapped_column, DeclarativeBase, MappedAsDataclass
+from sqlite3 import DatabaseError
+from dataclasses import dataclass
-from .serializers import JSONSerializer, BaseSerializer
+from ..param import Parameterized
+from .constants import JSONSerializable
+from .config import global_config
+from .utils import pep8_to_dashed_URL
+from .serializers import PythonBuiltinJSONSerializer as JSONSerializer, BaseSerializer
+from .property import Property
+
-def create_DB_URL(file_name : str, asynch : bool = False):
-    if file_name.endswith('.json'):
-        file = open(file_name, 'r')
-        conf = JSONSerializer.general_load(file)
-    else:
-        raise ValueError("config files of extension - {} expected, given file name {}".format(["json"], file_name))
+
+class ThingTableBase(DeclarativeBase):
+    """SQLAlchemy base table for all thing related tables"""
+    pass
-
-    host = conf.get("host", 'localhost')
-    port = conf.get("port", 5432)
-    user = conf.get('user', 'postgres')
-    password = conf.get('password', '')
+
+class SerializedProperty(MappedAsDataclass, ThingTableBase):
+    """
+    Property value is serialized before storing in database, therefore providing a unified version for
+    SQLite and other relational tables
+    """
+    __tablename__ = "properties"
+
+    instance_name : Mapped[str] = mapped_column(String)
+    name : Mapped[str] = mapped_column(String, primary_key=True)
+    serialized_value : Mapped[bytes] = mapped_column(LargeBinary)
+
+
+class ThingInformation(MappedAsDataclass, ThingTableBase):
+    __tablename__ = "things"
+
+    instance_name : Mapped[str] = mapped_column(String, primary_key=True)
+    class_name : Mapped[str] = mapped_column(String)
+    script : Mapped[str] = mapped_column(String)
+    kwargs : Mapped[JSONSerializable] = mapped_column(JSON)
+    eventloop_instance_name : Mapped[str] = mapped_column(String)
+    http_server : Mapped[str] = mapped_column(String)
+    level : Mapped[int] = mapped_column(Integer)
+    level_type : Mapped[str] = mapped_column(String)  # starting local to computer or global to system?
+
+    def json(self):
+        return {
+            "instance_name" : self.instance_name,
+            "class_name" : self.class_name,
+            "script" : self.script,
+            "kwargs" : self.kwargs,
+            "eventloop_instance_name" : self.eventloop_instance_name,
+            "http_server" : self.http_server,
+            "level" : self.level,
+            "level_type" : self.level_type
+        }
-
-    if asynch:
-        return f"postgresql+asyncpg://{user}:{password}@{host}:{port}"
-    else:
-        return f"postgresql://{user}:{password}@{host}:{port}"
-
+
+@dataclass
+class DeserializedProperty:  # not part of database
+    """
+    Property with deserialized value
+    """
+    instance_name : str
+    name : str
+    value : typing.Any
+
+
+
+class BaseDB:
+    """
+    Implements a configuration file reader common to both sync and async DB operations
+    """
+
+    def __init__(self, instance : Parameterized, serializer : typing.Optional[BaseSerializer] = None,
+                config_file : typing.Union[str, None] = None) -> None:
+        self.thing_instance = instance
+        self.instance_name = instance.instance_name
+        self.serializer = serializer
+        self.URL = self.create_URL(config_file)
+        self._batch_call_context = {}
+
+
+    @classmethod
+    def load_conf(cls, config_file : str) -> typing.Dict[str, typing.Any]:
+        """
+        load configuration file using JSON serializer
+        """
+        if not config_file:
+            conf = {}
+        elif config_file.endswith('.json'):
+            file = open(config_file, 'r')
+            conf = JSONSerializer.load(file)
+        else:
+            raise ValueError("config files of extension - {} expected, given file name {}".format(["json"], config_file))
+        return conf
+
+    def create_URL(self, config_file : str) -> str:
+        """
+        auto-chooses among the supported databases based on the config file and creates the DB URL
+        """
+        if config_file is None:
+            folder = f'{global_config.TEMP_DIR}{os.sep}databases{os.sep}{pep8_to_dashed_URL(self.thing_instance.__class__.__name__.lower())}'
+            if not os.path.exists(folder):
+                os.makedirs(folder)
+            return BaseDB.create_sqlite_URL(**dict(file=f'{folder}{os.sep}{self.instance_name}.db'))
+        conf = BaseDB.load_conf(config_file)
+        if conf.get('server', None):
+            return BaseDB.create_postgres_URL(conf=conf)
+        else:
+            # create_sqlite_URL accepts keyword arguments, so unpack the config dict
+            return BaseDB.create_sqlite_URL(**conf)
+
+    @classmethod
+    def create_postgres_URL(cls, conf : typing.Dict[str, JSONSerializable] = None, database : typing.Optional[str] = None,
+                    use_dialect : typing.Optional[bool] = False) -> str:
+        """
+        create a postgres URL
+        """
+        server = conf.get('server', None)
+        database = conf.get('database', database)
+        host = conf.get("host", 'localhost')
+        port = conf.get("port", 5432)
+        user = conf.get('user', 'postgres')
+        password = conf.get('password', '')
+        if use_dialect:
+            dialect = conf.get('dialect', None)
+            if dialect:
+                return f"{server}+{dialect}://{user}:{password}@{host}:{port}/{database}"
+        return f"{server}://{user}:{password}@{host}:{port}/{database}"
+
+    @classmethod
+    def create_sqlite_URL(cls, **conf : typing.Dict[str, JSONSerializable]) -> str:
+        """
+        create sqlite URL
+        """
+        in_memory = conf.get('in_memory', False)
+        dialect = conf.get('dialect', 'pysqlite')
+        if not in_memory:
+            file = conf.get('file', f'{global_config.TEMP_DIR}{os.sep}databases{os.sep}default.db')
+            return f"sqlite+{dialect}:///{file}"
+        else:
+            return f"sqlite+{dialect}:///:memory:"
+
+    @property
+    def in_batch_call_context(self):
+        return threading.get_ident() in self._batch_call_context
 
-class BaseAsyncDB:
+
+class BaseAsyncDB(BaseDB):
+    """
+    Base class for an async database engine; implements configuration file reader and
+    sqlalchemy engine & session creation. Set the ``async_db_engine`` boolean flag to True in the ``Thing`` class
+    to use this engine. Database operations are then scheduled in the event loop instead of blocking the current thread.
+    Scheduling happens after properties are set/written.
+
+    Parameters
+    ----------
+    instance : Parameterized
+        the ``Thing`` instance which uses this database engine
+    serializer : BaseSerializer
+        The serializer to use for serializing and deserializing data (for example,
+        property serialization before writing to database). Will be the same as the rpc_serializer supplied to ``Thing``.
+    config_file : str
+        absolute path to the database server configuration file
+    """
 
-    def __init__(self, database : str, serializer : BaseSerializer, config_file : typing.Union[str, None] = None) -> None:
-        if config_file:
-            URL = f"{create_DB_URL(config_file, True)}/{database}"
-            self.engine = asyncio_ext.create_async_engine(URL, echo = True)
-            self.async_session = sessionmaker(self.engine, expire_on_commit=True, class_= asyncio_ext.AsyncSession) # type: ignore
-        self.serializer = serializer
+    def __init__(self, instance : Parameterized,
+                serializer : typing.Optional[BaseSerializer] = None,
+                config_file : typing.Union[str, None] = None) -> None:
+        super().__init__(instance=instance, serializer=serializer, config_file=config_file)
+        self.engine = asyncio_ext.create_async_engine(self.URL, echo=True)
+        self.async_session = sessionmaker(self.engine, expire_on_commit=True,
+                        class_=asyncio_ext.AsyncSession)
+        ThingTableBase.metadata.create_all(self.engine)
 
-class BaseSyncDB:
-    def __init__(self, database : str, serializer : BaseSerializer, config_file : typing.Union[str, None] = None) -> None:
-        if config_file:
-            URL = f"{create_DB_URL(config_file, False)}/{database}"
-            self.engine = create_engine(URL, echo = True)
-            self.sync_session = sessionmaker(self.engine, expire_on_commit=True)
-        self.serializer = serializer
+
+class BaseSyncDB(BaseDB):
+    """
+    Base class for a synchronous (blocking) database engine; implements configuration file reader, sqlalchemy engine
+    & session creation. Default DB engine for ``Thing``, called immediately after properties are set/written.
+
+    Parameters
+    ----------
+    instance : Parameterized
+        the ``Thing`` instance which uses this database engine
+    serializer : BaseSerializer
+        The serializer to use for serializing and deserializing data (for example,
+        property serialization into database for storage). Will be the same as
+        the rpc_serializer supplied to ``Thing``.
+    config_file : str
+        absolute path to the database server configuration file
+    """
+
+    def __init__(self, instance : Parameterized,
+                serializer : typing.Optional[BaseSerializer] = None,
+                config_file : typing.Union[str, None] = None) -> None:
+        super().__init__(instance=instance, serializer=serializer, config_file=config_file)
+        self.engine = create_engine(self.URL, echo=True)
+        self.sync_session = sessionmaker(self.engine, expire_on_commit=True)
+        ThingTableBase.metadata.create_all(self.engine)
+
+
+
+class ThingDB(BaseSyncDB):
+    """
+    Database engine composed within ``Thing``; carries out database operations like storing object information,
+    properties etc.
+
+    Parameters
+    ----------
+    instance : Parameterized
+        the ``Thing`` instance which owns this database engine
+    serializer : BaseSerializer
+        serializer used by the ``Thing``, for serializing and deserializing data (for example,
+        property serialization into database for storage)
+    config_file : str
+        configuration file of the database server
+    """
+
+    def fetch_own_info(self):  # -> ThingInformation
+        """
+        fetch the ``Thing`` instance's own information (some useful metadata which helps the ``Thing`` run).
+
+        Returns
+        -------
+        ``ThingInformation``
+        """
+        if not inspect_database(self.engine).has_table("things"):
+            return
+        with self.sync_session() as session:
+            stmt = select(ThingInformation).filter_by(instance_name=self.instance_name)
+            data = session.execute(stmt)
+            data = data.scalars().all()
+            if len(data) == 0:
+                return None
+            elif len(data) == 1:
+                return data[0]
+            else:
+                raise DatabaseError("Multiple things with same instance name found, either cleanup database/detach/make new")
+
+
+    def get_property(self, property : typing.Union[str, Property], deserialized : bool = True) -> typing.Any:
+        """
+        fetch a single property.
+
+        Parameters
+        ----------
+        property : str | Property
+            string name or descriptor object
+        deserialized : bool, default True
+            deserialize the property if True
+
+        Returns
+        -------
+        value : Any
+            property value
+        """
+        with self.sync_session() as session:
+            name = property if isinstance(property, str) else property.name
+            stmt = select(SerializedProperty).filter_by(instance_name=self.instance_name, name=name)
+            data = session.execute(stmt)
+            prop = data.scalars().all()  # type: typing.Sequence[SerializedProperty]
+            if len(prop) == 0:
+                raise DatabaseError(f"property {name} not found in database")
+            elif len(prop) > 1:
+                raise DatabaseError("multiple properties with same name found")  # impossible actually
+            if not deserialized:
+                return prop[0]
+            return self.serializer.loads(prop[0].serialized_value)
+
+
+    def set_property(self, property : typing.Union[str, Property], value : typing.Any) -> None:
+        """
+        change the value of an already existing property.
+
+        Parameters
+        ----------
+        property : str | Property
+            string name or descriptor object
+        value : Any
+            value of the property
+        """
+        # compute the name first so that string names also work in batch call context
+        name = property if isinstance(property, str) else property.name
+        if self.in_batch_call_context:
+            self._batch_call_context[threading.get_ident()][name] = value
+            return
+        with self.sync_session() as session:
+            stmt = select(SerializedProperty).filter_by(instance_name=self.instance_name,
+                                        name=name)
+            data = session.execute(stmt)
+            prop = data.scalars().all()
+            if len(prop) > 1:
+                raise DatabaseError("multiple properties with same name found")  # impossible actually
+            if len(prop) == 1:
+                prop = prop[0]
+                prop.serialized_value = self.serializer.dumps(value)
+            else:
+                prop = SerializedProperty(
+                    instance_name=self.instance_name,
+                    name=name,
+                    serialized_value=self.serializer.dumps(getattr(self.thing_instance, name))
+                )
+                session.add(prop)
+            session.commit()
+
+
+    def get_properties(self, properties : typing.Dict[typing.Union[str, Property], typing.Any],
+                    deserialized : bool = True) -> typing.Dict[str, typing.Any]:
+        """
+        get multiple properties at once.
+
+        Parameters
+        ----------
+        properties : Dict[str | Property, Any]
+            string names or descriptors of the properties as dictionary keys (values are ignored)
+
+        Returns
+        -------
+        value : Dict[str, Any]
+            property names and values as items
+        """
+        with self.sync_session() as session:
+            names = []
+            for obj in properties.keys():
+                names.append(obj if isinstance(obj, str) else obj.name)
+            stmt = select(SerializedProperty).filter_by(instance_name=self.instance_name).filter(
+                                        SerializedProperty.name.in_(names))
+            data = session.execute(stmt)
+            unserialized_props = data.scalars().all()
+            props = dict()
+            for prop in unserialized_props:
+                props[prop.name] = prop.serialized_value if not deserialized else self.serializer.loads(prop.serialized_value)
+            return props
+
+
+    def set_properties(self, properties : typing.Dict[typing.Union[str, Property], typing.Any]) -> None:
+        """
+        change the values of a few already existing properties at once
+
+        Parameters
+        ----------
+        properties : Dict[str | Property, Any]
+            string names or descriptors of the properties and their values as dictionary pairs
+        """
+        if self.in_batch_call_context:
+            for obj, value in properties.items():
+                name = obj if isinstance(obj, str) else obj.name
+                self._batch_call_context[threading.get_ident()][name] = value
+            return
+        with self.sync_session() as session:
+            names = []
+            for obj in properties.keys():
+                names.append(obj if isinstance(obj, str) else obj.name)
+            stmt = select(SerializedProperty).filter_by(instance_name=self.instance_name).filter(
+                                        SerializedProperty.name.in_(names))
+            data = session.execute(stmt)
+            db_props = data.scalars().all()
+            for obj, value in properties.items():
+                name = obj if isinstance(obj, str) else obj.name
+                db_prop = list(filter(lambda db_prop: db_prop.name == name, db_props))  # type: typing.List[SerializedProperty]
+                if len(db_prop) > 1:
+                    raise DatabaseError("multiple properties with same name found")  # impossible actually
+                if len(db_prop) == 1:
+                    db_prop = db_prop[0]  # type: SerializedProperty
+                    db_prop.serialized_value = self.serializer.dumps(value)
+                else:
+                    prop = SerializedProperty(
+                        instance_name=self.instance_name,
+                        name=name,
+                        serialized_value=self.serializer.dumps(value)
+                    )
+                    session.add(prop)
+            session.commit()
+
+
+    def get_all_properties(self, deserialized : bool = True) -> typing.Dict[str, typing.Any]:
+        """
+        read all properties of the ``Thing`` instance.
+
+        Parameters
+        ----------
+        deserialized : bool, default True
+            deserialize the properties if True
+        """
+        with self.sync_session() as session:
+            stmt = select(SerializedProperty).filter_by(instance_name=self.instance_name)
+            data = session.execute(stmt)
+            existing_props = data.scalars().all()  # type: typing.Sequence[SerializedProperty]
+            if not deserialized:
+                return existing_props
+            props = dict()
+            for prop in existing_props:
+                props[prop.name] = self.serializer.loads(prop.serialized_value)
+            return props
+
+
+    def create_missing_properties(self, properties : typing.Dict[str, Property],
+                            get_missing_property_names : bool = False) -> None:
+        """
+        create any and all missing properties of the ``Thing`` instance
+        in the database.
+
+        Parameters
+        ----------
+        properties : Dict[str, Property]
+            descriptors of the properties
+
+        Returns
+        -------
+        List[str]
+            list of missing properties if get_missing_property_names is True
+        """
+        missing_props = []
+        with self.sync_session() as session:
+            existing_props = self.get_all_properties()
+            for name, new_prop in properties.items():
+                if name not in existing_props:
+                    prop = SerializedProperty(
+                        instance_name=self.instance_name,
+                        name=new_prop.name,
+                        serialized_value=self.serializer.dumps(getattr(self.thing_instance,
+                                                                new_prop.name))
+                    )
+                    session.add(prop)
+                    missing_props.append(name)
+            session.commit()
+        if get_missing_property_names:
+            return missing_props
+
+
+
+class batch_db_commit:
+    """
+    Context manager to write multiple properties to the database at once. Useful for sequential sets/writes of
+    multiple properties which have db_commit or db_persist set to True, so that their values are written to
+    the database in a single transaction.
+    """
+    def __init__(self, db_engine : ThingDB) -> None:
+        self.db_engine = db_engine
+
+    def __enter__(self) -> None:
+        self.db_engine._batch_call_context[threading.get_ident()] = dict()
+
+    def __exit__(self, exc_type, exc_value, exc_tb) -> None:
+        data = self.db_engine._batch_call_context.pop(threading.get_ident(), dict())  # type: typing.Dict[str, typing.Any]
+        if exc_type is None:
+            self.db_engine.set_properties(data)
+            return
+        for name, value in data.items():
+            try:
+                self.db_engine.set_property(name, value)
+            except Exception:
+                pass
+
-__all__ = ['BaseAsyncDB']
\ No newline at end of file
+__all__ = [
+    BaseAsyncDB.__name__,
+    BaseSyncDB.__name__,
+    ThingDB.__name__,
+    batch_db_commit.__name__
+]
\ No newline at end of file
diff --git a/hololinked/server/decorators.py b/hololinked/server/decorators.py
deleted file mode 100644
index b2ec855..0000000
--- a/hololinked/server/decorators.py
+++ /dev/null
@@ -1,223 +0,0 @@
-from types import FunctionType
-from inspect import iscoroutinefunction, getfullargspec
-from typing import Any, Optional, Union, Callable
-import typing
-from enum import Enum
-from functools import wraps
-from dataclasses import dataclass, asdict, field, fields
-
-
-from .data_classes import ScadaInfoValidator, ScadaInfoData
-from .constants import (USE_OBJECT_NAME, UNSPECIFIED, GET, POST, PUT, DELETE, PATCH, WRAPPER_ASSIGNMENTS)
-from .utils import wrap_text
-from .path_converter import compile_path
-
-
-
-def wrap_method(method : FunctionType):
-    """wraps a methods with useful operations before and after calling a method.
-    Old : use case not decided.
-
-    Args:
-        method (FunctionType): function or callable
-
-    Returns:
-        method : returns a wrapped callable, preserving information regarding signature
-            as much as possible
-    """
-
-    @wraps(method, WRAPPER_ASSIGNMENTS)
-    def wrapped_method(*args, **kwargs) -> Any:
-        self = args[0]
-        self.logger.debug("called {} of instance {}".format(method.__qualname__, self.instance_name))
-        return method(*args, **kwargs)
-    return wrapped_method
-
-
-def is_private_attribute(attr_name: str) -> bool:
-    """returns if the attribute name is to be considered private or not
-    Args:
-        attr_name (str): name of the attribute
-
-    Returns:
-        bool: return True when attribute does not start with '_' or (dunder '__'
-        are therefore included)
-    """
-    if attr_name.startswith('_'):
-        return True
-    return False
-
-
-def remote_method(URL_path : str = USE_OBJECT_NAME, http_method : str = POST,
-            state : Optional[Union[str, Enum]] = None) -> Callable:
-    """Use this function to decorate your methods to be accessible remotely.
-
-    Args:
-        URL_path (str, optional): The path of URL under which the object is accessible. defaults to name of the object.
-        http_method (str, optional) : HTTP method (GET, POST, PUT etc.). defaults to POST.
-        state (Union[str, Tuple[str]], optional): state under which the object can executed or written. When not provided,
-            its accessible or can be executed under any state.
-
-    Returns:
-        Callable: returns the callable object as it is or wrapped within loggers
-    """
-
-    def inner(obj):
-        original = obj
-        if isinstance(obj, classmethod):
-            obj = obj.__func__
-        if callable(obj):
-            if hasattr(obj, 'scada_info') and not isinstance(obj.scada_info, ScadaInfoValidator):
-                raise NameError(
-                    wrap_text(
-                        """
-                        variable name 'scada_info' reserved for scadapy library.
-                        Please do not assign this variable to any other object except scadapy.server.scada_info.ScadaInfoValidator.
-                        """
-                    )
-                )
-            else:
-                obj.scada_info = ScadaInfoValidator()
-            obj_name = obj.__qualname__.split('.')
-            if len(obj_name) > 1:  # i.e. its a bound method, used by RemoteObject
-                if URL_path == USE_OBJECT_NAME:
-                    obj.scada_info.URL_path = f'/{obj_name[1]}'
-                else:
-                    assert URL_path.startswith('/'), f"URL_path should start with '/', please add '/' before '{URL_path}'"
-                    obj.scada_info.URL_path = URL_path
-                obj.scada_info.obj_name = obj_name[1]
-            elif len(obj_name) == 1 and isinstance(obj, FunctionType):  # normal unbound function - used by HTTPServer instance
-                if URL_path is USE_OBJECT_NAME:
-                    obj.scada_info.URL_path = '/{}'.format(obj_name[0])
-                else:
-                    assert URL_path.startswith('/'), f"URL_path should start with '/', please add '/' before '{URL_path}'"
-                    obj.scada_info.URL_path = URL_path
-                obj.scada_info.obj_name = obj_name[0]
-            else:
-                raise RuntimeError(f"Undealt option for decorating {obj} or decorators wrongly used")
-            if http_method is not UNSPECIFIED:
-                if isinstance(http_method, str):
-                    obj.scada_info.http_method = (http_method,)
-                else:
-                    obj.scada_info.http_method = http_method
-            if state is not None:
-                if isinstance(state, (Enum, str)):
-                    obj.scada_info.state = (state,)
-                else:
-                    obj.scada_info.state = state
-            if 'request' in getfullargspec(obj).kwonlyargs:
-                obj.scada_info.http_request_as_argument = True
-            obj.scada_info.iscallable = True
-            obj.scada_info.iscoroutine = iscoroutinefunction(obj)
-            return original
-        else:
-            raise TypeError(
-                wrap_text(
-                    f"""
-                    target for get()/post()/remote_method() or http method decorator is not a function/method.
-                    Given type {type(obj)}
-                    """
-                )
-            )
-    return inner
-
-
-def remote_parameter(**kwargs):
-    from .remote_parameter import RemoteParameter
-    return RemoteParameter(*kwargs)
-
-
-def get(URL_path = USE_OBJECT_NAME):
-    """
-    use it on RemoteObject subclass methods to be available with GET HTTP request.
-    method is also by default accessible to proxy clients.
-    """
-    return remote_method(URL_path=URL_path, http_method=GET)
-
-def post(URL_path = USE_OBJECT_NAME):
-    """
-    use it on RemoteObject subclass methods to be available with POST HTTP request.
-    method is also by default accessible to proxy clients.
-    """
-    return remote_method(URL_path=URL_path, http_method=POST)
-
-def put(URL_path = USE_OBJECT_NAME):
-    """
-    use it on RemoteObject subclass methods to be available with PUT HTTP request.
-    method is also by default accessible to proxy clients.
-    """
-    return remote_method(URL_path=URL_path, http_method=PUT)
-
-def delete(URL_path = USE_OBJECT_NAME):
-    """
-    use it on RemoteObject subclass methods to be available with DELETE HTTP request.
-    method is also by default accessible to proxy clients.
-    """
-    return remote_method(URL_path=URL_path, http_method=DELETE)
-
-def patch(URL_path = USE_OBJECT_NAME):
-    """
-    use it on RemoteObject subclass methods to be available with PATCH HTTP request.
-    method is also by default accessible to proxy clients.
-    """
-    return remote_method(URL_path=URL_path, http_method=PATCH)
-
-
-@dataclass
-class FuncInfo:
-    module : str
-    name : str
-    qualname : str
-    doc : str
-    kwdefaults : Any
-    defaults : Any
-    scadapy : ScadaInfoData
-
-    def json(self):
-        return asdict(self)
-
-
-# @dataclass
-# class DB_registration_info:
-#     script : str
-#     instance_name : str
-#     http_server : str = field(default = '')
-#     args : Tuple[Any] = field(default = tuple())
-#     kwargs : Dict[str, Any] = field(default = dict())
-#     eventloop : str = field(default = '')
-#     level : int = field(default = 1)
-#     level_type : str = field(default = '')
-
-
-# def parse_request_args(*args, method : str):
-#     """
-#     This method is useful when linters figure out conditional returns on decorators
-#     """
-#     arg_len = len(args)
-#     if arg_len > 2 or arg_len == 0:
-#         raise ValueError(
-#             """
-#             method {}() accepts only two argument, URL and/or a function/method.
-#             Given length of arguments : {}.
-# """.format(method.lower(), arg_len) -# ) -# if isinstance(args[0], FunctionType): -# target = args[0] -# elif len(args) > 1 and isinstance(args[1], FunctionType): -# target = args[1] -# else: -# target = None -# if isinstance(args[0], str): -# URL = args[0] -# elif len(args) > 1 and isinstance(args[1], str): -# URL = args[1] -# else: -# URL = USE_OBJECT_NAME -# return target, URL - - - - -__all__ = ['get', 'put', 'post', 'delete', 'patch', 'remote_method', 'remote_parameter'] - - diff --git a/hololinked/server/eventloop.py b/hololinked/server/eventloop.py index 245bf7e..2d6a105 100644 --- a/hololinked/server/eventloop.py +++ b/hololinked/server/eventloop.py @@ -1,263 +1,351 @@ +import sys +import os +import warnings import subprocess import asyncio -import traceback import importlib import typing +import threading +import logging +from uuid import uuid4 - -from .utils import unique_id, wrap_text -from .constants import * -from .remote_parameters import TypedDict +from .constants import HTTP_METHODS +from .utils import format_exception_as_json +from .config import global_config +from .zmq_message_brokers import ServerTypes from .exceptions import * -from .decorators import post, get -from .remote_object import * -from .zmq_message_brokers import AsyncPollingZMQServer, ZMQServerPool, ServerTypes -from .remote_parameter import RemoteParameter -from ..param.parameters import Boolean, ClassSelector, TypedList, List as PlainList +from .thing import Thing, ThingMeta +from .property import Property +from .properties import ClassSelector, TypedList, List, Boolean, TypedDict +from .action import action as remote_method +from .logger import ListHandler + + +def set_event_loop_policy(): + if sys.platform.lower().startswith('win'): + asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy()) + + if global_config.USE_UVLOOP: + if sys.platform.lower() in ['linux', 'darwin', 'linux2']: + import uvloop + asyncio.set_event_loop_policy(uvloop.EventLoopPolicy()) + 
 else:
+            warnings.warn("uvloop not supported for windows, using default windows selector loop.", RuntimeWarning)
+set_event_loop_policy()


 class Consumer:
-    consumer = ClassSelector(default=None, allow_None=True, class_=RemoteObject, isinstance=False)
-    args = PlainList(default=None, allow_None=True, accept_tuple=True)
-    kwargs = TypedDict(default=None, allow_None=True, key_type=str)
+    """
+    Container class to pass a ``Thing`` subclass (along with its constructor arguments) to the
+    eventloop, for the rare multiprocessing use cases.
+    """
+    object_cls = ClassSelector(default=None, allow_None=True, class_=Thing, isinstance=False,
+                        remote=False)
+    args = List(default=None, allow_None=True, accept_tuple=True, remote=False)
+    kwargs = TypedDict(default=None, allow_None=True, key_type=str, remote=False)

-    def __init__(self, consumer : typing.Type[RemoteObject], args : typing.Tuple = tuple(), **kwargs) -> None:
-        if consumer is not None:
-            self.consumer = consumer
-        else:
-            raise ValueError("consumer cannot be None, please assign a subclass of RemoteObject")
+    def __init__(self, object_cls : typing.Type[Thing], args : typing.Tuple = tuple(), **kwargs) -> None:
+        self.object_cls = object_cls
         self.args = args
         self.kwargs = kwargs


+RemoteObject = Thing # alias for reading convenience

 class EventLoop(RemoteObject):
     """
-    The EventLoop class implements a infinite loop where zmq ROUTER sockets listen for messages. Each consumer of the 
-    event loop (an instance of RemoteObject) listen on their own ROUTER socket and execute methods or allow read and write
+    The EventLoop class implements an infinite loop where zmq ROUTER sockets listen for messages. Each consumer of the 
+    event loop (an instance of Thing) listens on its own ROUTER socket and executes methods or allows read and write
     of attributes upon receiving instructions. Socket listening is implemented in an async (asyncio) fashion.
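The serving model described in the docstring above — one asyncio loop driving many independent listeners concurrently — can be sketched with plain queues standing in for the ZMQ ROUTER sockets. This is a toy model of the pattern, not the hololinked API:

```python
import asyncio

async def listen(name, inbox):
    # each "thing" gets its own listen coroutine, analogous to each
    # Thing's ROUTER socket in the real event loop
    while True:
        msg = await inbox.get()
        if msg is None:               # sentinel -> leave the loop
            return f"{name} done"
        # a real event loop would execute the received instruction here

async def main():
    inboxes = {name: asyncio.Queue() for name in ("thing1", "thing2")}
    tasks = [asyncio.create_task(listen(n, q)) for n, q in inboxes.items()]
    for q in inboxes.values():
        await q.put(None)             # shut both listeners down
    return await asyncio.gather(*tasks)

results = asyncio.run(main())
```

One loop, many listeners: adding another "thing" is just another task in the `gather`, which is why a single-threaded event loop can serve several objects.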
""" server_type = ServerTypes.EVENTLOOP - remote_objects = TypedList(item_type=(RemoteObject, Consumer), bounds=(0,100), allow_None=True, default=None, - doc="""list of RemoteObjects which are being executed""") - threaded = Boolean(default=False, doc="""by default False, set to True to use thread pool executor instead of - of simple asyncio. Default executor is a single-threaded asyncio loop. Thread pool executor creates - each thread with its own asyncio loop.""" ) - # Remote Parameters - uninstantiated_remote_objects = TypedDict(default=None, allow_None=True, key_type=str, - item_type=(Consumer, str)) #, URL_path = '/uninstantiated-remote-objects') - - def __new__(cls, **kwargs): - obj = super().__new__(cls, **kwargs) - obj._internal_fixed_attributes.append('_message_broker_pool') - return obj - - def __init__(self, *, instance_name : str, - remote_objects : typing.Union[RemoteObject, Consumer, typing.List[typing.Union[RemoteObject, Consumer]]] = list(), # type: ignore - requires covariant types - log_level : int = logging.INFO, **kwargs) -> None: - super().__init__(instance_name=instance_name, remote_objects=remote_objects, log_level=log_level, **kwargs) - self._message_broker_pool : ZMQServerPool = ZMQServerPool(instance_names=None, - # create empty pool as message brokers are already created - proxy_serializer=self.proxy_serializer, json_serializer=self.json_serializer) - remote_objects : typing.List[RemoteObject] = [self] - if self.remote_objects is not None: - for consumer in self.remote_objects: - if isinstance(consumer, RemoteObject): - remote_objects.append(consumer) - consumer.object_info.eventloop_name = self.instance_name - self._message_broker_pool.register_server(consumer.message_broker) + expose = Boolean(default=True, remote=False, + doc="""set to False to use the object locally to avoid alloting network resources + of your computer for this object""") + + things = TypedList(item_type=(Thing, Consumer), bounds=(0,100), allow_None=True, 
default=None, + doc="list of Things which are being executed", remote=False) #type: typing.List[Thing] + + threaded = Boolean(default=False, remote=False, + doc="set True to run each thing in its own thread") + + + def __init__(self, *, + instance_name : str, + things : typing.Union[Thing, Consumer, typing.List[typing.Union[Thing, Consumer]]] = list(), # type: ignore - requires covariant types + log_level : int = logging.INFO, + **kwargs + ) -> None: + """ + Parameters + ---------- + instance_name: str + instance name of the event loop + things: List[Thing] + things to be run/served + log_level: int + log level of the event loop logger + """ + super().__init__(instance_name=instance_name, things=things, log_level=log_level, **kwargs) + things = [] # type: typing.List[Thing] + if self.expose: + things.append(self) + if self.things is not None: + for consumer in self.things: + if isinstance(consumer, Thing): + things.append(consumer) + consumer.object_info.eventloop_instance_name = self.instance_name elif isinstance(consumer, Consumer): - instance = consumer.consumer(*consumer.args, **consumer.kwargs, - eventloop_name = self.instance_name) - self._message_broker_pool.register_server(instance.message_broker) - remote_objects.append(instance) - self.remote_objects = remote_objects # re-assign the instantiated objects as well - self.uninstantiated_remote_objects = {} - + instance = consumer.object_cls(*consumer.args, **consumer.kwargs, + eventloop_name=self.instance_name) + things.append(instance) + self.things = things # re-assign the instantiated objects as well + self.uninstantiated_things = dict() + self._message_listener_methods = [] + def __post_init__(self): super().__post_init__() self.logger.info("Event loop with name '{}' can be started using EventLoop.run().".format(self.instance_name)) return - @property - def message_broker_pool(self): - return self._message_broker_pool - - @get('/remote-objects') - def servers(self): - return { - 
instance.__class__.__name__ : instance.instance_name for instance in self.remote_objects - } - # example of overloading - @post('/exit') + @remote_method() def exit(self): + """ + Stops the event loop and all its things. Generally, this leads + to exiting the program unless some code follows the ``run()`` method. + """ raise BreakAllLoops + + uninstantiated_things = TypedDict(default=None, allow_None=True, key_type=str, + item_type=(Consumer, str), URL_path='/things/uninstantiated') + + @classmethod - def import_remote_object(cls, file_name : str, object_name : str): - module_name = file_name.split('\\')[-1] - spec = importlib.util.spec_from_file_location(module_name, file_name) # type: ignore + def _import_thing(cls, file_name : str, object_name : str): + """ + import a thing specified by ``object_name`` from its + script or module. + + Parameters + ---------- + file_name : str + file or module path + object_name : str + name of ``Thing`` class to be imported + """ + module_name = file_name.split(os.sep)[-1] + spec = importlib.util.spec_from_file_location(module_name, file_name) if spec is not None: - module = importlib.util.module_from_spec(spec) # type: ignore + module = importlib.util.module_from_spec(spec) spec.loader.exec_module(module) else: - module = importlib.import_module(module_name, file_name.split('\\')[0]) - + module = importlib.import_module(module_name, file_name.split(os.sep)[0]) consumer = getattr(module, object_name) - if issubclass(consumer, RemoteObject): + if issubclass(consumer, Thing): return consumer else: - raise ValueError(wrap_text(f"""object name {object_name} in {file_name} not a subclass of RemoteObject. - Only subclasses are accepted (not even instances). 
Given object : {consumer}""")) - - @post('/remote-object/import') - def _import_remote_object(self, file_name : str, object_name : str): - consumer = self.import_remote_object(file_name, object_name) - id = unique_id() - self.uninstantiated_remote_objects[id] = consumer - return dict( - id = id, - db_params = consumer.parameters.remote_objects_webgui_info(consumer.parameters.load_at_init_objects()) - ) - - @post('/remote-object/instantiate') - def instantiate(self, file_name : str, object_name : str, kwargs : typing.Dict = {}): - consumer = self.import_remote_object(file_name, object_name) - instance = consumer(**kwargs, eventloop_name = self.instance_name) - self.register_new_consumer(instance) + raise ValueError(f"object name {object_name} in {file_name} not a subclass of Thing.", + f" Only subclasses are accepted (not even instances). Given object : {consumer}") - def register_new_consumer(self, instance : RemoteObject): - zmq_server = AsyncPollingZMQServer(instance_name=instance.instance_name, server_type=ServerTypes.USER_REMOTE_OBJECT, - context=self.message_broker_pool.context, json_serializer=self.json_serializer, - proxy_serializer=self.proxy_serializer) - self.message_broker_pool.register_server(zmq_server) - instance.message_broker = zmq_server - self.remote_objects.append(instance) - async_loop = asyncio.get_event_loop() - async_loop.call_soon(lambda : asyncio.create_task(self.run_single_target(instance))) + + @remote_method(URL_path='/things', http_method=HTTP_METHODS.POST) + def import_thing(self, file_name : str, object_name : str): + """ + import thing from the specified path and return the default + properties to be supplied to instantiate the object. 
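`_import_thing` relies on the standard `importlib` file-loading dance. Stripped of the `Thing` subclass check, the technique looks like this — the temp file and the `Oscilloscope` class are made up for illustration:

```python
import importlib.util
import os
import tempfile

# write a throwaway module to disk so there is something to import
with tempfile.NamedTemporaryFile('w', suffix='.py', delete=False) as f:
    f.write("class Oscilloscope:\n    pass\n")
    file_name = f.name

# load the module from its file path, then fetch the named class from it
module_name = os.path.basename(file_name)
spec = importlib.util.spec_from_file_location(module_name, file_name)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
cls = getattr(module, 'Oscilloscope')
os.unlink(file_name)
```

`spec_from_file_location` + `module_from_spec` + `exec_module` is the documented way to import from an arbitrary path without touching `sys.path`.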
+        """
+        consumer = self._import_thing(file_name, object_name) # type: ThingMeta
+        id = str(uuid4()) # stringify so the key matches the str id accepted by instantiate()
+        self.uninstantiated_things[id] = consumer
+        return id


+    @remote_method(URL_path='/things/instantiate', 
+                    http_method=HTTP_METHODS.POST) # remember to pass schema with mandatory instance name
+    def instantiate(self, id : str, kwargs : typing.Dict = {}):
+        """
+        Instantiate a previously imported thing with the given arguments 
+        and add it to the event loop
+        """
+        consumer = self.uninstantiated_things[id]
+        instance = consumer(**kwargs, eventloop_name=self.instance_name) # type: Thing
+        self.things.append(instance)
+        rpc_server = instance.rpc_server
+        # wrap task creation in a lambda so the task is created only when the loop invokes the callback
+        self.request_listener_loop.call_soon(lambda : asyncio.create_task(rpc_server.poll()))
+        self.request_listener_loop.call_soon(lambda : asyncio.create_task(rpc_server.tunnel_message_to_things()))
+        if not self.threaded:
+            self.thing_executor_loop.call_soon(lambda : asyncio.create_task(self.run_single_target(instance)))
+        else: 
+            _thing_executor = threading.Thread(target=self.run_things_executor, args=([instance],))
+            _thing_executor.start()

     def run(self):
-        async_loop = asyncio.get_event_loop()
-        # while True:
-        async_loop.run_until_complete(
+        """
+        start the eventloop
+        """
+        if not self.threaded:
+            _thing_executor = threading.Thread(target=self.run_things_executor, args=(self.things,))
+            _thing_executor.start()
+        else: 
+            for thing in self.things:
+                _thing_executor = threading.Thread(target=self.run_things_executor, args=([thing],))
+                _thing_executor.start()
+        self.run_external_message_listener()
+        if not self.threaded:
+            _thing_executor.join()


+    @classmethod
+    def get_async_loop(cls):
+        """
+        get or automatically create an async loop for the current thread.
+        """
+        try: 
+            loop = asyncio.get_event_loop()
+        except RuntimeError: 
+            loop = asyncio.new_event_loop()
+            # set_event_loop_policy() - why not?
+            asyncio.set_event_loop(loop)
+        return loop


+    def run_external_message_listener(self):
+        """
+        Runs the ZMQ sockets which are visible to clients.
+        This method is called automatically by the ``run()`` method.
+        Please don't call this method when the async loop is already running.
+        """
+        self.request_listener_loop = self.get_async_loop()
+        rpc_servers = [thing.rpc_server for thing in self.things]
+        for rpc_server in rpc_servers:
+            # bind rpc_server as a default argument so each lambda captures the current
+            # value instead of the loop variable's final value
+            self.request_listener_loop.call_soon(lambda rpc_server=rpc_server : asyncio.create_task(rpc_server.poll()))
+            self.request_listener_loop.call_soon(lambda rpc_server=rpc_server : asyncio.create_task(rpc_server.tunnel_message_to_things()))
+        self.logger.info("starting external message listener thread")
+        self.request_listener_loop.run_forever()
+        self.logger.info("exiting external listener event loop {}".format(self.instance_name))
+        self.request_listener_loop.close()


+    def run_things_executor(self, things):
+        """
+        Run the ZMQ sockets which provide queued instructions to each ``Thing``.
+        This method is called automatically by the ``run()`` method.
+        Please don't call this method when the async loop is already running.
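The loop-per-thread idiom used by `get_async_loop()` and the executor threads — `asyncio.get_event_loop()` raises in a fresh worker thread, so a new loop is created and registered for that thread — can be exercised standalone (a sketch of the pattern, not the library code):

```python
import asyncio
import threading

def get_async_loop():
    # a worker thread has no event loop yet: get_event_loop() raises
    # RuntimeError there, so we create and register a fresh loop
    try:
        loop = asyncio.get_event_loop()
    except RuntimeError:
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
    return loop

results = []

def worker():
    loop = get_async_loop()
    # run a trivial coroutine to completion on this thread's private loop
    results.append(loop.run_until_complete(asyncio.sleep(0, result='ran')))
    loop.close()

t = threading.Thread(target=worker)
t.start()
t.join()
```

Each executor thread owning its own loop is what lets `threaded=True` run every `Thing` independently without the threads contending for one loop.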
+ """ + thing_executor_loop = self.get_async_loop() + self.logger.info(f"starting thing executor loop in thread {threading.get_ident()} for {[obj.instance_name for obj in things]}") + thing_executor_loop.run_until_complete( asyncio.gather( *[self.run_single_target(instance) - for instance in self.remote_objects] + for instance in things] )) - self.logger.info("exiting event loop {}".format(self.instance_name)) - async_loop.close() + self.logger.info(f"exiting event loop in thread {threading.get_ident()}") + thing_executor_loop.close() + @classmethod - async def run_single_target(cls, instance : RemoteObject) -> None: + async def run_single_target(cls, instance : Thing) -> None: instance_name = instance.instance_name while True: instructions = await instance.message_broker.async_recv_instructions() for instruction in instructions: - client, _, client_type, _, msg_id, instruction_str, arguments, context = instruction - plain_reply = context.pop("plain_reply", False) + client, _, client_type, _, msg_id, _, instruction_str, arguments, context = instruction + oneway = context.pop('oneway', False) fetch_execution_logs = context.pop("fetch_execution_logs", False) - if not plain_reply and fetch_execution_logs: + if fetch_execution_logs: list_handler = ListHandler([]) list_handler.setLevel(logging.DEBUG) list_handler.setFormatter(instance.logger.handlers[0].formatter) instance.logger.addHandler(list_handler) try: - instance.logger.debug("""client {} of client type {} issued instruction {} with message id {}. - starting execution""".format(client, client_type, instruction_str, msg_id)) + instance.logger.debug(f"client {client} of client type {client_type} issued instruction " + + f"{instruction_str} with message id {msg_id}. 
starting execution.") return_value = await cls.execute_once(instance_name, instance, instruction_str, arguments) #type: ignore - if not plain_reply: + if oneway: + await instance.message_broker.async_send_reply_with_message_type(instruction, b'ONEWAY', None) + continue + if fetch_execution_logs: return_value = { - "responseStatusCode" : 200, "returnValue" : return_value, - "state" : { - instance_name : instance.state() - } + "execution_logs" : list_handler.log_list } - if fetch_execution_logs: - return_value["logs"] = list_handler.log_list await instance.message_broker.async_send_reply(instruction, return_value) # Also catches exception in sending messages like serialization error except (BreakInnerLoop, BreakAllLoops): - instance.logger.info("Remote object {} with instance name {} exiting event loop.".format( + instance.logger.info("Thing {} with instance name {} exiting event loop.".format( instance.__class__.__name__, instance_name)) + if oneway: + await instance.message_broker.async_send_reply_with_message_type(instruction, b'ONEWAY', None) + continue return_value = None - if not plain_reply: + if fetch_execution_logs: return_value = { - "responseStatusCode" : 200, "returnValue" : None, - "state" : { - instance_name : instance.state() - } + "execution_logs" : list_handler.log_list } - if fetch_execution_logs: - return_value["logs"] = list_handler.log_list - await instance.message_broker.async_send_reply(instruction, return_value) - return - except Exception as E: - instance.logger.error("RemoteObject {} with instance name {} produced error : {}.".format( - instance.__class__.__name__, instance_name, E)) - return_value = { - "message" : str(E), - "type" : repr(E).split('(', 1)[0], - "traceback" : traceback.format_exc().splitlines(), - "notes" : E.__notes__ if hasattr(E, "__notes__") else None - } - if not plain_reply: - return_value = { - "responseStatusCode" : 500, - "exception" : return_value, - "state" : { - instance_name : instance.state() - } - } - if 
fetch_execution_logs: - return_value["logs"] = list_handler.log_list - await instance.message_broker.async_send_reply(instruction, return_value) - if fetch_execution_logs: - instance.logger.removeHandler(list_handler) + return + except Exception as ex: + instance.logger.error("Thing {} with instance name {} produced error : {}.".format( + instance.__class__.__name__, instance_name, ex)) + if oneway: + await instance.message_broker.async_send_reply_with_message_type(instruction, b'ONEWAY', None) + continue + return_value = dict(exception= format_exception_as_json(ex)) + if fetch_execution_logs: + return_value["execution_logs"] = list_handler.log_list + await instance.message_broker.async_send_reply_with_message_type(instruction, + b'EXCEPTION', return_value) + finally: + if fetch_execution_logs: + instance.logger.removeHandler(list_handler) @classmethod - async def execute_once(cls, instance_name : str, instance : RemoteObject, instruction_str : str, + async def execute_once(cls, instance_name : str, instance : Thing, instruction_str : str, arguments : typing.Dict[str, typing.Any]) -> typing.Dict[str, typing.Any]: - scadapy = instance.instance_resources[instruction_str] - if scadapy.iscallable: - if scadapy.state is None or (hasattr(instance, 'state_machine') and - instance.state_machine.current_state in scadapy.state): + resource = instance.instance_resources.get(instruction_str, None) + if resource is None: + raise AttributeError(f"unknown remote resource represented by instruction {instruction_str}") + if resource.isaction: + if resource.state is None or (hasattr(instance, 'state_machine') and + instance.state_machine.current_state in resource.state): # Note that because we actually find the resource within __prepare_instance__, its already bound # and we dont have to separately bind it. 
- func = scadapy.obj - if not scadapy.http_request_as_argument: - arguments.pop('request', None) - if scadapy.iscoroutine: - return await func(**arguments) + func = resource.obj + args = arguments.pop('__args__', tuple()) + if resource.iscoroutine: + return await func(*args, **arguments) else: - return func(**arguments) + return func(*args, **arguments) else: - raise StateMachineError("RemoteObject '{}' is in '{}' state, however command can be executed only in '{}' state".format( - instance_name, instance.state(), scadapy.state)) + raise StateMachineError("Thing '{}' is in '{}' state, however command can be executed only in '{}' state".format( + instance_name, instance.state, resource.state)) - elif scadapy.isparameter: + elif resource.isproperty: action = instruction_str.split('/')[-1] - parameter : RemoteParameter = scadapy.obj - owner_inst : RemoteSubobject = scadapy.bound_obj - if action == WRITE: - if scadapy.state is None or (hasattr(instance, 'state_machine') and - instance.state_machine.current_state in scadapy.state): - parameter.__set__(owner_inst, arguments["value"]) + prop = resource.obj # type: Property + owner_inst = resource.bound_obj # type: Thing + if action == "write": + if resource.state is None or (hasattr(instance, 'state_machine') and + instance.state_machine.current_state in resource.state): + return prop.__set__(owner_inst, arguments["value"]) else: - raise StateMachineError("RemoteObject {} is in `{}` state, however attribute can be written only in `{}` state".format( - instance_name, instance.state_machine.current_state, scadapy.state)) - return parameter.__get__(owner_inst, type(owner_inst)) - raise NotImplementedError("Unimplemented execution path for RemoteObject {} for instruction {}".format(instance_name, instruction_str)) + raise StateMachineError("Thing {} is in `{}` state, however attribute can be written only in `{}` state".format( + instance_name, instance.state_machine.current_state, resource.state)) + elif action == "read": + 
return prop.__get__(owner_inst, type(owner_inst)) + elif action == "delete": + return prop.deleter() # this may not be correct yet + raise NotImplementedError("Unimplemented execution path for Thing {} for instruction {}".format(instance_name, instruction_str)) def fork_empty_eventloop(instance_name : str, logfile : typing.Union[str, None] = None, python_command : str = 'python', condaenv : typing.Union[str, None] = None, prefix_command : typing.Union[str, None] = None): - command_str = '{}{}{}-c "from scadapy.server import EventLoop; E = EventLoop({}); E.run();"'.format( + command_str = '{}{}{}-c "from hololinked.server import EventLoop; E = EventLoop({}); E.run();"'.format( f'{prefix_command} ' if prefix_command is not None else '', f'call conda activate {condaenv} && ' if condaenv is not None else '', f'{python_command} ', @@ -272,11 +360,11 @@ def fork_empty_eventloop(instance_name : str, logfile : typing.Union[str, None] # class ForkedEventLoop: -# def __init__(self, instance_name : str, remote_objects : Union[RemoteObject, Consumer, List[Union[RemoteObject, Consumer]]], +# def __init__(self, instance_name : str, things : Union[Thing, Consumer, List[Union[Thing, Consumer]]], # log_level : int = logging.INFO, **kwargs): # self.subprocess = Process(target = forked_eventloop, kwargs = dict( # instance_name = instance_name, -# remote_objects = remote_objects, +# things = things, # log_level = log_level, # **kwargs # )) diff --git a/hololinked/server/events.py b/hololinked/server/events.py new file mode 100644 index 0000000..5710ed4 --- /dev/null +++ b/hololinked/server/events.py @@ -0,0 +1,105 @@ +import typing +import threading + +from ..param import Parameterized +from .zmq_message_brokers import EventPublisher +from .data_classes import ServerSentEvent + + + +class Event: + """ + Asynchronously push arbitrary messages to clients. 
Apart from default events created by the package (like state
+    change event, observable properties etc.), events are supposed to be created at class level or at ``__init__`` 
+    as an instance attribute, otherwise their publishing socket is unbound and will lead to ``AttributeError``.
+
+    Parameters
+    ----------
+    name: str
+        name of the event, the specified name may contain dashes and can be used on the client side to subscribe to this event.
+    URL_path: str
+        URL path of the event if an HTTP server is used. Only the GET HTTP method is supported.
+    """
+
+    def __init__(self, name : str, URL_path : typing.Optional[str] = None) -> None:
+        self.name = name
+        # self.name_bytes = bytes(name, encoding = 'utf-8')
+        if URL_path is not None and not URL_path.startswith('/'):
+            raise ValueError(f"URL_path should start with '/', please add '/' before '{URL_path}'")
+        self.URL_path = URL_path or '/' + name
+        self._unique_identifier = None # type: typing.Optional[str]
+        self._owner = None  # type: typing.Optional[Parameterized]
+        self._remote_info = None # type: typing.Optional[ServerSentEvent]
+        self._publisher = None
+        # the above attributes are not really optional - they are set later.
+
+    @property
+    def owner(self):
+        """
+        Event owning ``Thing`` object.
+        """
+        return self._owner
+
+    @property
+    def publisher(self) -> "EventPublisher":
+        """
+        Event publishing PUB socket owning object.
+        """
+        return self._publisher
+
+    @publisher.setter
+    def publisher(self, value : "EventPublisher") -> None:
+        if not self._publisher:
+            self._publisher = value
+            self._publisher.register(self)
+        else:
+            raise AttributeError("cannot reassign publisher attribute of event {}".format(self.name))
+
+    def push(self, data : typing.Any = None, *, serialize : bool = True, **kwargs) -> None:
+        """
+        publish the event.
+
+        Parameters
+        ----------
+        data: Any
+            payload of the event
+        serialize: bool, default True
+            serialize the payload before pushing, set to False when supplying raw bytes
+        **kwargs:
+            rpc_clients: bool, default True
+                pushes event to RPC clients, irrelevant if ``Thing`` uses only one type of serializer (refer to 
+                difference between rpc_serializer and json_serializer).
+            http_clients: bool, default True
+                pushes event to HTTP clients, irrelevant if ``Thing`` uses only one type of serializer (refer to 
+                difference between rpc_serializer and json_serializer).
+        """
+        self.publisher.publish(self._unique_identifier, data, rpc_clients=kwargs.get('rpc_clients', True),
+                                http_clients=kwargs.get('http_clients', True), serialize=serialize)


+class CriticalEvent(Event):
+    """
+    Push events to clients and wait for an acknowledgement
+    """

+    def __init__(self, name : str, URL_path : typing.Optional[str] = None) -> None:
+        super().__init__(name, URL_path)
+        self._synchronize_event = threading.Event()

+    def receive_acknowledgement(self, timeout : typing.Union[float, int, None]) -> bool:
+        """
+        Wait for acknowledgement that the event was received. When the timeout argument is present and not None, 
+        it should be a floating point number specifying a timeout for the operation in seconds (or fractions thereof).
+        """
+        return self._synchronize_event.wait(timeout=timeout)

+    def _set_acknowledgement(self):
+        """
+        Method to be called by the RPC server when an acknowledgement is received. Not to be called by the user.
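The acknowledgement handshake in `CriticalEvent` is a plain `threading.Event` rendezvous between the pushing side and the receiving side. A standalone sketch of the two halves (names here are illustrative, not the class's API):

```python
import threading
import time

ack = threading.Event()

def client_side():
    # stand-in for a client processing the pushed event ...
    time.sleep(0.05)
    ack.set()   # ... and acknowledging it, as _set_acknowledgement() would

t = threading.Thread(target=client_side)
t.start()
# the pushing side blocks until the flag is set or the timeout expires,
# mirroring receive_acknowledgement(timeout=...)
acknowledged = ack.wait(timeout=2)
t.join()
```

`Event.wait()` returns `True` as soon as the flag is set and `False` on timeout, which is exactly the boolean `receive_acknowledgement` propagates.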
+ """ + self._synchronize_event.set() + + +__all__ = [ + Event.__name__, +] \ No newline at end of file diff --git a/hololinked/server/exceptions.py b/hololinked/server/exceptions.py index 7deb3db..96a460c 100644 --- a/hololinked/server/exceptions.py +++ b/hololinked/server/exceptions.py @@ -12,9 +12,15 @@ class BreakAllLoops(Exception): class StateMachineError(Exception): """ - raise to show errors while calling methods or writing parameters in wrong state + raise to show errors while calling actions or writing properties in wrong state """ pass +class DatabaseError(Exception): + """ + raise to show database related errors + """ + + __all__ = ['BreakInnerLoop', 'BreakAllLoops', 'StateMachineError'] \ No newline at end of file diff --git a/hololinked/server/handlers.py b/hololinked/server/handlers.py index 1f45cfc..9caa22f 100644 --- a/hololinked/server/handlers.py +++ b/hololinked/server/handlers.py @@ -1,410 +1,398 @@ -# routing ideas from https://www.tornadoweb.org/en/branch6.3/routing.html -import traceback +import asyncio +import zmq.asyncio import typing -from typing import List, Dict, Any, Union, Callable, Tuple -from types import FunctionType +import uuid from tornado.web import RequestHandler, StaticFileHandler from tornado.iostream import StreamClosedError -from tornado.httputil import HTTPServerRequest -from time import perf_counter -from .constants import (IMAGE_STREAM, EVENT, CALLABLE, ATTRIBUTE, FILE) -from .serializers import JSONSerializer -from .path_converter import extract_path_parameters -from .zmq_message_brokers import MessageMappedZMQClientPool, EventConsumer -from .webserver_utils import log_request -from .remote_object import RemoteObject -from .eventloop import EventLoop -from .utils import current_datetime_ms_str -from .data_classes import (HTTPServerResourceData, HTTPServerEventData, - HTTPServerResourceData, FileServerData) +from .data_classes import HTTPResource, ServerSentEvent +from .utils import * +from .zmq_message_brokers import 
AsyncEventConsumer, EventConsumer -UnknownHTTPServerData = HTTPServerResourceData( - what = 'unknown', - instance_name = 'unknown', - fullpath='unknown', - instruction = 'unknown' -) +class BaseHandler(RequestHandler): + """ + Base request handler for RPC operations + """ + def initialize(self, resource : typing.Union[HTTPResource, ServerSentEvent], owner = None) -> None: + """ + Parameters + ---------- + resource: HTTPResource | ServerSentEvent + resource representation of Thing's exposed object using a dataclass + owner: HTTPServer + owner ``hololinked.server.HTTPServer.HTTPServer`` instance + """ + from .HTTPServer import HTTPServer + assert isinstance(owner, HTTPServer) + self.resource = resource + self.owner = owner + self.zmq_client_pool = self.owner.zmq_client_pool + self.serializer = self.owner.serializer + self.logger = self.owner.logger + self.allowed_clients = self.owner.allowed_clients + + def set_headers(self) -> None: + """ + override this to set custom headers without having to reimplement entire handler + """ + raise NotImplementedError("implement set headers in child class to automatically call it" + + " after directing the request to Thing") + + def get_execution_parameters(self) -> typing.Tuple[typing.Dict[str, typing.Any], + typing.Dict[str, typing.Any], typing.Union[float, int, None]]: + """ + merges all arguments to a single JSON body and retrieves execution context (like oneway calls, fetching executing + logs) and timeouts + """ + if len(self.request.body) > 0: + arguments = self.serializer.loads(self.request.body) + else: + arguments = dict() + if isinstance(arguments, dict): + if len(self.request.query_arguments) >= 1: + for key, value in self.request.query_arguments.items(): + if len(value) == 1: + arguments[key] = self.serializer.loads(value[0]) + else: + arguments[key] = [self.serializer.loads(val) for val in value] + context = dict(fetch_execution_logs=arguments.pop('fetch_execution_logs', False)) + timeout = arguments.pop('timeout', 
None)
+            if timeout is not None and timeout < 0:
+                timeout = None
+            if self.resource.request_as_argument:
+                arguments['request'] = self.request
+            return arguments, context, timeout
+        return arguments, dict(), 5 # arguments, context is empty, 5 second invocation timeout - hardcoded, needs to be fixed

+    @property
+    def has_access_control(self) -> bool:
+        """
+        Checks if a client is an allowed client. Requests from un-allowed clients are rejected without execution. Custom 
+        web handlers can use this property to check if a client has access control on the server or ``Thing``.
+        """
+        if len(self.allowed_clients) == 0:
+            self.set_header("Access-Control-Allow-Origin", "*")
+            return True
+        # For credential login, access control allow origin cannot be '*',
+        # See: https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS#examples_of_access_control_scenarios
+        origin = self.request.headers.get("Origin")
+        if origin is not None and (origin in self.allowed_clients or origin + '/' in self.allowed_clients):
+            self.set_header("Access-Control-Allow-Origin", origin)
+            return True
+        return False

+    def set_access_control_allow_headers(self) -> None:
+        """
+        For credential login, the access control allow headers cannot be a wildcard '*'. 
+        Some requests require an exact list of allowed headers for the client to access the response.
+        Use this method in a set_headers() override if necessary.
+        """
+        headers = ", ".join(self.request.headers.keys())
+        if self.request.headers.get("Access-Control-Request-Headers", None):
+            headers += ", " + self.request.headers["Access-Control-Request-Headers"]
+        self.set_header("Access-Control-Allow-Headers", headers)

-class FileHandlerResource(StaticFileHandler):
-    @classmethod
-    def get_absolute_path(cls, root: str, path: str) -> str:
-        """Returns the absolute location of ``path`` relative to ``root``.
-        ``root`` is the path configured for this `StaticFileHandler`
-        (in most cases the ``static_path`` `Application` setting).
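The `has_access_control` check reduces to a small pure function; sketched here with the Tornado header-setting replaced by a return value so the three branches are easy to test in isolation:

```python
def check_origin(origin, allowed_clients):
    # empty allow-list: every origin passes with a wildcard header
    if len(allowed_clients) == 0:
        return '*'
    # otherwise the Origin header must match an allowed client exactly,
    # with or without a trailing slash
    if origin is not None and (origin in allowed_clients
                               or origin + '/' in allowed_clients):
        return origin   # echo the specific origin back for credentialed CORS
    return None         # reject without execution
```

Echoing the specific origin rather than `*` matters because browsers refuse credentialed responses whose `Access-Control-Allow-Origin` is a wildcard.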
+class RPCHandler(BaseHandler): + """ + Handler for property read-write and method calls + """ - This class method may be overridden in subclasses. By default - it returns a filesystem path, but other strings may be used - as long as they are unique and understood by the subclass's - overridden `get_content`. + async def get(self) -> None: + """ + runs property or action if accessible by 'GET' method. Default for property reads. + """ + await self.handle_through_thing('GET') - .. versionadded:: 3.1 + async def post(self) -> None: """ - return root+path + runs property or action if accessible by 'POST' method. Default for action execution. + """ + await self.handle_through_thing('POST') + + async def patch(self) -> None: + """ + runs property or action if accessible by 'PATCH' method. + """ + await self.handle_through_thing('PATCH') + + async def put(self) -> None: + """ + runs property or action if accessible by 'PUT' method. Default for property writes. + """ + await self.handle_through_thing('PUT') + async def delete(self) -> None: + """ + runs property or action if accessible by 'DELETE' method. Default for property deletes. + """ + await self.handle_through_thing('DELETE') + + def set_headers(self) -> None: + """ + sets default headers for RPC (property read-write and action execution). The general headers are listed as follows: + .. 
code-block:: http -class BaseRequestHandler(RequestHandler): - """ - Defines functions common to Request Handlers - """ - zmq_client_pool : Union[MessageMappedZMQClientPool, None] = None - json_serializer : JSONSerializer - resources : Dict[str, Dict[str, Union[HTTPServerResourceData, HTTPServerEventData, - FileServerData]]] - own_resources : dict - local_objects : Dict[str, RemoteObject] - - def initialize(self, client_address : str, start_time : float) -> None: - self.client_address = client_address - self.start_time = start_time - - async def handle_client(self) -> None: - pass - - def prepare_arguments(self, path_arguments : typing.Optional[typing.Dict] = None) -> Dict[str, Any]: - arguments = {} - for key, value in self.request.query_arguments.items(): - if len(value) == 1: - arguments[key] = self.json_serializer.loads(value[0]) - else: - arguments[key] = [self.json_serializer.loads(val) for val in value] - if len(self.request.body) > 0: - arguments.update(self.json_serializer.loads(self.request.body)) - if path_arguments is not None: - arguments.update(path_arguments) - print(arguments) - return arguments - - async def handle_func(self, instruction : Tuple[Callable, bool], arguments): - func, iscoroutine = instruction, instruction.scada_info.iscoroutine - if iscoroutine: - return self.json_serializer.dumps({ - "responseStatusCode" : 200, - "returnValue" : await func(**arguments) - }) + Content-Type: application/json + Access-Control-Allow-Credentials: true + Access-Control-Allow-Origin: + """ + self.set_header("Content-Type" , "application/json") + self.set_header("Access-Control-Allow-Credentials", "true") + + async def options(self) -> None: + """ + Options for the resource. Main functionality is to inform the client whether a specific HTTP method is supported by + the property or the action (Access-Control-Allow-Methods).
+ """ + if self.has_access_control: + self.set_status(204) + self.set_access_control_allow_headers() + self.set_header("Access-Control-Allow-Credentials", "true") + self.set_header("Access-Control-Allow-Methods", ', '.join(self.resource.instructions.supported_methods())) else: - return self.json_serializer.dumps({ - "responseStatusCode" : 200, - "returnValue" : func(**arguments) - }) - - async def handle_bound_method(self, info : HTTPServerResourceData, arguments): - instance = self.local_objects[info.instance_name] - return self.json_serializer.dumps({ - "responseStatusCode" : 200, - "returnValue" : await EventLoop.execute_once(info.instance_name, instance, - info.instruction, arguments), - "state" : { - info.instance_name : instance.state_machine.current_state if instance.state_machine is not None else None - } - }) - - async def handle_instruction(self, info : HTTPServerResourceData, path_arguments : typing.Optional[typing.Dict] = None) -> None: - self.set_status(202) - self.add_header("Access-Control-Allow-Origin", self.client_address) - self.set_header("Content-Type" , "application/json") - try: - arguments = self.prepare_arguments(path_arguments) - context = dict(fetch_execution_logs = arguments.pop('fetch_execution_logs', False)) - timeout = arguments.pop('timeout', 3) - if info.http_request_as_argument: - arguments['request'] = self.request - if isinstance(info.instruction, FunctionType): - reply = await self.handle_func(info.instruction, arguments) # type: ignore - elif info.instance_name in self.local_objects: - reply = await self.handle_bound_method(info, arguments) - elif self.zmq_client_pool is None: - raise AttributeError("wrong resource finding logic - contact developer.") - else: - # let the body be decoded on the remote object side - reply = await self.zmq_client_pool.async_execute(info.instance_name, info.instruction, arguments, - context, False, timeout) # type: ignore - except Exception as E: - reply = self.json_serializer.dumps({ - 
"responseStatusCode" : 500, - "exception" : { - "message" : str(E), - "type" : repr(E).split('(', 1)[0], - "traceback" : traceback.format_exc().splitlines(), - "notes" : E.__notes__ if hasattr(E, "__notes__") else None # type: ignore - } - }) - if reply: - self.write(reply) - self.add_header("Execution-Time", f"{((perf_counter()-self.start_time)*1000.0):.4f}") + self.set_status(401, "forbidden") self.finish() + - async def handled_through_remote_object(self, info : HTTPServerRequest) -> bool: - data = self.resources["STATIC_ROUTES"].get(self.request.path, UnknownHTTPServerData) - if data.what == CALLABLE or data.what == ATTRIBUTE: - # Cannot use 'is' operator like 'if what is CALLABLE' because the remote object from which - # CALLABLE string was fetched is in another process - await self.handle_instruction(data) # type: ignore - return True - if data.what == EVENT: - return False - - for route in self.resources["DYNAMIC_ROUTES"].values(): - arguments = extract_path_parameters(self.request.path, route.path_regex, route.param_convertors) - if arguments and route.what != FILE: - await self.handle_instruction(route, arguments) - return True - return False - - async def handled_as_datastream(self, request : HTTPServerRequest) -> bool: - event_info = self.resources["STATIC_ROUTES"].get(self.request.path, UnknownHTTPServerData) - if event_info.what == EVENT: + async def handle_through_thing(self, http_method : str) -> None: + """ + handles the actual RPC call, called by each of the HTTP methods with their name as the argument. 
+ """ + if not self.has_access_control: + self.set_status(401, "forbidden") + elif http_method not in self.resource.instructions: + self.set_status(404, "not found") + else: + reply = None try: - event_consumer = EventConsumer(request.path, event_info.socket_address, - f"{request.path}_HTTPEvent@"+current_datetime_ms_str()) - except Exception as E: - reply = self.json_serializer.dumps({ - "responseStatusCode" : 500, - "exception" : { - "message" : str(E), - "type" : repr(E).split('(', 1)[0], - "traceback" : traceback.format_exc().splitlines(), - "notes" : E.__notes__ if hasattr(E, "__notes__") else None # type: ignore - } - }) - self.set_status(404) - self.add_header('Access-Control-Allow-Origin', self.client_address) - self.add_header('Content-Type' , 'application/json') + arguments, context, timeout = self.get_execution_parameters() + reply = await self.zmq_client_pool.async_execute( + instance_name=self.resource.instance_name, + instruction=self.resource.instructions.__dict__[http_method], + arguments=arguments, + context=context, + raise_client_side_exception=False, + invokation_timeout=timeout, + execution_timeout=None, + argument_schema=self.resource.argument_schema + ) # type: ignore + # message mapped client pool currently strips the data part from return message + # and provides that as reply directly + self.set_status(200, "ok") + except ConnectionAbortedError as ex: + self.set_status(503, str(ex)) + event_loop = asyncio.get_event_loop() + event_loop.call_soon(lambda : asyncio.create_task(self.owner.update_router_with_thing( + self.zmq_client_pool[self.resource.instance_name]))) + except ConnectionError as ex: + await self.owner.update_router_with_thing(self.zmq_client_pool[self.resource.instance_name]) + await self.handle_through_thing(http_method) # reschedule + return + except Exception as ex: + self.logger.error(f"error while scheduling RPC call - {str(ex)}") + self.logger.debug(f"traceback - {ex.__traceback__}") + self.set_status(500, "error while 
scheduling RPC call") + reply = self.serializer.dumps({"exception" : format_exception_as_json(ex)}) + self.set_headers() + if reply: self.write(reply) - self.finish() - return True - - self.set_status(200) - self.set_header('Access-Control-Allow-Origin', self.client_address) - self.set_header("Content-Type", "text/event-stream") - self.set_header("Cache-Control", "no-cache") - self.set_header("Connection", "keep-alive") - # Need to check if this helps as events with HTTP alone is not reliable - # self.set_header("Content-Security-Policy", "connect-src 'self' http://localhost:8085;") - data_header = b'data: %s\n\n' - while True: - try: - data = await event_consumer.receive_event() - if data: - # already JSON serialized - print(f"data sent") - self.write(data_header % data) - await self.flush() - except StreamClosedError: - break - except Exception as E: - print({ - "responseStatusCode" : 500, - "exception" : { - "message" : str(E), - "type" : repr(E).split('(', 1)[0], - "traceback" : traceback.format_exc().splitlines(), - "notes" : E.__notes__ if hasattr(E, "__notes__") else None # type: ignore - }}) - self.finish() - event_consumer.exit() - return True - return False + self.finish() + - async def handled_as_imagestream(self, request : HTTPServerRequest) -> bool: - event_info = self.resources["STATIC_ROUTES"].get(self.request.path, UnknownHTTPServerData) - if event_info.what == IMAGE_STREAM: - try: - event_consumer = EventConsumer(request.path, event_info.socket_address, - f"{request.path}_HTTPEvent@"+current_datetime_ms_str()) - except Exception as E: - reply = self.json_serializer.dumps({ - "responseStatusCode" : 500, - "exception" : { - "message" : str(E), - "type" : repr(E).split('(', 1)[0], - "traceback" : traceback.format_exc().splitlines(), - "notes" : E.__notes__ if hasattr(E, "__notes__") else None # type: ignore - } - }) - self.set_status(404) - self.add_header('Access-Control-Allow-Origin', self.client_address) - self.add_header('Content-Type' , 
'application/json') - self.write(reply) - self.finish() - return True - + +class EventHandler(BaseHandler): + """ + handles events emitted by ``Thing`` and tunnels them as HTTP SSE. + """ + + def set_headers(self) -> None: + """ + sets default headers for event handling. The general headers are listed as follows: + + .. code-block:: http + + Content-Type: text/event-stream + Cache-Control: no-cache + Connection: keep-alive + Access-Control-Allow-Credentials: true + Access-Control-Allow-Origin: + """ + self.set_header("Content-Type", "text/event-stream") + self.set_header("Cache-Control", "no-cache") + self.set_header("Connection", "keep-alive") + self.set_header("Access-Control-Allow-Credentials", "true") + + async def get(self): + """ + events are supported only with the GET method. + """ + if self.has_access_control: + self.set_headers() + await self.handle_datastream() + else: + self.set_status(401, "forbidden") + self.finish() + + async def options(self): + """ + options for the resource. + """ + if self.has_access_control: + self.set_status(204) + self.set_access_control_allow_headers() + self.set_header("Access-Control-Allow-Credentials", "true") + self.set_header("Access-Control-Allow-Methods", 'GET') + else: + self.set_status(401, "forbidden") + self.finish() + + def receive_blocking_event(self, event_consumer : EventConsumer): + return event_consumer.receive(timeout=10000, deserialize=False) + + async def handle_datastream(self) -> None: + """ + called by the GET method; handles the event stream. + """ + try: + event_consumer_cls = EventConsumer if hasattr(self.owner, '_zmq_event_context') else AsyncEventConsumer + # synchronous context with INPROC pub or asynchronous context with IPC or TCP pub - we handle both in async + # fashion as the HTTP server should run as a purely sync (or normal) python method.
+ event_consumer = event_consumer_cls(self.resource.unique_identifier, self.resource.socket_address, + identity=f"{self.resource.unique_identifier}|HTTPEvent|{uuid.uuid4()}", + logger=self.logger, json_serializer=self.serializer, + context=self.owner._zmq_event_context if self.resource.socket_address.startswith('inproc') else None) + event_loop = asyncio.get_event_loop() + data_header = b'data: %s\n\n' self.set_status(200) - self.set_header('Access-Control-Allow-Origin', self.client_address) + except Exception as ex: + self.logger.error(f"error while subscribing to event - {str(ex)}") + self.set_status(500, "could not subscribe to event source from thing") + self.write(self.serializer.dumps({"exception" : format_exception_as_json(ex)})) + return + + while True: + try: + if isinstance(event_consumer, AsyncEventConsumer): + data = await event_consumer.receive(timeout=10000, deserialize=False) + else: + data = await event_loop.run_in_executor(None, self.receive_blocking_event, event_consumer) + if data: + # already JSON serialized + self.write(data_header % data) + await self.flush() + self.logger.debug(f"new data sent - {self.resource.name}") + else: + self.logger.debug(f"found no new data") + except StreamClosedError: + break + except Exception as ex: + self.logger.error(f"error while pushing event - {str(ex)}") + self.write(data_header % self.serializer.dumps( + {"exception" : format_exception_as_json(ex)})) + try: + if isinstance(self.owner._zmq_event_context, zmq.asyncio.Context): + event_consumer.exit() + except Exception as ex: + self.logger.error(f"error while closing event consumer - {str(ex)}") + + +class ImageEventHandler(EventHandler): + """ + handles image events with an image data header + """ + + async def handle_datastream(self) -> None: + try: + event_consumer = AsyncEventConsumer(self.resource.unique_identifier, self.resource.socket_address, + f"{self.resource.unique_identifier}|HTTPEvent|{uuid.uuid4()}", + json_serializer=self.serializer,
logger=self.logger, + context=self.owner._zmq_event_context if self.resource.socket_address.startswith('inproc') else None) self.set_header("Content-Type", "application/x-mpegURL") - self.set_header("Cache-Control", "no-cache") - self.set_header("Connection", "keep-alive") self.write("#EXTM3U\n") - # Need to check if this helps as events with HTTP alone is not reliable delimiter = "#EXTINF:{},\n" data_header = b'data:image/jpeg;base64,%s\n' while True: try: - data = await event_consumer.receive_event() + data = await event_consumer.receive(timeout=10000, deserialize=False) if data: - # already JSON serialized + # already serialized self.write(delimiter) self.write(data_header % data) - print(f"image data sent {data[0:100]}") await self.flush() + self.logger.debug(f"new image sent - {self.resource.name}") + else: + self.logger.debug(f"found no new data") except StreamClosedError: break - except Exception as E: - print({ - "responseStatusCode" : 500, - "exception" : { - "message" : str(E), - "type" : repr(E).split('(', 1)[0], - "traceback" : traceback.format_exc().splitlines(), - "notes" : E.__notes__ if hasattr(E, "__notes__") else None # type: ignore - }}) - self.finish() + except Exception as ex: + self.logger.error(f"error while pushing event - {str(ex)}") + self.write(data_header % self.serializer.dumps( + {"exception" : format_exception_as_json(ex)})) event_consumer.exit() - return True - return False - - - - -class GetResource(BaseRequestHandler): + except Exception as ex: + self.write(data_header % self.serializer.dumps( + {"exception" : format_exception_as_json(ex)})) + - async def handled_as_filestream(self, request : HTTPServerRequest) -> bool: - """this method is wrong and does not work""" - for route in self.resources["DYNAMIC_ROUTES"].values(): - arguments = extract_path_parameters(self.request.path, route.path_regex, route.param_convertors) - if arguments and route.what == FILE: - file_handler = FileHandler(self.application, request, 
path=route.directory) - # file_handler.initialize(data.directory) # type: ignore - await file_handler.get(arguments["filename"]) - return True - return False - async def get(self): - # log_request(self.request) - if (await self.handled_through_remote_object(self.request)): - return - - elif (await self.handled_as_datastream(self.request)): - return - - elif (await self.handled_as_imagestream(self.request)): - return - - elif self.request.path in self.own_resources: - func = self.own_resources[self.request.path] - body = self.prepare_arguments() - func(self, body) - return - - self.set_status(404) - self.add_header("Access-Control-Allow-Origin", self.client_address) - self.finish() - - def paths(self, body): - self.set_status(200) - self.add_header("Access-Control-Allow-Origin", self.client_address) - self.add_header("Content-Type" , "application/json") - resources = dict( - GET = { - "STATIC_ROUTES" : GetResource.resources["STATIC_ROUTES"].keys(), - "DYNAMIC_ROUTES" : GetResource.resources["DYNAMIC_ROUTES"].keys() - }, - POST = { - "STATIC_ROUTES" : PostResource.resources["STATIC_ROUTES"].keys(), - "DYNAMIC_ROUTES" : PostResource.resources["DYNAMIC_ROUTES"].keys() - }, - PUT = { - "STATIC_ROUTES" : PutResource.resources["STATIC_ROUTES"].keys(), - "DYNAMIC_ROUTES" : PutResource.resources["DYNAMIC_ROUTES"].keys() - }, - DELETE = { - "STATIC_ROUTES" : DeleteResource.resources["STATIC_ROUTES"].keys(), - "DYNAMIC_ROUTES" : DeleteResource.resources["DYNAMIC_ROUTES"].keys() - }, - OPTIONS = { - "STATIC_ROUTES" : OptionsResource.resources["STATIC_ROUTES"].keys(), - "DYNAMIC_ROUTES" : OptionsResource.resources["DYNAMIC_ROUTES"].keys() - }, - ) - self.write(self.json_serializer.dumps(resources)) - self.finish() - - own_resources = { - '/paths' : paths, - } +class FileHandler(StaticFileHandler): + @classmethod + def get_absolute_path(cls, root: str, path: str) -> str: + """ + Returns the absolute location of ``path`` relative to ``root``. 
- -class PostResource(BaseRequestHandler): - - async def post(self): - # log_request(self.request) - if (await self.handled_through_remote_object(self.request)): - return - - # elif self.request.path in self.own_resources: - # func = self.own_resources[self.request.path] - # body = self.decode_body(self.request.body) - # func(self, body) - # return + ``root`` is the path configured for this `StaticFileHandler` + (in most cases the ``static_path`` `Application` setting). - self.set_status(404) - self.add_header("Access-Control-Allow-Origin", self.client_address) - self.finish() + This class method may be overridden in subclasses. By default + it returns a filesystem path, but other strings may be used + as long as they are unique and understood by the subclass's + overridden `get_content`. + .. versionadded:: 3.1 + """ + return root+path -class PutResource(BaseRequestHandler): - - async def put(self): - if (await self.handled_through_remote_object(self.request)): - return +class ThingsHandler(BaseHandler): + """ + add or remove things + """ + async def get(self): self.set_status(404) - self.add_header("Access-Control-Allow-Origin", self.client_address) self.finish() - - -class PatchResource(BaseRequestHandler): - async def patch(self): - - if (await self.handled_through_remote_object(self.request)): - return - - self.set_status(404) - self.add_header("Access-Control-Allow-Origin", self.client_address) + async def post(self): + if not self.has_access_control: + self.set_status(401, 'forbidden') + else: + try: + instance_name = "" + await self.zmq_client_pool.create_new(server_instance_name=instance_name) + await self.owner.update_router_with_thing(self.zmq_client_pool[instance_name]) + self.set_status(204, "ok") + except Exception as ex: + self.set_status(500, str(ex)) + self.set_headers() self.finish() - - -class DeleteResource(BaseRequestHandler): - - async def delete(self): - - if (await self.handled_through_remote_object(self.request)): - return - - 
self.set_status(404) - self.add_header("Access-Control-Allow-Origin", self.client_address) + async def options(self): + if self.has_access_control: + self.set_status(204) + self.set_access_control_allow_headers() + self.set_header("Access-Control-Allow-Credentials", "true") + self.set_header("Access-Control-Allow-Methods", 'GET, POST') + else: + self.set_status(401, "forbidden") self.finish() -class OptionsResource(BaseRequestHandler): - """ - this is wrong philosophically - """ - - async def options(self): - self.set_status(204) - self.set_header("Access-Control-Allow-Origin", "*") - self.set_header("Access-Control-Allow-Headers", "*") - self.set_header("Access-Control-Allow-Methods", 'GET, POST, PUT, DELETE, OPTIONS') - self.finish() diff --git a/hololinked/server/host_server.py b/hololinked/server/host_server.py deleted file mode 100644 index 5cfaa17..0000000 --- a/hololinked/server/host_server.py +++ /dev/null @@ -1,79 +0,0 @@ -from sqlalchemy_utils import create_database, database_exists -import logging - -from ..param.parameters import String, TypedList -from .HTTPServer import HTTPServer -from .eventloop import Consumer -from .host_utilities import (ReactClientUtilities, PrimaryHostUtilities, create_client_tables, create_server_tables, - SERVER_INSTANCE_NAME, CLIENT_HOST_INSTANCE_NAME) -from .database import create_DB_URL - - - -class PCHostServer(HTTPServer): - - consumers = TypedList(item_type=Consumer, default=None, allow_None=True, - doc="""Remote Object to be directly served within the HTTP server""") - - db_config_file = String(default='') - - def __init__(self, *, port = 8080, address = '0.0.0.0', log_level = logging.INFO, - db_config_file = 'host_db_config.json', json_serializer = None, protocol_version = 2, **kwargs): - super().__init__( - consumers = None, - port = port, - address = address, - logger = kwargs.get('logger', None), - log_level = log_level, - json_serializer = json_serializer, - protocol_version = protocol_version - ) - - - -class 
PrimaryHostServer(HTTPServer): - - consumers = TypedList(item_type=Consumer, default=None, allow_None=True, - doc="""Remote Object to be directly served within the HTTP server""") - - db_config_file = String(default='') - - def __init__(self, *, port = 8080, address = '0.0.0.0', log_level = logging.INFO, - db_config_file = 'host_db_config.json', json_serializer = None, protocol_version = 2, **kwargs): - super().__init__( - consumers = None, - port = port, - address = address, - logger = kwargs.get('logger', None), - log_level = log_level, - json_serializer = json_serializer, - protocol_version = protocol_version - ) - self.db_config_file = db_config_file - self.create_databases() - - @property - def all_ok(self, boolean=False): - super().all_ok - react_client_utilities = Consumer(ReactClientUtilities, db_config_file = self.db_config_file, - instance_name = CLIENT_HOST_INSTANCE_NAME, logger = self.logger) - server_side_utilities = Consumer(PrimaryHostUtilities, db_config_file = self.db_config_file, - instance_name = SERVER_INSTANCE_NAME, server_network_interface = 'Wi-Fi', - port = self.port, logger = self.logger) - self.consumers = [react_client_utilities, server_side_utilities] - return True - - def create_databases(self): - URL = create_DB_URL(self.db_config_file) - serverDB = f"{URL}/scadapyserver" - if not database_exists(serverDB): - create_database(serverDB) - create_server_tables(serverDB) - clientDB = f"{URL}/scadapyclient" - if not database_exists(clientDB): - create_database(clientDB) - create_client_tables(clientDB) - - - -__all__ = ['PCHostServer', 'PrimaryHostServer'] \ No newline at end of file diff --git a/hololinked/server/host_utilities.py b/hololinked/server/host_utilities.py deleted file mode 100644 index 63e5a0a..0000000 --- a/hololinked/server/host_utilities.py +++ /dev/null @@ -1,500 +0,0 @@ -import socket -import json -import asyncio -import typing -from dataclasses import dataclass, asdict, field - -from sqlalchemy import Integer, String, 
JSON, ARRAY, Boolean -from sqlalchemy import select, create_engine -from sqlalchemy.orm import Session -from sqlalchemy.orm import Mapped, mapped_column, DeclarativeBase -from argon2 import PasswordHasher -from tornado.httputil import HTTPServerRequest -from tornado.httpclient import AsyncHTTPClient, HTTPRequest - -from .serializers import JSONSerializer -from .remote_parameters import TypedList -from .zmq_message_brokers import MessageMappedZMQClientPool -from .webserver_utils import get_IP_from_interface, update_resources_using_client -from .utils import unique_id -from .decorators import post, get, put, delete -from .eventloop import Consumer, EventLoop, fork_empty_eventloop -from .remote_object import RemoteObject, RemoteObjectDB, RemoteObjectMetaclass -from .database import BaseAsyncDB - - -SERVER_INSTANCE_NAME = 'server-util' -CLIENT_HOST_INSTANCE_NAME = 'dashboard-util' - - -# /* -# We want to be able to do the following - -# 1) Add and remove server - -# 2) Each server -# - can be designated as host -# - allows DB operations specific to the client -# - only one such server can exist -# - or as normal instrument server -# - create new eventloop -# - create new device -# - create new HTTP servers -# - raw input output -# - have GUI JSON -# */ - - -class ReactClientUtilities(BaseAsyncDB, RemoteObject): - - class TableBase(DeclarativeBase): - pass - - class dashboards(TableBase): - __tablename__ = "dashboards" - - name : Mapped[str] = mapped_column(String(1024), primary_key = True) - URL : Mapped[str] = mapped_column(String(1024), unique = True) - description : Mapped[str] = mapped_column(String(16384)) - - def json(self): - return { - "name" : self.name, - "URL" : self.URL, - "description" : self.description - } - - class appsettings(TableBase): - __tablename__ = "appsettings" - - field : Mapped[str] = mapped_column(String(8192), primary_key = True) - value : Mapped[typing.Dict[str, typing.Any]] = mapped_column(JSON) - - def json(self): - return { - "field" : 
self.field, - "value" : self.value - } - - class login_credentials(TableBase): - __tablename__ = "login_credentials" - - username : Mapped[str] = mapped_column(String(1024), primary_key = True) - password : Mapped[str] = mapped_column(String(1024), unique = True) - - def __init__(self, db_config_file : str, **kwargs) -> None: - RemoteObject.__init__(self, **kwargs) - BaseAsyncDB.__init__(self, database='scadapyclient', serializer=self.json_serializer, - config_file=db_config_file) - - @post('/user/add') - async def add_user(self, username : str, password : str): - pass - - @post('/login') - async def login(self, username : str, password : str): - async with self.async_session() as session: - ph = PasswordHasher(time_cost = 500, memory_cost = 2) - stmt = select(self.login_credentials).filter_by(username = username) - data = await session.execute(stmt) - if data["password"] == ph.hash(password): - return True - return False - - @post("/app/settings/new") - async def create_app_setting(self, field : str, value : typing.Any): - async with self.async_session() as session, session.begin(): - session.add(self.appsettings( - field = field, - value = {"value" : value} - ) - ) - session.commit() - - @post("/app/settings/edit") - async def edit_app_setting(self, field : str, value : typing.Any): - async with self.async_session() as session, session.begin(): - stmt = select(self.appsettings).filter_by(field = field) - data = await session.execute(stmt) - setting = data.scalar() - setting.value = {"value" : value} - session.commit() - return setting - - @get('/app/settings/all') - async def all_app_settings(self): - async with self.async_session() as session: - stmt = select(self.appsettings) - data = await session.execute(stmt) - return {result[self.appsettings.__name__].field : result[self.appsettings.__name__].value["value"] - for result in data.mappings().all()} - - @get('/app/info/all') - async def all_app_settings(self): - async with self.async_session() as session: - 
stmt = select(self.appsettings) - data = await session.execute(stmt) - return { - "appsettings" : {result[self.appsettings.__name__].field : result[self.appsettings.__name__].value["value"] - for result in data.mappings().all()} - } - - @post('/dashboards/add') - async def add_dashboards(self, name : str, URL : str, description : str): - async with self.async_session() as session, session.begin(): - session.add(self.dashboards( - name = name, - URL = URL, - description = description - )) - await session.commit() - - @get('/dashboards/list') - async def query_pages(self): - async with self.async_session() as session: - stmt = select(self.dashboards) - data = await session.execute(stmt) - return [result[self.dashboards.__name__] for result in data.mappings().all()] - - - -@dataclass -class NonDBRemoteObjectInfo: - instance_name : str - classname : str - script : str - - def json(self): - return asdict(self) - - -@dataclass -class SubscribedHTTPServers: - hostname : str - IPAddress : typing.Any - port : int - type : str - https : bool - qualifiedIP : str = field(init = False) - - def __post_init__(self): - self.qualifiedIP = '{}:{}'.format(self.hostname, self.port) - - def json(self): - return asdict(self) - - -@dataclass -class UninstantiatedRemoteObject: - consumer : RemoteObjectMetaclass - file_name : str - object_name : str - eventloop_name : str - id : str - - def json(self): - return dict ( - id = self.id, - file_name = self.file_name, - object_name = self.object_name, - eventloop_name = self.eventloop_name - ) - - -class HTTPServerUtilities(BaseAsyncDB, RemoteObject): - """ - HTTPServerUtilities provide functionality to instantiate, kill or get current status of - existing remote-objects attached to this server, ability to subscribe to a Primary Host Server - which brings remote-objects to the web UI & spawn new event loops - """ - - type : str = 'NORMAL_REMOTE_OBJECT_SERVER' - - remote_object_info = TypedList(default=None, allow_None=True, 
item_type=(RemoteObjectDB.RemoteObjectInfo), - URL_path='/remote-object-info') - - def __init__(self, db_config_file : typing.Union[str, None], zmq_client_pool : MessageMappedZMQClientPool, - remote_object_info , **kwargs) -> None: - RemoteObject.__init__(self, **kwargs) - BaseAsyncDB.__init__(self, database='scadapyserver', serializer=self.json_serializer, config_file=db_config_file) - self.zmq_client_pool = zmq_client_pool - self.server_resources = None - self.remote_object_info = remote_object_info - self._uninstantiated_remote_objects : typing.Dict[str, UninstantiatedRemoteObject] = {} - - @post('/subscribe') - async def subscribe_to_host(self, host : str, port : int): - client = AsyncHTTPClient() - try: - R = await client.fetch(HTTPRequest( - url = "{}/{}/{}".format(host, SERVER_INSTANCE_NAME, 'subscription'), - method = 'POST', - body = JSONSerializer.general_dumps(dict( - hostname=socket.gethostname(), - port=port, - type=self.type, - https=True - )) - )) - except Exception as E: - self.logger.error(f"Could not subscribe to host {host}. error : {str(E)}, error type : {type(E)}.") - raise - if R.code == 200 or R.code == 202: - self.logger.info(f"subsribed successfully to host {host}") - else: - raise RuntimeError(f"could not subsribe to host {host}. response {json.loads(R.body)}") - # we lose the client anyway so we close it. 
if we decide to reuse the client, changes needed - client.close() - - @post('/eventloop/new') - def new_eventloop(self, instance_name, proxy_serializer, json_serializer): - fork_empty_eventloop(instance_name = instance_name) - self.zmq_client_pool.register_client(instance_name) - self.zmq_client_pool[instance_name].handshake() - - @post('/remote-object/import') - async def import_remote_object(self, file_name : str, object_name : str, eventloop_name : str): - consumer = EventLoop.import_remote_object(file_name, object_name) - id = unique_id().decode() - db_params = consumer.parameters.remote_objects_webgui_info(consumer.parameters.load_at_init_objects()) - self._uninstantiated_remote_objects[id] = UninstantiatedRemoteObject( - consumer = consumer, - file_name = file_name, - object_name = object_name, - eventloop_name = eventloop_name, - id = id - ) - return { - "id" : id, - "db_params" : db_params - } - - @delete('/remote-object/import') - async def del_imported_remote_object(self, id : str): - obj = self._uninstantiated_remote_objects.get(id, None) - if obj is None: - return False - elif isinstance(obj, str): - return await self.zmq_client_pool.async_execute(instance_name = obj, instruction = '/remote-object/import', - arguments = dict(id = obj) ) - else: - self.uninstantiated_remote_objects.pop(id) - return True - - @post('/remote-object/instantiate') - async def new_remote_object(self, id : str, kwargs : typing.Dict[str, typing.Any], db_params : typing.Dict[str, typing.Any]): - uninstantiated_remote_object = self._uninstantiated_remote_objects[id] - consumer = uninstantiated_remote_object.consumer - init_params = consumer.param_descriptors.load_at_init_objects() - for name, value in db_params.items(): - init_params[name].validate_and_adapt(value) - if uninstantiated_remote_object.eventloop_name not in self.zmq_client_pool: - fork_empty_eventloop(instance_name = uninstantiated_remote_object.eventloop_name) - 
-            self.zmq_client_pool.register_client(uninstantiated_remote_object.eventloop_name)
-            await self.zmq_client_pool[uninstantiated_remote_object.eventloop_name].handshake_complete()
-        await self.zmq_client_pool[uninstantiated_remote_object.eventloop_name].async_execute(
-            '/remote-object/instantiate', arguments = dict(
-                file_name = uninstantiated_remote_object.file_name,
-                object_name = uninstantiated_remote_object.object_name,
-                kwargs = kwargs), raise_client_side_exception = True
-        )
-        if not kwargs.get('instance_name') in self.zmq_client_pool:
-            self.zmq_client_pool.register_client(kwargs.get('instance_name'))
-            await self.zmq_client_pool[kwargs.get('instance_name')].handshake_complete()
-        await update_resources_using_client(self.server_resources,
-                self.remote_object_info, self.zmq_client_pool[kwargs.get('instance_name')])
-        await update_resources_using_client(self.server_resources,
-                self.remote_object_info, self.zmq_client_pool[uninstantiated_remote_object.eventloop_name])
-        self._uninstantiated_remote_objects.pop(id, None)
-
-    @get('/remote_objects/ping')
-    async def ping_consumers(self):
-        return await self.zmq_client_pool.ping_all_servers()
-
-    @get('/remote_objects/state')
-    async def consumers_state(self):
-        return await self.zmq_client_pool.async_execute_in_all_remote_objects('/state', context = {
-            "plain_reply" : True
-        })
-
-    @get('/uninstantiated-remote-objects')
-    async def uninstantiated_remote_objects(self):
-        organised_reply = await self.zmq_client_pool.async_execute_in_all_eventloops('/uninstantiated-remote-objects/read',
-            context = {
-                "plain_reply" : True
-            })
-        organised_reply[self.instance_name] = self._uninstantiated_remote_objects
-        return organised_reply
-
-    @get('/info/all')
-    async def info(self):
-        consumers_state, uninstantiated_remote_objects = await asyncio.gather(self.consumers_state(),
-                                                            self.uninstantiated_remote_objects())
-        return dict(
-            remoteObjectState = consumers_state,
-            remoteObjectInfo = self.remote_object_info,
-            uninstantiatedRemoteObjects = uninstantiated_remote_objects
-        )
-
-
-
-
-class PCHostUtilities(HTTPServerUtilities):
-
-    type : str = 'PC_HOST'
-
-    def __init__(self, db_config_file : str, server_network_interface : str, port : int, **kwargs) -> None:
-        super().__init__(db_config_file = db_config_file, zmq_client_pool = None, remote_object_info = None, **kwargs)
-        self.subscribers : typing.List[SubscribedHTTPServers] = []
-        self.own_info = SubscribedHTTPServers(
-            hostname=socket.gethostname(),
-            IPAddress=get_IP_from_interface(server_network_interface),
-            port=port,
-            type=self.type,
-            https=False
-        )
-
-    @post('/subscription')
-    def subscription(self, hostname : str, port : int, type : str, https : bool, *, request : HTTPServerRequest):
-        server = SubscribedHTTPServers(
-            hostname=hostname,
-            IPAddress=request.remote_ip,
-            port=port,
-            type=type,
-            https=https
-        )
-        self.subscribers.append(server)
-
-    @get('/subscribers')
-    def get_subscribers(self):
-        return {"subscribers" : self.subscribers + [self.own_info]}
-
-    @post('/starter/run')
-    async def starter(self):
-        pass
-
-
-class PrimaryHostUtilities(PCHostUtilities):
-
-    type : str = 'PRIMARY_HOST'
-
-    class TableBase(DeclarativeBase):
-        pass
-
-    class http_server(TableBase):
-        __tablename__ = "http_servers"
-
-        hostname : Mapped[str] = mapped_column(String, primary_key = True)
-        type : Mapped[str] = mapped_column(String)
-        port : Mapped[int] = mapped_column(Integer)
-        IPAddress : Mapped[str] = mapped_column(String)
-        remote_objects : Mapped[typing.List[str]] = mapped_column(ARRAY(String))
-
-    def __init__(self, db_config_file : str, server_network_interface : str, port : int, **kwargs) -> None:
-        super().__init__(db_config_file = db_config_file, server_network_interface = server_network_interface,
-                        port = port, **kwargs)
-        self.own_info = SubscribedHTTPServers(
-            hostname = socket.gethostname(),
-            IPAddress = get_IP_from_interface(server_network_interface),
-            port = port,
-            type = self.type,
-            https=False
-        )
-
-# remote_object_info = [dict(
-#     instance_name = 'server-util',
-#     **self.class_info()
-# ),
-# dict(
-#     instance_name = 'dashboard-util',
-#     classname = ReactClientUtilities.__name__,
-#     script = os.path.dirname(os.path.abspath(inspect.getfile(ReactClientUtilities)))
-# )],
-
-# remote_object_info = [dict(
-#     instance_name = 'server-util',
-#     **self.class_info()
-# )],
-
-
-
-def create_server_tables(serverDB):
-    engine = create_engine(serverDB)
-    PrimaryHostUtilities.TableBase.metadata.create_all(engine)
-    RemoteObjectDB.TableBase.metadata.create_all(engine)
-    engine.dispose()
-
-def create_client_tables(clientDB):
-    engine = create_engine(clientDB)
-    ReactClientUtilities.TableBase.metadata.create_all(engine)
-    with Session(engine) as session, session.begin():
-        # Pages
-        session.add(ReactClientUtilities.appsettings(
-            field = 'dashboardsDeleteWithoutAsking',
-            value = {'value' : True}
-        ))
-        session.add(ReactClientUtilities.appsettings(
-            field = 'dashboardsShowRecentlyUsed',
-            value = {'value' : True}
-        ))
-
-        # login page
-        session.add(ReactClientUtilities.appsettings(
-            field = 'loginFooter',
-            value = {'value' : ''}
-        ))
-        session.add(ReactClientUtilities.appsettings(
-            field = 'loginFooterLink',
-            value = {'value' : ''}
-        ))
-        session.add(ReactClientUtilities.appsettings(
-            field = 'loginDisplayFooter',
-            value = {'value' : True}
-        ))
-
-        # server
-        session.add(ReactClientUtilities.appsettings(
-            field = 'serversAllowHTTP',
-            value = {'value' : False}
-        ))
-
-        # remote object wizard
-        session.add(ReactClientUtilities.appsettings(
-            field = 'remoteObjectViewerConsoleStringifyOutput',
-            value = {'value' : False}
-        ))
-        session.add(ReactClientUtilities.appsettings(
-            field = 'remoteObjectViewerConsoleDefaultMaxEntries',
-            value = {'value' : 15}
-        ))
-        session.add(ReactClientUtilities.appsettings(
-            field = 'remoteObjectViewerConsoleDefaultWindowSize',
-            value = {'value' : 500}
-        ))
-        session.add(ReactClientUtilities.appsettings(
-            field = 'remoteObjectViewerConsoleDefaultFontSize',
-            value = {'value' : 16}
-        ))
-
-        session.add(ReactClientUtilities.appsettings(
-            field = 'logViewerStringifyOutput',
-            value = {'value' : False}
-        ))
-        session.add(ReactClientUtilities.appsettings(
-            field = 'logViewerDefaultMaxEntries',
-            value = {'value' : 10}
-        ))
-        session.add(ReactClientUtilities.appsettings(
-            field = 'logViewerDefaultOutputWindowSize',
-            value = {'value' : 1000}
-        ))
-        session.add(ReactClientUtilities.appsettings(
-            field = 'logViewerDefaultFontSize',
-            value = {'value' : 16}
-        ))
-
-    session.commit()
-    engine.dispose()
-
-
-__all__ = ['ReactClientUtilities']
\ No newline at end of file
diff --git a/hololinked/server/logger.py b/hololinked/server/logger.py
new file mode 100644
index 0000000..b97de7e
--- /dev/null
+++ b/hololinked/server/logger.py
@@ -0,0 +1,205 @@
+import logging
+import typing
+import datetime
+import threading
+import asyncio
+import time
+from collections import deque
+
+from .constants import HTTP_METHODS
+from .events import Event
+from .property import Property
+from .properties import Integer, Number
+from .thing import Thing as RemoteObject
+from .action import action as remote_method
+
+
+
+class ListHandler(logging.Handler):
+    """
+    Log history handler. Add and remove this handler to hold a bunch of specific logs. Currently used by execution context
+    within ``EventLoop`` where one can fetch the execution logs while an action is being executed.
+    """
+
+    def __init__(self, log_list : typing.Optional[typing.List] = None):
+        super().__init__()
+        self.log_list : typing.List[typing.Dict] = [] if not log_list else log_list
+
+    def emit(self, record : logging.LogRecord):
+        self.log_list.insert(0, {
+            'level' : record.levelname,
+            'timestamp' : datetime.datetime.fromtimestamp(record.created).strftime("%Y-%m-%dT%H:%M:%S.%f"),
+            'thread_id' : threading.get_ident(),
+            'message' : record.msg
+        })
+
+
+
+class RemoteAccessHandler(logging.Handler, RemoteObject):
+    """
+    Log handler with remote access, attached to a ``Thing``'s logger if logger_remote_access is True.
+    The schema of the pushed logs is a list containing a dictionary for each log message with
+    the following fields:
+
+    .. code-block:: python
+
+        {
+            "level" : str,
+            "timestamp" : str,
+            "thread_id" : int,
+            "message" : str
+        }
+    """
+
+    def __init__(self, instance_name : str = 'logger', maxlen : int = 500, stream_interval : float = 1.0,
+                **kwargs) -> None:
+        """
+        Parameters
+        ----------
+        instance_name: str, default 'logger'
+            instance name of the object; generally only one instance is necessary per ``Thing``, therefore
+            defaults to 'logger'
+        maxlen: int, default 500
+            history of log entries to store in RAM
+        stream_interval: float, default 1.0
+            when streaming logs using the log-events endpoint, this value is the stream interval.
+        **kwargs:
+            len_debug: int
+                length of debug logs, default maxlen/5
+            len_info: int
+                length of info logs, default maxlen/5
+            len_warn: int
+                length of warn logs, default maxlen/5
+            len_error: int
+                length of error logs, default maxlen/5
+            len_critical: int
+                length of critical logs, default maxlen/5
+        """
+        logging.Handler.__init__(self)
+        RemoteObject.__init__(self, instance_name=instance_name, **kwargs)
+        self.set_maxlen(maxlen, **kwargs)
+        self.stream_interval = stream_interval
+        self.diff_logs = []
+        self.event = Event('log-events')
+        self._push_events = False
+        self._events_thread = None
+
+    stream_interval = Number(default=1.0, bounds=(0.025, 60.0), crop_to_bounds=True, step=0.05,
+                    URL_path='/stream-interval', doc="interval at which logs should be published to a client.")
+
+    def get_maxlen(self):
+        return self._maxlen
+
+    def set_maxlen(self, value, **kwargs):
+        self._maxlen = value
+        self._debug_logs = deque(maxlen=kwargs.pop('len_debug', int(value/5)))
+        self._info_logs = deque(maxlen=kwargs.pop('len_info', int(value/5)))
+        self._warn_logs = deque(maxlen=kwargs.pop('len_warn', int(value/5)))
+        self._error_logs = deque(maxlen=kwargs.pop('len_error', int(value/5)))
+        self._critical_logs = deque(maxlen=kwargs.pop('len_critical', int(value/5)))
+        self._execution_logs = deque(maxlen=value)
+
+    maxlen = Integer(default=100, bounds=(1, None), crop_to_bounds=True, URL_path='/maxlen',
+                fget=get_maxlen, fset=set_maxlen, doc="length of execution log history to store")
+
+
+    @remote_method(http_method=HTTP_METHODS.POST, URL_path='/events/start')
+    def push_events(self, scheduling : str = 'threading', stream_interval : float = 1) -> None:
+        """
+        Push events to a client. This method is intended to be called remotely for
+        debugging the Thing.
+
+        Parameters
+        ----------
+        scheduling: str
+            'threading' or 'asyncio'. 'threading' starts a new thread, 'asyncio' schedules a task on the
+            main event loop.
+        stream_interval: float
+            interval of push in seconds.
+        """
+        self.stream_interval = stream_interval
+        self._push_events = True
+        if scheduling == 'asyncio':
+            asyncio.get_event_loop().call_soon(lambda : asyncio.create_task(self._async_push_diff_logs()))
+        elif scheduling == 'threading':
+            if self._events_thread is None: # don't create again if one is already running
+                self._events_thread = threading.Thread(target=self._push_diff_logs)
+                self._events_thread.start()
+        else:
+            raise ValueError(f"scheduling can only be 'threading' or 'asyncio'. Given value {scheduling}")
+
+    @remote_method(http_method=HTTP_METHODS.POST, URL_path='/events/stop')
+    def stop_events(self) -> None:
+        """
+        stop pushing events
+        """
+        self._push_events = False
+        if self._events_thread: # the coroutine variant will stop automatically
+            self._events_thread.join()
+            self._owner.logger.debug(f"joined log event source with thread-id {self._events_thread.ident}.")
+            self._events_thread = None
+
+    def emit(self, record : logging.LogRecord):
+        info = {
+            'level' : record.levelname,
+            'timestamp' : datetime.datetime.fromtimestamp(record.created).strftime("%Y-%m-%dT%H:%M:%S.%f"),
+            'thread_id' : threading.get_ident(),
+            'message' : record.msg
+        }
+        if record.levelno < logging.INFO:
+            self._debug_logs.appendleft(info)
+        elif record.levelno >= logging.INFO and record.levelno < logging.WARN:
+            self._info_logs.appendleft(info)
+        elif record.levelno >= logging.WARN and record.levelno < logging.ERROR:
+            self._warn_logs.appendleft(info)
+        elif record.levelno >= logging.ERROR and record.levelno < logging.CRITICAL:
+            self._error_logs.appendleft(info)
+        elif record.levelno >= logging.CRITICAL:
+            self._critical_logs.appendleft(info)
+        self._execution_logs.appendleft(info)
+
+        if self._push_events:
+            self.diff_logs.insert(0, info)
+
+    def _push_diff_logs(self) -> None:
+        self._push_events = True
+        while self._push_events:
+            time.sleep(self.stream_interval)
+            if len(self.diff_logs) > 0:
+                self.event.push(self.diff_logs)
+                self.diff_logs.clear()
+        # give time to collect final logs with certainty
+        self._owner.logger.info(f"ending log event source with thread-id {threading.get_ident()}.")
+
+    async def _async_push_diff_logs(self) -> None:
+        while self._push_events:
+            await asyncio.sleep(self.stream_interval)
+            self.event.push(self.diff_logs)
+            self.diff_logs.clear()
+        self._owner.logger.info("ending log events.")
+
+    debug_logs = Property(readonly=True, URL_path='/logs/debug', fget=lambda self: self._debug_logs,
+                        doc="logs at logging.DEBUG level")
+
+    warn_logs = Property(readonly=True, URL_path='/logs/warn', fget=lambda self: self._warn_logs,
+                        doc="logs at logging.WARN level")
+
+    info_logs = Property(readonly=True, URL_path='/logs/info', fget=lambda self: self._info_logs,
+                        doc="logs at logging.INFO level")
+
+    error_logs = Property(readonly=True, URL_path='/logs/error', fget=lambda self: self._error_logs,
+                        doc="logs at logging.ERROR level")
+
+    critical_logs = Property(readonly=True, URL_path='/logs/critical', fget=lambda self: self._critical_logs,
+                        doc="logs at logging.CRITICAL level")
+
+    execution_logs = Property(readonly=True, URL_path='/logs/execution', fget=lambda self: self._execution_logs,
+                        doc="logs at all levels accumulated in order of collection/execution")
+
+
+
+__all__ = [
+    ListHandler.__name__,
+    RemoteAccessHandler.__name__
+]
\ No newline at end of file
diff --git a/hololinked/server/path_converter.py b/hololinked/server/path_converter.py
deleted file mode 100644
index e0340b2..0000000
--- a/hololinked/server/path_converter.py
+++ /dev/null
@@ -1,213 +0,0 @@
-# copied or adapted from Starlette - https://github.com/encode/starlette
-
-# see following license
-"""
-Copyright © 2018, [Encode OSS Ltd](https://www.encode.io/).
-All rights reserved.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are met:
-
-* Redistributions of source code must retain the above copyright notice, this
-  list of conditions and the following disclaimer.
-
-* Redistributions in binary form must reproduce the above copyright notice,
-  this list of conditions and the following disclaimer in the documentation
-  and/or other materials provided with the distribution.
-
-* Neither the name of the copyright holder nor the names of its
-  contributors may be used to endorse or promote products derived from
-  this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
-AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
-DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
-FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
-DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
-SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
-CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
-OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-"""
-
-import typing
-import re
-import math
-import typing
-import uuid
-
-
-T = typing.TypeVar("T")
-
-
-class Convertor(typing.Generic[T]):
-    regex: typing.ClassVar[str] = ""
-
-    def convert(self, value: str) -> T:
-        raise NotImplementedError()  # pragma: no cover
-
-    def to_string(self, value: T) -> str:
-        raise NotImplementedError()  # pragma: no cover
-
-
-class StringConvertor(Convertor):
-    regex = "[^/]+"
-
-    def convert(self, value: str) -> str:
-        return value
-
-    def to_string(self, value: str) -> str:
-        value = str(value)
-        assert "/" not in value, "May not contain path separators"
-        assert value, "Must not be empty"
-        return value
-
-
-class PathConvertor(Convertor):
-    regex = ".*"
-
-    def convert(self, value: str) -> str:
-        return str(value)
-
-    def to_string(self, value: str) -> str:
-        return str(value)
-
-
-class IntegerConvertor(Convertor):
-    regex = "[0-9]+"
-
-    def convert(self, value: str) -> int:
-        return int(value)
-
-    def to_string(self, value: int) -> str:
-        value = int(value)
-        assert value >= 0, "Negative integers are not supported"
-        return str(value)
-
-
-class FloatConvertor(Convertor):
-    regex = r"[0-9]+(\.[0-9]+)?"
-
-    def convert(self, value: str) -> float:
-        return float(value)
-
-    def to_string(self, value: float) -> str:
-        value = float(value)
-        assert value >= 0.0, "Negative floats are not supported"
-        assert not math.isnan(value), "NaN values are not supported"
-        assert not math.isinf(value), "Infinite values are not supported"
-        return ("%0.20f" % value).rstrip("0").rstrip(".")
-
-
-class UUIDConvertor(Convertor):
-    regex = "[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}"
-
-    def convert(self, value: str) -> uuid.UUID:
-        return uuid.UUID(value)
-
-    def to_string(self, value: uuid.UUID) -> str:
-        return str(value)
-
-
-CONVERTOR_TYPES = {
-    "str": StringConvertor(),
-    "path": PathConvertor(),
-    "int": IntegerConvertor(),
-    "float": FloatConvertor(),
-    "uuid": UUIDConvertor(),
-}
-
-
-def register_url_convertor(key : str, convertor : Convertor) -> None:
-    assert isinstance(convertor, Convertor), "Only subclass of convertor is allowed"
-    CONVERTOR_TYPES[key] = convertor
-
-
-PARAM_REGEX = re.compile("{([a-zA-Z_][a-zA-Z0-9_]*)(:[a-zA-Z_][a-zA-Z0-9_]*)?}")
-
-
-def compile_path(
-    path: str,
-):
-    """
-    Given a path string, like: "/{username:str}",
-    or a host string, like: "{subdomain}.mydomain.org", return a three-tuple
-    of (regex, format, {param_name:convertor}).
-
-    regex: "/(?P<username>[^/]+)"
-    format: "/{username}"
-    convertors: {"username": StringConvertor()}
-    """
-    is_host = not path.startswith("/")
-
-    path_regex = "^"
-    path_format = ""
-    duplicated_params = set()
-
-    idx = 0
-    param_convertors = {}
-    for match in PARAM_REGEX.finditer(path):
-        param_name, convertor_type = match.groups("str")
-        convertor_type = convertor_type.lstrip(":")
-        assert (
-            convertor_type in CONVERTOR_TYPES
-        ), f"Unknown path convertor '{convertor_type}'"
-        convertor = CONVERTOR_TYPES[convertor_type]
-
-        path_regex += re.escape(path[idx : match.start()])
-        path_regex += f"(?P<{param_name}>{convertor.regex})"
-
-        path_format += path[idx : match.start()]
-        path_format += "{%s}" % param_name
-
-        if param_name in param_convertors:
-            duplicated_params.add(param_name)
-
-        param_convertors[param_name] = convertor
-
-        idx = match.end()
-
-    if duplicated_params:
-        names = ", ".join(sorted(duplicated_params))
-        ending = "s" if len(duplicated_params) > 1 else ""
-        raise ValueError(f"Duplicated param name{ending} {names} at path {path}")
-
-    if is_host:
-        # Align with `Host.matches()` behavior, which ignores port.
-        hostname = path[idx:].split(":")[0]
-        path_regex += re.escape(hostname) + "$"
-    else:
-        path_regex += re.escape(path[idx:]) + "$"
-
-    path_format += path[idx:]
-    return re.compile(path_regex), path_format, param_convertors
-
-
-def replace_params(
-    path: str,
-    param_convertors: typing.Dict[str, Convertor],
-    path_params: typing.Dict[str, str],
-) -> typing.Tuple[str, dict]:
-    for key, value in list(path_params.items()):
-        if "{" + key + "}" in path:
-            convertor = param_convertors[key]
-            value = convertor.to_string(value)
-            path = path.replace("{" + key + "}", value)
-            path_params.pop(key)
-    return path, path_params
-
-
-def extract_path_parameters(full_path : str, path_regex : typing.Pattern, param_convertors : typing.Dict[str, Convertor]):
-    match = path_regex.fullmatch(full_path)
-    if match:
-        matched_params = match.groupdict()
-        for key, value in matched_params.items():
-            matched_params[key] = param_convertors[key].convert(value)
-        return matched_params
-    return None
-
-
-
-
-
-
diff --git a/hololinked/server/remote_parameters.py b/hololinked/server/properties.py
similarity index 92%
rename from hololinked/server/remote_parameters.py
rename to hololinked/server/properties.py
index 3bc316c..a0e385f 100644
--- a/hololinked/server/remote_parameters.py
+++ b/hololinked/server/properties.py
@@ -4,8 +4,6 @@ import datetime as dt
 import typing
 import numbers
-import sys
-import functools
 import collections.abc
 from enum import Enum
 from collections import OrderedDict
@@ -16,29 +14,24 @@ from ..param.parameters import (TypeConstrainedList, TypeConstrainedDict,
                 abbreviate_paths, TypedKeyMappingsConstrainedDict,
                 resolve_path, concrete_descendents, named_objs)
-from .remote_parameter import RemoteParameter
-from .constants import USE_OBJECT_NAME, GET, PUT
+from .property import Property
+from .constants import USE_OBJECT_NAME, HTTP_METHODS
+GET = HTTP_METHODS.GET
+PUT = HTTP_METHODS.PUT

-class String(RemoteParameter):
-    """
-    A string parameter with a default value
and optional regular expression (regex) matching. - - Example of using a regex to implement IPv4 address matching:: - class IPAddress(String): - '''IPv4 address as a string (dotted decimal notation)''' - def __init__(self, default="0.0.0.0", allow_None=False, **kwargs): - ip_regex = r'^((25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$' - super(IPAddress, self).__init__(default=default, regex=ip_regex, **kwargs) +class String(Property): + """ + A string property with optional regular expression (regex) matching. """ __slots__ = ['regex'] def __init__(self, default : typing.Optional[str] = "", *, regex : typing.Optional[str] = None, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, @@ -49,7 +42,7 @@ def __init__(self, default : typing.Optional[str] = "", *, regex : typing.Option deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.regex = regex def validate_and_adapt(self, value : typing.Any) -> str: @@ -98,11 +91,11 @@ def isinstance(cls, value : typing.Any, regex : typing.Optional[str] = None, all class Bytes(String): """ - A bytes parameter with a default value and optional regular + A bytes property with a default value and optional regular expression (regex) matching. 
- Similar to the string parameter, but instead of type basestring - this parameter only allows objects of type bytes (e.g. b'bytes'). + Similar to the string property, but instead of type basestring + this property only allows objects of type bytes (e.g. b'bytes'). """ @classmethod def _assert(obj, value : typing.Any, regex : typing.Optional[bytes] = None, allow_None : bool = False) -> None: @@ -133,14 +126,14 @@ def _assert(obj, value : typing.Any, regex : typing.Optional[bytes] = None, allo -class IPAddress(RemoteParameter): - +class IPAddress(Property): + __slots__ = ['allow_localhost', 'allow_ipv4', 'allow_ipv6'] def __init__(self, default : typing.Optional[str] = "0.0.0.0", *, allow_ipv4 : bool = True, allow_ipv6 : bool = True, allow_localhost : bool = True, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, allow_None : bool = False, per_instance_descriptor : bool = False, deepcopy_default : bool = False, @@ -151,7 +144,7 @@ def __init__(self, default : typing.Optional[str] = "0.0.0.0", *, allow_ipv4 : b deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.allow_localhost = allow_localhost self.allow_ipv4 = allow_ipv4 self.allow_ipv6 = allow_ipv6 @@ -328,12 +321,12 @@ def isipv6cidr(obj, value : str) -> bool: -class Number(RemoteParameter): +class Number(Property): """ - A numeric parameter with a default value and optional bounds. 
+ A numeric property with a default value and optional bounds. There are two types of bounds: ``bounds`` and - ``softbounds``. ``bounds`` are hard bounds: the parameter must + ``softbounds``. ``bounds`` are hard bounds: the property must have a value within the specified range. The default bounds are (None,None), meaning there are actually no hard bounds. One or both bounds can be set by specifying a value @@ -347,7 +340,7 @@ class Number(RemoteParameter): bounds, or one that is not numeric, results in an exception. As a special case, if allow_None=True (which is true by default if - the parameter has a default of None when declared) then a value + the property has a default of None when declared) then a value of None is also allowed. A separate function set_in_bounds() is provided that will @@ -355,7 +348,7 @@ class Number(RemoteParameter): in, for instance, a GUI. ``softbounds`` are present to indicate the typical range of - the parameter, but are not enforced. Setting the soft bounds + the property, but are not enforced. Setting the soft bounds allows, for instance, a GUI to know what values to display on sliders for the Number. 
@@ -364,12 +357,14 @@ class Number(RemoteParameter): """ + type = 'number' + __slots__ = ['bounds', 'inclusive_bounds', 'crop_to_bounds', 'dtype', 'step'] def __init__(self, default : typing.Optional[typing.Union[float, int]] = 0.0, *, bounds : typing.Optional[typing.Tuple] = None, crop_to_bounds : bool = False, inclusive_bounds : typing.Tuple = (True,True), step : typing.Any = None, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, per_instance_descriptor : bool = False, deepcopy_default : bool = False, @@ -379,7 +374,7 @@ def __init__(self, default : typing.Optional[typing.Union[float, int]] = 0.0, *, allow_None=allow_None, per_instance_descriptor=per_instance_descriptor, deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.bounds = bounds self.crop_to_bounds = crop_to_bounds self.inclusive_bounds = inclusive_bounds @@ -398,7 +393,7 @@ def set_in_bounds(self, obj : typing.Union[Parameterized, typing.Any], value : t def _crop_to_bounds(self, value : typing.Union[int, float]) -> typing.Union[int, float]: """ Return the given value cropped to be within the hard bounds - for this parameter. + for this property. If a numeric value is passed in, check it is within the hard bounds. 
If it is larger than the high bound, return the high @@ -453,15 +448,15 @@ def _assert(obj, value, dtype : typing.Tuple, bounds : typing.Optional[typing.Tu raise_ValueError("given value must be at most {}, not {}.".format(vmax, value), obj) else: if not value < vmax: - raise_ValueError("Parameter must be less than {}, not {}.".format(vmax, value), obj) + raise_ValueError("Property must be less than {}, not {}.".format(vmax, value), obj) if vmin is not None: if incmin is True: if not value >= vmin: - raise_ValueError("Parameter must be at least {}, not {}.".format(vmin, value), obj) + raise_ValueError("Property must be at least {}, not {}.".format(vmin, value), obj) else: if not value > vmin: - raise_ValueError("Parameter must be greater than {}, not {}.".format(vmin, value), obj) + raise_ValueError("Property must be greater than {}, not {}.".format(vmin, value), obj) return value def _validate_step(self, value : typing.Any) -> None: @@ -499,15 +494,15 @@ def isnumber(cls, value : typing.Any) -> bool: class Integer(Number): - """Numeric Parameter required to be an Integer""" + + """Numeric Property required to be an Integer""" def __init__(self, default : typing.Optional[int] = 0, *, bounds : typing.Optional[typing.Tuple] = None, crop_to_bounds : bool = False, inclusive_bounds : typing.Tuple = (True,True), step : typing.Any = None, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : 
typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -516,7 +511,7 @@ def __init__(self, default : typing.Optional[int] = 0, *, bounds : typing.Option deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote, step=step) self.dtype = (int,) def _validate_step(self, step : int): @@ -525,15 +520,14 @@ def _validate_step(self, step : int): -class Boolean(RemoteParameter): - """Binary or tristate Boolean Parameter.""" +class Boolean(Property): + """Binary or tristate Boolean Property.""" def __init__(self, default : typing.Optional[bool] = False, *, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -541,7 +535,7 @@ def __init__(self, default : typing.Optional[bool] = False, *, allow_None=allow_None, per_instance_descriptor=per_instance_descriptor, deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, 
db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) def validate_and_adapt(self, value : typing.Any) -> bool: if not isinstance(value, bool): @@ -550,23 +544,22 @@ def validate_and_adapt(self, value : typing.Any) -> bool: -class Iterable(RemoteParameter): - """A tuple or list Parameter (e.g. ('a',7.6,[3,5])) with a fixed tuple length.""" +class Iterable(Property): + """A tuple or list Property (e.g. ('a',7.6,[3,5])) with a fixed tuple length.""" __slots__ = ['bounds', 'length', 'item_type', 'dtype'] def __init__(self, default : typing.Any, *, bounds : typing.Optional[typing.Tuple[int, int]] = None, length : typing.Optional[int] = None, item_type : typing.Optional[typing.Tuple] = None, deepcopy_default : bool = False, allow_None : bool = False, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: """ - Initialize a tuple parameter with a fixed length (number of + Initialize a tuple property with a fixed length (number of elements). The length is determined by the initial default value, if any, and must be supplied explicitly otherwise. The length is not allowed to change after instantiation. 
@@ -575,7 +568,7 @@ def __init__(self, default : typing.Any, *, bounds : typing.Optional[typing.Tupl allow_None=allow_None, per_instance_descriptor=per_instance_descriptor, deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.bounds = bounds self.length = length self.item_type = item_type @@ -626,10 +619,9 @@ def __init__(self, default : typing.Any, *, bounds : typing.Optional[typing.Tupl length: typing.Optional[int] = None, item_type : typing.Optional[typing.Tuple] = None, accept_list : bool = False, deepcopy_default : bool = False, allow_None : bool = False, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -638,7 +630,7 @@ def __init__(self, default : typing.Any, *, bounds : typing.Optional[typing.Tupl deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.accept_list = accept_list self.dtype = (tuple,) # re-assigned @@ -664,7 +656,7 
@@ def deserialize(cls, value): class List(Iterable): """ - Parameter whose value is a list of objects, usually of a specified type. + Property whose value is a list of objects, usually of a specified type. The bounds allow a minimum and/or maximum length of list to be enforced. If the item_type is non-None, all @@ -681,10 +673,9 @@ def __init__(self, default: typing.Any, *, bounds : typing.Optional[typing.Tuple length : typing.Optional[int] = None, item_type : typing.Optional[typing.Tuple] = None, accept_tuple : bool = False, deepcopy_default : bool = False, allow_None : bool = False, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, @@ -694,7 +685,7 @@ def __init__(self, default: typing.Any, *, bounds : typing.Optional[typing.Tuple per_instance_descriptor=per_instance_descriptor, deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.accept_tuple = accept_tuple self.dtype = list @@ -706,9 +697,9 @@ def validate_and_adapt(self, value: typing.Any) -> typing.Tuple: -class Callable(RemoteParameter): +class Callable(Property): """ - Parameter holding a value that is a callable object, such as a function. 
+ Property holding a value that is a callable object, such as a function. A keyword argument instantiate=True should be provided when a function object is used that might have state. On the other hand, @@ -723,29 +714,28 @@ def validate_and_adapt(self, value : typing.Any) -> typing.Callable: -class Composite(RemoteParameter): +class Composite(Property): """ - A Parameter that is a composite of a set of other attributes of the class. + A Property that is a composite of a set of other attributes of the class. The constructor argument 'attribs' takes a list of attribute - names, which may or may not be Parameters. Getting the parameter + names, which may or may not be Properties. Getting the property returns a list of the values of the constituents of the composite, - in the order specified. Likewise, setting the parameter takes a + in the order specified. Likewise, setting the property takes a sequence of values and sets the value of the constituent attributes. - This Parameter type has not been tested with watchers and + This Property type has not been tested with watchers and dependencies, and may not support them properly. 
""" __slots__ = ['attribs'] - def __init__(self, attribs : typing.List[typing.Union[str, Parameter]], *, + def __init__(self, attribs : typing.List[typing.Union[str, Property]], *, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -753,7 +743,7 @@ def __init__(self, attribs : typing.List[typing.Union[str, Parameter]], *, per_instance_descriptor=per_instance_descriptor, deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.attribs = [] if attribs is not None: for attrib in attribs: @@ -770,7 +760,7 @@ def __get__(self, obj, objtype) -> typing.List[typing.Any]: def validate_and_adapt(self, value): if not len(value) == len(self.attribs): - raise_ValueError("Compound parameter got the wrong number of values (needed {}, but got {}).".format( + raise_ValueError("Compound property got the wrong number of values (needed {}, but got {}).".format( len(self.attribs), len(value)), self) return value @@ -780,9 +770,9 @@ def _post_setter(self, obj, val): -class SelectorBase(RemoteParameter): +class SelectorBase(Property): """ - 
Parameter whose value must be chosen from a list of possibilities. + Property whose value must be chosen from a list of possibilities. Subclasses must implement get_range(). """ @@ -797,7 +787,7 @@ def range(self): class Selector(SelectorBase): """ - Parameter whose value must be one object from a list of possible objects. + Property whose value must be one object from a list of possible objects. By default, if no default is specified, picks the first object from the provided set of objects, as long as the objects are in an @@ -816,7 +806,7 @@ class Selector(SelectorBase): The list of objects can be supplied as a list (appropriate for selecting among a set of strings, or among a set of objects with a - "name" parameter), or as a (preferably ordered) dictionary from + "name" property), or as a (preferably ordered) dictionary from names to objects. If a dictionary is supplied, the objects will need to be hashable so that their names can be looked up from the object value. @@ -828,10 +818,9 @@ class Selector(SelectorBase): # existing objects, therefore instantiate is False by default. 
def __init__(self, *, objects : typing.List[typing.Any], default : typing.Any, empty_default : bool = False, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, - db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - + db_persist : bool = False, db_init : bool = False, db_commit : bool = False, per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, @@ -840,7 +829,7 @@ def __init__(self, *, objects : typing.List[typing.Any], default : typing.Any, e allow_None=allow_None, per_instance_descriptor=per_instance_descriptor, deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) if objects is None: objects = [] autodefault = None @@ -868,7 +857,7 @@ def validate_and_adapt(self, value: typing.Any) -> typing.Any: @property def range(self): """ - Return the possible objects to which this parameter could be set. + Return the possible objects to which this property could be set. (Returns the dictionary {object.name:object}.) """ @@ -881,7 +870,7 @@ def range(self): class ClassSelector(SelectorBase): """ - Parameter allowing selection of either a subclass or an instance of a given set of classes. + Property allowing selection of either a subclass or an instance of a given set of classes. 
By default, requires an instance, but if isinstance=False, accepts a class instead. Both class and instance values respect the instantiate slot, though it matters only for isinstance=True. @@ -891,10 +880,9 @@ class ClassSelector(SelectorBase): def __init__(self, *, class_ , default : typing.Any, isinstance : bool = True, deepcopy_default : bool = False, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -902,7 +890,7 @@ def __init__(self, *, class_ , default : typing.Any, isinstance : bool = True, d allow_None=allow_None, per_instance_descriptor=per_instance_descriptor, deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.class_ = class_ self.isinstance = isinstance @@ -917,18 +905,18 @@ def validate_and_adapt(self, value): return if self.isinstance: if not isinstance(value, self.class_): - raise_ValueError("{} parameter {} value must be an instance of {}, not {}.".format( + raise_ValueError("{} property {} value must be an instance of {}, not {}.".format( self.__class__.__name__, self.name, self._get_class_name(), value), self) else: if not 
issubclass(value, self.class_): - raise_ValueError("{} parameter {} must be a subclass of {}, not {}.".format( + raise_ValueError("{} property {} must be a subclass of {}, not {}.".format( self.__class__.__name__, self.name, self._get_class_name(), value.__name__), self) return value @property def range(self): """ - Return the possible types for this parameter's value. + Return the possible types for this property's value. (I.e. return `{name: }` for all classes that are concrete_descendents() of `self.class_`.) @@ -960,10 +948,9 @@ class TupleSelector(Selector): def __init__(self, *, objects : typing.List, default : typing.Any, accept_list : bool = True, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -972,7 +959,7 @@ def __init__(self, *, objects : typing.List, default : typing.Any, accept_list : deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.accept_list = accept_list def validate_and_adapt(self, value : typing.Any): @@ -1000,9 +987,9 @@ def validate_and_adapt(self, value : typing.Any): # - use 
resolve_path(path_to_file=False) for paths to existing folders to be read, # and normalize_path() for paths to new files to be written. -class Path(RemoteParameter): +class Path(Property): """ - Parameter that can be set to a string specifying the path of a file or folder. + Property that can be set to a string specifying the path of a file or folder. The string should be specified in UNIX style, but it will be returned in the format of the user's operating system. Please use @@ -1024,10 +1011,9 @@ class Path(RemoteParameter): def __init__(self, default : typing.Any = '', *, search_paths : typing.Optional[str] = None, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -1036,7 +1022,7 @@ def __init__(self, default : typing.Any = '', *, search_paths : typing.Optional[ deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) if isinstance(search_paths, str): self.search_paths = [search_paths] elif isinstance(search_paths, list): @@ -1071,7 +1057,7 @@ def __getstate__(self): class Filename(Path): """ - Parameter that can be set 
to a string specifying the path of a file. + Property that can be set to a string specifying the path of a file. The string should be specified in UNIX style, but it will be returned in the format of the user's operating system. @@ -1093,7 +1079,7 @@ def _resolve(self, path): class Foldername(Path): """ - Parameter that can be set to a string specifying the path of a folder. + Property that can be set to a string specifying the path of a folder. The string should be specified in UNIX style, but it will be returned in the format of the user's operating system. @@ -1133,10 +1119,9 @@ class FileSelector(Selector): def __init__(self, default : typing.Any, *, objects : typing.List, path : str = "", doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -1145,7 +1130,7 @@ def __init__(self, default : typing.Any, *, objects : typing.List, path : str = deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.path = path # update is automatically called def _post_slot_set(self, slot: str, old : typing.Any, value : typing.Any) -> 
None: @@ -1173,10 +1158,9 @@ class MultiFileSelector(FileSelector): def __init__(self, default : typing.Any, *, path : str = "", doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - label : typing.Optional[str] = None, per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -1185,7 +1169,7 @@ def __init__(self, default : typing.Any, *, path : str = "", deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) def update(self): self.objects = sorted(glob.glob(self.path)) @@ -1197,16 +1181,15 @@ def update(self): class Date(Number): """ - Date parameter of datetime or date type. + Date property of datetime or date type. 
""" def __init__(self, default, *, bounds : typing.Union[typing.Tuple, None] = None, crop_to_bounds : bool = False, inclusive_bounds : typing.Tuple = (True,True), step : typing.Any = None, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, - db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - + db_persist : bool = False, db_init : bool = False, db_commit : bool = False, per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -1216,7 +1199,7 @@ def __init__(self, default, *, bounds : typing.Union[typing.Tuple, None] = None, deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.dtype = dt_types def _validate_step(self, val): @@ -1241,16 +1224,15 @@ def deserialize(cls, value): class CalendarDate(Number): """ - Parameter specifically allowing dates (not datetimes). + Property specifically allowing dates (not datetimes). 
""" def __init__(self, default, *, bounds : typing.Union[typing.Tuple, None] = None, crop_to_bounds : bool = False, inclusive_bounds : typing.Tuple = (True,True), step : typing.Any = None, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, - db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - + db_persist : bool = False, db_init : bool = False, db_commit : bool = False, per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -1260,7 +1242,7 @@ def __init__(self, default, *, bounds : typing.Union[typing.Tuple, None] = None, deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.dtype = dt.date def _validate_step(self, step): @@ -1281,9 +1263,9 @@ def deserialize(cls, value): -class CSS3Color(RemoteParameter): +class CSS3Color(Property): """ - Color parameter defined as a hex RGB string with an optional # + Color property defined as a hex RGB string with an optional # prefix or (optionally) as a CSS3 color name. 
""" @@ -1327,10 +1309,9 @@ class CSS3Color(RemoteParameter): __slots__ = ['allow_named'] def __init__(self, default, *, allow_named : bool = True, doc : typing.Optional[str] = None, constant : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - readonly : bool = False, allow_None : bool = False, per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, @@ -1340,14 +1321,14 @@ def __init__(self, default, *, allow_named : bool = True, doc : typing.Optional deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.allow_named = allow_named def validate_and_adapt(self, value : typing.Any): if (self.allow_None and value is None): return if not isinstance(value, str): - raise ValueError("Color parameter %r expects a string value, " + raise ValueError("Color property %r expects a string value, " "not an object of type %s." 
% (self.name, type(value))) if self.allow_named and value in self._named_colors: return @@ -1367,12 +1348,11 @@ class Range(Tuple): def __init__(self, default : typing.Optional[typing.Tuple] = None, *, bounds: typing.Optional[typing.Tuple[int, int]] = None, length : typing.Optional[int] = None, item_type : typing.Optional[typing.Tuple] = None, - softbounds=None, inclusive_bounds=(True,True), step=None, + softbounds=None, inclusive_bounds=(True,True), step=None, doc : typing.Optional[str] = None, constant : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - readonly : bool = False, allow_None : bool = False, label : typing.Optional[str] = None, per_instance_descriptor : bool = False, deepcopy_default : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, @@ -1385,7 +1365,7 @@ def __init__(self, default : typing.Optional[typing.Tuple] = None, *, bounds: ty deepcopy_default=deepcopy_default, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) def validate_and_adapt(self, value : typing.Any) -> typing.Tuple: raise NotImplementedError("Range validation not implemented") @@ -1401,7 +1381,7 @@ def _validate_bounds(self, val, bounds, inclusive_bounds): too_low = (vmin is not None) and (v < vmin if incmin else v <= vmin) too_high = (vmax is not None) and (v > vmax if incmax else v >= vmax) if too_low or too_high: - raise ValueError("Range parameter %r's %s bound must be 
in range %s." + raise ValueError("Range property %r's %s bound must be in range %s." % (self.name, bound, self.rangestr)) @property @@ -1429,17 +1409,17 @@ def _validate_value(self, val, allow_None): return if not isinstance(val, tuple): - raise ValueError("DateRange parameter %r only takes a tuple value, " + raise ValueError("DateRange property %r only takes a tuple value, " "not %s." % (self.name, type(val).__name__)) for n in val: if isinstance(n, dt_types): continue - raise ValueError("DateRange parameter %r only takes date/datetime " + raise ValueError("DateRange property %r only takes date/datetime " "values, not type %s." % (self.name, type(n).__name__)) start, end = val if not end >= start: - raise ValueError("DateRange parameter %r's end datetime %s " + raise ValueError("DateRange property %r's end datetime %s " "is before start datetime %s." % (self.name, val[1], val[0])) @@ -1487,12 +1467,12 @@ def _validate_value(self, val, allow_None): for n in val: if not isinstance(n, dt.date): - raise ValueError("CalendarDateRange parameter %r only " + raise ValueError("CalendarDateRange property %r only " "takes date types, not %s." % (self.name, val)) start, end = val if not end >= start: - raise ValueError("CalendarDateRange parameter %r's end date " + raise ValueError("CalendarDateRange property %r's end date " "%s is before start date %s." 
% (self.name, val[1], val[0])) @@ -1519,10 +1499,9 @@ class TypedList(ClassSelector): def __init__(self, default : typing.Optional[typing.List[typing.Any]] = None, *, item_type : typing.Any = None, deepcopy_default : bool = True, allow_None : bool = True, bounds : tuple = (0,None), doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, - db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - + db_persist : bool = False, db_init : bool = False, db_commit : bool = False, per_instance_descriptor : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] = None) -> None: @@ -1534,7 +1513,7 @@ def __init__(self, default : typing.Optional[typing.List[typing.Any]] = None, *, per_instance_descriptor=per_instance_descriptor, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) self.item_type = item_type self.bounds = bounds @@ -1563,10 +1542,9 @@ class TypedDict(ClassSelector): def __init__(self, default : typing.Optional[typing.Dict] = None, *, key_type : typing.Any = None, item_type : typing.Any = None, deepcopy_default : bool = True, allow_None : bool = True, bounds : tuple = (0, None), doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = 
USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - per_instance_descriptor : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, @@ -1581,7 +1559,7 @@ def __init__(self, default : typing.Optional[typing.Dict] = None, *, key_type : doc=doc, constant=constant, readonly=readonly, allow_None=allow_None, fget=fget, fset=fset, fdel=fdel, per_instance_descriptor=per_instance_descriptor, class_member=class_member, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) def __set__(self, obj, value): if value is not None: @@ -1607,10 +1585,9 @@ def __init__(self, default : typing.Optional[typing.Dict[typing.Any, typing.Any] allow_unspecified_keys : bool = True, bounds : tuple = (0, None), deepcopy_default : bool = True, allow_None : bool = True, doc : typing.Optional[str] = None, constant : bool = False, readonly : bool = False, - URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), + URL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT), remote : bool = True, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, - db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - + db_persist : bool = False, db_init : bool = False, db_commit : bool = False, per_instance_descriptor : bool = False, class_member : bool = False, fget : typing.Optional[typing.Callable] = None, fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, precedence : typing.Optional[float] 
= None) -> None: @@ -1626,7 +1603,7 @@ def __init__(self, default : typing.Optional[typing.Dict[typing.Any, typing.Any] allow_None=allow_None, per_instance_descriptor=per_instance_descriptor, class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence, URL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist, - db_init=db_init, db_commit=db_commit) + db_init=db_init, db_commit=db_commit, remote=remote) def __set__(self, obj, value): if value is not None: diff --git a/hololinked/server/property.py b/hololinked/server/property.py new file mode 100644 index 0000000..6f6a6e6 --- /dev/null +++ b/hololinked/server/property.py @@ -0,0 +1,319 @@ +import typing +from types import FunctionType, MethodType +from enum import Enum + +from ..param.parameterized import Parameter, ClassParameters +from .data_classes import RemoteResourceInfoValidator +from .constants import USE_OBJECT_NAME, HTTP_METHODS +from .events import Event + + + +class Property(Parameter): + """ + Initialize a new Property object to get/set/delete an object/instance attribute. Note the capital 'P' in + Property, which differentiates it from python's own ``property``. ``Property`` objects are similar to python ``property`` + but not a subclass of it due to limitations and redundancy. + + Parameters + ---------- + + default: None or corresponding to property type + The default value of the property. + + doc: str, default empty + docstring explaining what this property represents. + + constant: bool, default False + if True, the Property value can be changed only once, and only when ``allow_None`` is also True. The + value is otherwise constant on the ``Thing`` instance. + + readonly: bool, default False + if True, the Property value cannot be changed by setting the attribute at the class or instance + levels at all. The value is either simply fetched or a getter method is executed, which may still generate dynamic + values at each get/read operation.
+ + allow_None: bool, default False + if True, None is accepted as a valid value for this Property, in addition to any other values that are + allowed. + + URL_path: str, uses object name by default + resource locator under which the attribute is accessible through HTTP. When no value is supplied, the attribute name + is used and underscores are replaced with dashes. + + http_method: tuple, default ("GET", "PUT", "DELETE") + http methods for read, write and delete respectively + + observable: bool, default False + set to True to receive change events. Supply a comparator function to evaluate under what conditions the change + event must be emitted. The default condition is a plain not-equal-to operator. + + state: str | Enum, default None + state of the state machine in which a property write can be executed + + db_persist: bool, default False + if True, every write is stored in the database and the property value persists across ``Thing`` instance destruction and creation. + The value loaded from the database is written into the property at ``Thing.__post_init__``. + Set ``Thing.use_default_db`` to True to avoid setting up a database, or supply a ``db_config_file``. + + db_init: bool, default False + if True, the property's first value is loaded from the database and written using the setter. + Further writes are not written to the database. If ``db_persist`` is True, this value is ignored. + + db_commit: bool, default False + if True, all written values are stored to the database. The database value is not loaded at ``Thing.__post_init__()``. + If ``db_persist`` is True, this value is ignored. + + fget: Callable, default None + custom getter method, mandatory when a custom setter method is also supplied. + + fset: Callable, default None + custom setter method + + fdel: Callable, default None + custom deleter method + + remote: bool, default True + set to False to make the property local/not remotely accessible + + label: str, default extracted from object name + optional text label to be used when this Property is shown in a listing.
If no label is supplied, + the attribute name for this property in the owning ``Thing`` object is used. + + metadata: dict, default None + store your own JSON-compatible metadata for the property which gives useful (and modifiable) information + about the property. Properties operate using slots, which means you cannot normally set foreign attributes on this + object. This metadata dictionary overcomes that limitation. + + per_instance_descriptor: bool, default False + whether a separate Property instance will be created for every ``Thing`` instance. + If False, all instances of a ``Thing`` class will share the same Property object, including all validation + attributes (bounds, allow_None etc.). + + deepcopy_default: bool, default False + controls whether the default value of this Property will be deepcopied when a ``Thing`` object is instantiated (if + True), or if the single default value will be shared by all ``Thing`` instances (if False). For an immutable + Property value, it is best to leave ``deepcopy_default`` at the default of False. For a mutable Property value, + for example - lists and dictionaries, the default of False is also appropriate if you want all instances to share + the same value state, e.g. if they are each simply referring to a single global object like a singleton. + If instead each ``Thing`` should have its own independently mutable value, ``deepcopy_default`` should be set to + True. This setting is similar to using ``field``'s ``default_factory`` in python dataclasses. + + class_member : bool, default False + when True, the property is set on the ``Thing`` class instead of the ``Thing`` instance. + + precedence: float, default None + a numeric value, usually in the range 0.0 to 1.0, which allows the order of Properties in a class to be defined in + a listing or e.g. in GUI menus. A negative precedence indicates a property that should be hidden in such listings.
+ + """ + + __slots__ = ['db_persist', 'db_init', 'db_commit', 'metadata', '_remote_info', 'observable', + '_observable_event', 'fcomparator'] + + # RPC-only init - no HTTP methods for those who don't want them + @typing.overload + def __init__(self, default: typing.Any = None, *, doc : typing.Optional[str] = None, constant : bool = False, + readonly : bool = False, allow_None : bool = False, observable : bool = False, + state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, + db_persist : bool = False, db_init : bool = False, db_commit : bool = False, remote : bool = True, + class_member : bool = False, fget : typing.Optional[typing.Callable] = None, + fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, + ) -> None: + ... + + @typing.overload + def __init__(self, default: typing.Any = None, *, doc : typing.Optional[str] = None, constant : bool = False, + readonly : bool = False, allow_None : bool = False, URL_path : str = USE_OBJECT_NAME, + http_method : typing.Tuple[typing.Optional[str], typing.Optional[str]] = (HTTP_METHODS.GET, HTTP_METHODS.PUT), + observable : bool = False, state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, + db_persist : bool = False, db_init : bool = False, db_commit : bool = False, remote : bool = True, + class_member : bool = False, fget : typing.Optional[typing.Callable] = None, + fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, + metadata : typing.Optional[typing.Dict] = None + ) -> None: + ...
+ + @typing.overload + def __init__(self, default: typing.Any = None, *, doc : typing.Optional[str] = None, constant : bool = False, + readonly : bool = False, allow_None : bool = False, + URL_path : str = USE_OBJECT_NAME, + http_method : typing.Tuple[typing.Optional[str], typing.Optional[str]] = (HTTP_METHODS.GET, HTTP_METHODS.PUT), + observable : bool = False, change_comparator : typing.Optional[typing.Union[FunctionType, MethodType]] = None, + state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, + db_persist : bool = False, db_init : bool = False, db_commit : bool = False, remote : bool = True, + class_member : bool = False, fget : typing.Optional[typing.Callable] = None, + fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, + fcomparator : typing.Optional[typing.Callable] = None, + deepcopy_default : bool = False, per_instance_descriptor : bool = False, + precedence : typing.Optional[float] = None, metadata : typing.Optional[typing.Dict] = None + ) -> None: + ... 
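The overloads above expose ``observable`` together with a custom change comparator (``fcomparator``, registered through the ``comparator`` decorator). As a minimal standalone sketch of that mechanism — plain Python that does not import hololinked, where ``ObservableProperty``, ``Sensor`` and the list-based observer are simplified stand-ins for ``Property``, ``Thing`` and ``Event.push`` — a write triggers a change notification only when the comparator reports a change:

```python
import typing

class ObservableProperty:
    """Simplified stand-in for a Property descriptor: after every write it
    decides, via an optional custom comparator (default: not-equal-to),
    whether to notify observers of a change."""

    def __init__(self, default=None):
        self.default = default
        self.fcomparator = None     # optional custom change comparator
        self.observers: typing.List[typing.Callable] = []   # stand-in for Event subscribers

    def __set_name__(self, owner, name):
        self._internal = f'_{name}_value'

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self             # class access returns the descriptor itself
        return getattr(obj, self._internal, self.default)

    def __set__(self, obj, value):
        old = getattr(obj, self._internal, self.default)
        setattr(obj, self._internal, value)
        changed = self.fcomparator(old, value) if self.fcomparator else old != value
        if changed:
            for push in self.observers:     # stand-in for Event.push(value)
                push(value)

    def comparator(self, func):
        """Register a custom change comparator, decorator-style."""
        self.fcomparator = func
        return func


class Sensor:
    temperature = ObservableProperty(default=0.0)

@Sensor.temperature.comparator
def changed_significantly(old, new):
    # only emit a change event for differences above 0.5 units
    return abs(new - old) > 0.5

events = []
Sensor.temperature.observers.append(events.append)
s = Sensor()
s.temperature = 0.2     # |0.2 - 0.0| <= 0.5, no event
s.temperature = 1.0     # |1.0 - 0.2| > 0.5, event emitted
print(events)           # → [1.0]
```

In the real class, ``_post_value_set`` performs this comparison and pushes the new value over the registered change event.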
+ + def __init__(self, default: typing.Any = None, *, doc : typing.Optional[str] = None, constant : bool = False, + readonly : bool = False, allow_None : bool = False, + URL_path : str = USE_OBJECT_NAME, + http_method : typing.Tuple[typing.Optional[str], typing.Optional[str]] = (HTTP_METHODS.GET, HTTP_METHODS.PUT), + state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, + db_persist : bool = False, db_init : bool = False, db_commit : bool = False, + observable : bool = False, change_comparator : typing.Optional[typing.Union[FunctionType, MethodType]] = None, + class_member : bool = False, fget : typing.Optional[typing.Callable] = None, + fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, + fcomparator : typing.Optional[typing.Callable] = None, + deepcopy_default : bool = False, per_instance_descriptor : bool = False, remote : bool = True, + precedence : typing.Optional[float] = None, metadata : typing.Optional[typing.Dict] = None + ) -> None: + super().__init__(default=default, doc=doc, constant=constant, readonly=readonly, allow_None=allow_None, + per_instance_descriptor=per_instance_descriptor, deepcopy_default=deepcopy_default, + class_member=class_member, fget=fget, fset=fset, fdel=fdel, precedence=precedence) + self.db_persist = db_persist + self.db_init = db_init + self.db_commit = db_commit + self.metadata = metadata + self.observable = observable + self._observable_event = None + if remote: + self._remote_info = RemoteResourceInfoValidator( + http_method=http_method, + URL_path=URL_path, + state=state, + isproperty=True + ) + else: + self._remote_info = None + self.fcomparator = fcomparator + + def _post_slot_set(self, slot : str, old : typing.Any, value : typing.Any) -> None: + if slot == 'owner' and self.owner is not None: + if self._remote_info is not None: + if self._remote_info.URL_path == USE_OBJECT_NAME: + self._remote_info.URL_path = '/' + self.name + elif not 
self._remote_info.URL_path.startswith('/'): + raise ValueError(f"URL_path should start with '/', please add '/' before '{self._remote_info.URL_path}'") + self._remote_info.obj_name = self.name + if self.observable: + self._observable_event = Event(name=f'observable-{self.name}', + URL_path=f'{self._remote_info.URL_path}/change-event') + # In principle the above could be done when setting the name itself; however, to simplify, + # we do it when the owner is set. So always remember the order of __set_name__ -> 1) attrib_name, + # 2) name and then 3) owner + super()._post_slot_set(slot, old, value) + + def _post_value_set(self, obj, value : typing.Any) -> None: + if (self.db_persist or self.db_commit) and hasattr(obj, 'db_engine'): + # from .thing import Thing + # assert isinstance(obj, Thing), f"database property {self.name} bound to a non Thing, currently not supported" + # uncomment for type definitions + obj.db_engine.set_property(self, value) + if self.observable and self._observable_event is not None: + old_value = obj.__dict__.get(self._internal_name, NotImplemented) + obj.__dict__[f'{self._internal_name}_old_value'] = value + if self.fcomparator: + if self.fcomparator(old_value, value): + self._observable_event.push(value) + elif old_value != value: + self._observable_event.push(value) + return super()._post_value_set(obj, value) + + def comparator(self, func : typing.Callable) -> typing.Callable: + """ + Register a comparator method, used to decide whether the change event must be emitted, by using this as a decorator.
+ """ + self.fcomparator = func + return func + + + +__property_info__ = [ + 'allow_None' , 'class_member', 'db_init', 'db_persist', + 'db_commit', 'deepcopy_default', 'per_instance_descriptor', + 'default', 'doc', 'constant', + 'metadata', 'name', 'readonly' + # 'scada_info', 'property_type' # descriptor related info is also necessary + ] + + +class ClassProperties(ClassParameters): + """ + Object that holds the namespace and implementation of Parameterized methods as well as any state that is not + in __slots__ or the Properties themselves. + Exists at metaclass level (instantiated by the metaclass). Contains state specific to the class. + """ + + @property + def db_persisting_objects(self): + try: + return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_persisting_remote_params') + except AttributeError: + paramdict = self.remote_objects + db_persisting_remote_params = {} + for name, desc in paramdict.items(): + if desc.db_persist: + db_persisting_remote_params[name] = desc + setattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_persisting_remote_params', db_persisting_remote_params) + return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_persisting_remote_params') + + @property + def db_init_objects(self) -> typing.Dict[str, Property]: + try: + return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_init_remote_params') + except AttributeError: + paramdict = self.remote_objects + init_load_params = {} + for name, desc in paramdict.items(): + if desc.db_init or desc.db_persist: + init_load_params[name] = desc + setattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_init_remote_params', init_load_params) + return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_init_remote_params') + + @property + def remote_objects(self) -> typing.Dict[str, Property]: + try: + return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_remote_params') + except AttributeError: + paramdict = super().descriptors + remote_params = {} + for name, 
desc in paramdict.items(): + if isinstance(desc, Property): + remote_params[name] = desc + setattr(self.owner_cls, f'_{self.owner_cls.__name__}_remote_params', remote_params) + return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_remote_params') + + def webgui_info(self, for_remote_params : typing.Union[Property, typing.Dict[str, Property], None] = None): + info = {} + if isinstance(for_remote_params, dict): + objects = for_remote_params + elif isinstance(for_remote_params, Property): + objects = { for_remote_params.name : for_remote_params } + else: + objects = self.remote_objects + for param in objects.values(): + state = param.__getstate__() + remote_info = state.get("_remote_info", None) + info[param.name] = dict( + # _remote_info is None for non-remote properties, guard before converting + remote_info = remote_info.to_dataclass() if remote_info is not None else None, + type = param.__class__.__name__, + owner = param.owner.__name__ + ) + for field in __property_info__: + info[param.name][field] = state.get(field, None) + return info + + @property + def visualization_parameters(self): + from ..webdashboard.visualization_parameters import VisualizationParameter + try: + return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_visualization_params') + except AttributeError: + paramdict = super().descriptors + visual_params = {} + for name, desc in paramdict.items(): + if isinstance(desc, VisualizationParameter): + visual_params[name] = desc + setattr(self.owner_cls, f'_{self.owner_cls.__name__}_visualization_params', visual_params) + return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_visualization_params') + + + +__all__ = [ + Property.__name__ ] \ No newline at end of file diff --git a/hololinked/server/proxy_client.py b/hololinked/server/proxy_client.py deleted file mode 100644 index d18560c..0000000 --- a/hololinked/server/proxy_client.py +++ /dev/null @@ -1,579 +0,0 @@ -from typing import Any, Callable, Any, Tuple, Type, Callable -from threading import get_ident -import asyncio -import typing - -from .zmq_message_brokers import SyncZMQClient -from .utils import
current_datetime_ms_str -from .constants import SERIALIZABLE_WRAPPER_ASSIGNMENTS, FUNC - - - -def addMethod(cls : object, method : Callable, func_info : Tuple[Any] ): - for index, dunder in enumerate(SERIALIZABLE_WRAPPER_ASSIGNMENTS, 2): - if dunder == '__qualname__': - func_infor = '{}.{}'.format(cls.__class__.__name__, func_info[index].split('.')[1]) - else: - func_infor = func_info[index] - setattr(method, dunder, func_infor) - cls.__setattr__(method.__name__, method) - - - -class ProxyClient: - - __own_attrs__ = frozenset([ - '_client', '_server_methods', '_server_attrs', '_server_oneway', '_max_retries', - '_timeout', '_owner_thread', '_get_meta_data', 'instance_name', '__annotations__' - ]) - - def __init__(self, instance_name : str, **kwargs) -> None: - self.instance_name = instance_name - # compose ZMQ client in Proxy client to give the idea that all sending and receiving is actually - # done by the ZMQ client and not by the Proxy client directly - self._client = SyncZMQClient(instance_name, instance_name+current_datetime_ms_str(), **kwargs) - - self._server_methods = set() # all methods of the remote object, gotten from meta-data - self._server_attrs = set() # attributes of the remote object, gotten from meta-data - self._server_oneway = set() # oneway-methods of the remote object, gotten from meta-data - # self._pyroSeq = 0 # message sequence number - # self._pyroRawWireResponse = False # internal switch to enable wire level responses - # self._pyroHandshake = "hello" # the data object that should be sent in the initial connection handshake message - self._max_retries = kwargs.get("max_retires", 3) - self._timeout = kwargs.get("timeout", 3) - self._owner_thread = get_ident() # the thread that owns this proxy - self._get_meta_data() - # if config.SERIALIZER not in serializers.serializers: - # raise ValueError("unknown serializer configured") - # # note: we're not clearing the client annotations dict here. 
- # that is because otherwise it will be wiped if a new proxy is needed to connect PYRONAME uris. - # clearing the response annotations is okay. - # current_context.response_annotations = {} - # if connected_socket: - # self.__pyroCreateConnection(False, connected_socket) - - def __del__(self): - self.exit() - - def __getattribute__(self, __name: str) -> Any: - if __name in ProxyClient.__own_attrs__: # self.__own_attrs__ will likely cause an infinite recursion - return super(ProxyClient, self).__getattribute__(__name) - if __name in self._server_attrs: - return super(ProxyClient, self).__getattribute__(__name)() - # elif __name in self._server_methods: - # # client side check if the requested attr actually exists - # return - return super().__getattribute__(__name) - - def __setattr__(self, __name : str, __value : Any): - if __name in ProxyClient.__own_attrs__ or __name in self._server_methods: # self.__own_attrs__ will likely cause an infinite recursion - return super(ProxyClient, self).__setattr__(__name, __value) - elif __name in self._server_attrs: - return _RemoteAttribute(self._client, __name, self._max_retries) - raise AttributeError("Cannot set foreign attribute to ProxyClient class for {}".format(self.instance_name)) - # remote attribute - # client side validation if the requested attr actually exists - # if __name in ProxyClient.__own_attrs__: - # return super().__setattr__(__name, __value) # one of the special pyro attributes - # # get metadata if it's not there yet - # if not self._pyroMethods and not self._pyroAttrs: - # self._pyroGetMetadata() - # raise AttributeError("remote object '%s' has no exposed attribute '%s'" % (self._pyroUri, name)) - - def __repr__(self): - if self._pyroConnection: - connected = "connected " + self._pyroConnection.family() - else: - connected = "not connected" - return "<%s.%s at 0x%x; %s; for %s; owner %s>" % (self.__class__.__module__, self.__class__.__name__, - id(self), connected, self._pyroUri, self.__pyroOwnerThread) 
- - def invoke(self, method : str, oneway : bool = False, noblock : bool = False, **kwargs): - pass - - async def async_invoke(self, method : str, oneway : bool = False, noblock : bool = False, **kwargs): - pass - - def set_parameter(self, parameter : str, value : typing.Any, oneway : bool, noblock : bool = False): - pass - - async def async_set_parameters(self, oneway : bool = False, noblock : bool = False, **parameters): - pass - - def subscribe_event(self, event_name : str): - pass - - def unsubscribe_event(self, event_name : str): - pass - - # def __getstate__(self): - # # make sure a tuple of just primitive types are used to allow for proper serialization - # return str(self._pyroUri), tuple(self._pyroOneway), tuple(self._pyroMethods), \ - # tuple(self._pyroAttrs), self._pyroHandshake, self._pyroSerializer - - # def __setstate__(self, state): - # self._pyroUri = core.URI(state[0]) - # self._pyroOneway = set(state[1]) - # self._pyroMethods = set(state[2]) - # self._pyroAttrs = set(state[3]) - # self._pyroHandshake = state[4] - # self._pyroSerializer = state[5] - # self.__pyroTimeout = config.COMMTIMEOUT - # self._pyroMaxRetries = config.MAX_RETRIES - # self._pyroConnection = None - # self._pyroLocalSocket = None - # self._pyroSeq = 0 - # self._pyroRawWireResponse = False - # self.__pyroOwnerThread = get_ident() - - # def __copy__(self): - # p = object.__new__(type(self)) - # p.__setstate__(self.__getstate__()) - # p._pyroTimeout = self._pyroTimeout - # p._pyroRawWireResponse = self._pyroRawWireResponse - # p._pyroMaxRetries = self._pyroMaxRetries - # return p - - def __enter__(self): - return self - - def __exit__(self, exc_type, exc_value, traceback): - self.exit() - - # def __eq__(self, other): - # if other is self: - # return True - # return isinstance(other, Proxy) and other._pyroUri == self._pyroUri - - # def __ne__(self, other): - # if other and isinstance(other, Proxy): - # return other._pyroUri != self._pyroUri - # return True - - # def __hash__(self): 
- # return hash(self._pyroUri) - - # def __dir__(self): - # result = dir(self.__class__) + list(self.__dict__.keys()) - # return sorted(set(result) | self._pyroMethods | self._pyroAttrs) - - # # When special methods are invoked via special syntax (e.g. obj[index] calls - # # obj.__getitem__(index)), the special methods are not looked up via __getattr__ - # # for efficiency reasons; instead, their presence is checked directly. - # # Thus we need to define them here to force (remote) lookup through __getitem__. - # def __bool__(self): return True - # def __len__(self): return self.__getattr__('__len__')() - # def __getitem__(self, index): return self.__getattr__('__getitem__')(index) - # def __setitem__(self, index, val): return self.__getattr__('__setitem__')(index, val) - # def __delitem__(self, index): return self.__getattr__('__delitem__')(index) - - # def __iter__(self): - # try: - # # use remote iterator if it exists - # yield from self.__getattr__('__iter__')() - # except AttributeError: - # # fallback to indexed based iteration - # try: - # yield from (self[index] for index in range(sys.maxsize)) - # except (StopIteration, IndexError): - # return - - def exit(self): - """release the connection to the pyro daemon""" - self.__check_owner_thread() - self._client.exit() - - # def _pyroBind(self): - # """ - # Bind this proxy to the exact object from the uri. That means that the proxy's uri - # will be updated with a direct PYRO uri, if it isn't one yet. - # If the proxy is already bound, it will not bind again. - # """ - # return self.__pyroCreateConnection(True) - - # def __pyroGetTimeout(self): - # return self.__pyroTimeout - - # def __pyroSetTimeout(self, timeout): - # self.__pyroTimeout = timeout - # if self._pyroConnection is not None: - # self._pyroConnection.timeout = timeout - - # _pyroTimeout = property(__pyroGetTimeout, __pyroSetTimeout, doc=""" - # The timeout in seconds for calls on this proxy. Defaults to ``None``. 
- # If the timeout expires before the remote method call returns, - # Pyro will raise a :exc:`Pyro5.errors.TimeoutError`""") - - # async def _invoke(self, instruction, flags=0, objectId=None): - # """perform the remote method call communication""" - # self.__check_owner() - # current_context.response_annotations = {} - # if self._pyroConnection is None: - # self.__pyroCreateConnection() - # serializer = serializers.serializers[self._pyroSerializer or config.SERIALIZER] - # objectId = objectId or self._pyroConnection.objectId - # annotations = current_context.annotations - # if vargs and isinstance(vargs[0], SerializedBlob): - # # special serialization of a 'blob' that stays serialized - # data, flags = self.__serializeBlobArgs(vargs, kwargs, annotations, flags, objectId, methodname, serializer) - # else: - # # normal serialization of the remote call - # data = serializer.dumpsCall(objectId, methodname, vargs, kwargs) - # if methodname in self._pyroOneway: - # flags |= protocol.FLAGS_ONEWAY - # self._pyroSeq = (self._pyroSeq + 1) & 0xffff - # msg = protocol.SendingMessage(protocol.MSG_INVOKE, flags, self._pyroSeq, serializer.serializer_id, data, annotations=annotations) - # if config.LOGWIRE: - # protocol.log_wiredata(log, "proxy wiredata sending", msg) - # try: - # self._pyroConnection.send(msg.data) - # del msg # invite GC to collect the object, don't wait for out-of-scope - # if flags & protocol.FLAGS_ONEWAY: - # return None # oneway call, no response data - # else: - # msg = protocol.recv_stub(self._pyroConnection, [protocol.MSG_RESULT]) - # if config.LOGWIRE: - # protocol.log_wiredata(log, "proxy wiredata received", msg) - # self.__pyroCheckSequence(msg.seq) - # if msg.serializer_id != serializer.serializer_id: - # error = "invalid serializer in response: %d" % msg.serializer_id - # log.error(error) - # raise errors.SerializeError(error) - # if msg.annotations: - # current_context.response_annotations = msg.annotations - # if self._pyroRawWireResponse: - # 
return msg - # data = serializer.loads(msg.data) - # if msg.flags & protocol.FLAGS_ITEMSTREAMRESULT: - # streamId = bytes(msg.annotations.get("STRM", b"")).decode() - # if not streamId: - # raise errors.ProtocolError("result of call is an iterator, but the server is not configured to allow streaming") - # return _StreamResultIterator(streamId, self) - # if msg.flags & protocol.FLAGS_EXCEPTION: - # raise data # if you see this in your traceback, you should probably inspect the remote traceback as well - # else: - # return data - # except (errors.CommunicationError, KeyboardInterrupt): - # # Communication error during read. To avoid corrupt transfers, we close the connection. - # # Otherwise we might receive the previous reply as a result of a new method call! - # # Special case for keyboardinterrupt: people pressing ^C to abort the client - # # may be catching the keyboardinterrupt in their code. We should probably be on the - # # safe side and release the proxy connection in this case too, because they might - # # be reusing the proxy object after catching the exception... - # self._pyroRelease() - # raise - - # def __pyroCheckSequence(self, seq): - # if seq != self._pyroSeq: - # err = "invoke: reply sequence out of sync, got %d expected %d" % (seq, self._pyroSeq) - # log.error(err) - # raise errors.ProtocolError(err) - - # def __pyroCreateConnection(self, replaceUri=False, connected_socket=None): - # """ - # Connects this proxy to the remote Pyro daemon. Does connection handshake. - # Returns true if a new connection was made, false if an existing one was already present. 
- # """ - # def connect_and_handshake(conn): - # try: - # if self._pyroConnection is not None: - # return False # already connected - # if config.SSL: - # sslContext = socketutil.get_ssl_context(clientcert=config.SSL_CLIENTCERT, - # clientkey=config.SSL_CLIENTKEY, - # keypassword=config.SSL_CLIENTKEYPASSWD, - # cacerts=config.SSL_CACERTS) - # else: - # sslContext = None - # sock = socketutil.create_socket(connect=connect_location, - # reuseaddr=config.SOCK_REUSE, - # timeout=self.__pyroTimeout, - # nodelay=config.SOCK_NODELAY, - # sslContext=sslContext) - # conn = socketutil.SocketConnection(sock, uri.object) - # # Do handshake. - # serializer = serializers.serializers[self._pyroSerializer or config.SERIALIZER] - # data = {"handshake": self._pyroHandshake, "object": uri.object} - # data = serializer.dumps(data) - # msg = protocol.SendingMessage(protocol.MSG_CONNECT, 0, self._pyroSeq, serializer.serializer_id, - # data, annotations=current_context.annotations) - # if config.LOGWIRE: - # protocol.log_wiredata(log, "proxy connect sending", msg) - # conn.send(msg.data) - # msg = protocol.recv_stub(conn, [protocol.MSG_CONNECTOK, protocol.MSG_CONNECTFAIL]) - # if config.LOGWIRE: - # protocol.log_wiredata(log, "proxy connect response received", msg) - # except Exception as x: - # if conn: - # conn.close() - # err = "cannot connect to %s: %s" % (connect_location, x) - # log.error(err) - # if isinstance(x, errors.CommunicationError): - # raise - # else: - # raise errors.CommunicationError(err) from x - # else: - # handshake_response = "?" 
-        # if msg.data:
-        #     serializer = serializers.serializers_by_id[msg.serializer_id]
-        #     handshake_response = serializer.loads(msg.data)
-        #     if msg.type == protocol.MSG_CONNECTFAIL:
-        #         error = "connection to %s rejected: %s" % (connect_location, handshake_response)
-        #         conn.close()
-        #         log.error(error)
-        #         raise errors.CommunicationError(error)
-        #     elif msg.type == protocol.MSG_CONNECTOK:
-        #         self.__processMetadata(handshake_response["meta"])
-        #         handshake_response = handshake_response["handshake"]
-        #         self._pyroConnection = conn
-        #         self._pyroLocalSocket = conn.sock.getsockname()
-        #         if replaceUri:
-        #             self._pyroUri = uri
-        #         self._pyroValidateHandshake(handshake_response)
-        #         log.debug("connected to %s - %s - %s", self._pyroUri, conn.family(), "SSL" if sslContext else "unencrypted")
-        #         if msg.annotations:
-        #             current_context.response_annotations = msg.annotations
-        #     else:
-        #         conn.close()
-        #         err = "cannot connect to %s: invalid msg type %d received" % (connect_location, msg.type)
-        #         log.error(err)
-        #         raise errors.ProtocolError(err)
-
-        # self.__check_owner()
-        # if self._pyroConnection is not None:
-        #     return False    # already connected
-        # uri = core.resolve(self._pyroUri)
-        # # socket connection (normal or Unix domain socket)
-        # conn = None
-        # log.debug("connecting to %s", uri)
-        # connect_location = uri.sockname or (uri.host, uri.port)
-        # if connected_socket:
-        #     self._pyroConnection = socketutil.SocketConnection(connected_socket, uri.object, True)
-        #     self._pyroLocalSocket = connected_socket.getsockname()
-        # else:
-        #     connect_and_handshake(conn)
-        # # obtain metadata if this feature is enabled, and the metadata is not known yet
-        # if not self._pyroMethods and not self._pyroAttrs:
-        #     self._pyroGetMetadata(uri.object)
-        # return True
-
-    def _get_meta_data(self, objectId=None, known_metadata=None):
-        """
-        Get metadata from server (methods, attrs, oneway, ...) and remember them in some attributes of the proxy.
-        Usually this will already be known due to the default behavior of the connect handshake, where the
-        connect response also includes the metadata.
-        """
-        func = _RemoteMethod(self._client, '/proxy', 3)
-        reply = func()[4][self.instance_name]["returnValue"]
-        for name, data in reply.items():
-            if data[1] == FUNC:
-                self._server_methods.add(data[3])
-                addMethod(self, _RemoteMethod(self._client, data[0], self._max_retries), data)
-
-        # objectId = objectId or self._pyroUri.object
-        # log.debug("getting metadata for object %s", objectId)
-        # if self._pyroConnection is None and not known_metadata:
-        #     try:
-        #         self.__pyroCreateConnection()
-        #     except errors.PyroError:
-        #         log.error("problem getting metadata: cannot connect")
-        #         raise
-        #     if self._pyroMethods or self._pyroAttrs:
-        #         return    # metadata has already been retrieved as part of creating the connection
-        # try:
-        #     # invoke the get_metadata method on the daemon
-        #     result = known_metadata or self._pyroInvoke("get_metadata", [objectId], {}, objectId=core.DAEMON_NAME)
-        #     self.__processMetadata(result)
-        # except errors.PyroError:
-        #     log.exception("problem getting metadata")
-        #     raise
-
-    # def __processMetadata(self, metadata):
-    #     if not metadata:
-    #         return
-    #     self._pyroOneway = set(metadata["oneway"])
-    #     self._pyroMethods = set(metadata["methods"])
-    #     self._pyroAttrs = set(metadata["attrs"])
-    #     if log.isEnabledFor(logging.DEBUG):
-    #         log.debug("from meta: methods=%s, oneway methods=%s, attributes=%s",
-    #                   sorted(self._pyroMethods), sorted(self._pyroOneway), sorted(self._pyroAttrs))
-    #     if not self._pyroMethods and not self._pyroAttrs:
-    #         raise errors.PyroError("remote object doesn't expose any methods or attributes. Did you forget setting @expose on them?")
-
-    # def _pyroReconnect(self, tries=100000000):
-    #     """
-    #     (Re)connect the proxy to the daemon containing the pyro object which the proxy is for.
-    #     In contrast to the _pyroBind method, this one first releases the connection (if the proxy is still connected)
-    #     and retries making a new connection until it succeeds or the given amount of tries ran out.
-    #     """
-    #     self._pyroRelease()
-    #     while tries:
-    #         try:
-    #             self.__pyroCreateConnection()
-    #             return
-    #         except errors.CommunicationError:
-    #             tries -= 1
-    #             if tries:
-    #                 time.sleep(2)
-    #     msg = "failed to reconnect"
-    #     log.error(msg)
-    #     raise errors.ConnectionClosedError(msg)
-
-    # def _pyroInvokeBatch(self, calls, oneway=False):
-    #     flags = protocol.FLAGS_BATCH
-    #     if oneway:
-    #         flags |= protocol.FLAGS_ONEWAY
-    #     return self._pyroInvoke("", calls, None, flags)
-
-    # def _pyroValidateHandshake(self, response):
-    #     """
-    #     Process and validate the initial connection handshake response data received from the daemon.
-    #     Simply return without error if everything is ok.
-    #     Raise an exception if something is wrong and the connection should not be made.
-    #     """
-    #     return
-
-    # def _pyroClaimOwnership(self):
-    #     """
-    #     The current thread claims the ownership of this proxy from another thread.
-    #     Any existing connection will remain active!
-    #     """
-    #     if get_ident() != self.__pyroOwnerThread:
-    #         # if self._pyroConnection is not None:
-    #         #     self._pyroConnection.close()
-    #         #     self._pyroConnection = None
-    #         self.__pyroOwnerThread = get_ident()
-
-    # def __serializeBlobArgs(self, vargs, kwargs, annotations, flags, objectId, methodname, serializer):
-    #     """
-    #     Special handling of a "blob" argument that has to stay serialized until explicitly deserialized in client code.
-    #     This makes efficient, transparent gateways or dispatchers and such possible:
-    #     they don't have to de/reserialize the message and are independent from the serialized class definitions.
-    #     Annotations are passed in because some blob metadata is added. They're not part of the blob itself.
-    #     """
-    #     if len(vargs) > 1 or kwargs:
-    #         raise errors.SerializeError("if SerializedBlob is used, it must be the only argument")
-    #     blob = vargs[0]
-    #     flags |= protocol.FLAGS_KEEPSERIALIZED
-    #     # Pass the objectId and methodname separately in an annotation because currently,
-    #     # they are embedded inside the serialized message data. And we're not deserializing that,
-    #     # so we have to have another means of knowing the object and method it is meant for...
-    #     # A better solution is perhaps to split the actual remote method arguments from the
-    #     # control data (object + methodname) but that requires a major protocol change.
-    #     # The code below is not as nice but it works without any protocol change and doesn't
-    #     # require a hack either - so it's actually not bad like this.
-    #     import marshal
-    #     annotations["BLBI"] = marshal.dumps((blob.info, objectId, methodname))
-    #     if blob._contains_blob:
-    #         # directly pass through the already serialized msg data from within the blob
-    #         protocol_msg = blob._data
-    #         return protocol_msg.data, flags
-    #     else:
-    #         # replaces SerializedBlob argument with the data to be serialized
-    #         return serializer.dumpsCall(objectId, methodname, blob._data, kwargs), flags
-
-    def __check_owner_thread(self):
-        if get_ident() != self._owner_thread:
-            raise RuntimeError("the calling thread is not the owner of this proxy, \
-                create a new proxy in this thread or transfer ownership.")
-
-
-class _RemoteMethod(object):
-    """method call abstraction"""
-
-    def __init__(self, client : SyncZMQClient, instruction : str, max_retries : int) -> None:
-        self._client = client
-        self._instruction = instruction
-        self._max_retries = max_retries
-        self._loop = asyncio.get_event_loop()
-
-    def last_return_value(self):
-        return self._last_return_value
-
-    def __call__(self, *args, **kwargs) -> Any:
-        for attempt in range(self._max_retries + 1):
-            try:
-                self._last_return_value = self._client.execute_function(self._instruction, kwargs)
-                return self._last_return_value
-            except Exception as E:
-                print(E)
-
-
-
-class _RemoteAttribute(object):
-    """attribute access abstraction"""
-
-    def __init__(self, client, instruction, max_retries):
-        self._client = client
-        self._instruction = instruction
-        self._max_retries = max_retries
-
-    def last_value(self):
-        return self._last_value
-
-    def __call__(self, *args, **kwargs) -> Any:
-        for attempt in range(self._max_retries + 1):
-            try:
-                self._last_value = self._client.execute_function(self._instruction, kwargs)
-                return self._last_value
-            except Exception as E:
-                print(E)
-
-
-
-class _StreamResultIterator(object):
-    """
-    Pyro returns this as a result of a remote call which returns an iterator or generator.
-    It is a normal iterable and produces elements on demand from the remote iterator.
-    You can simply use it in for loops, list comprehensions etc.
-    """
-    def __init__(self, streamId, proxy):
-        self.streamId = streamId
-        self.proxy = proxy
-        self.pyroseq = proxy._pyroSeq
-
-    def __iter__(self):
-        return self
-
-    def __next__(self):
-        if self.proxy is None:
-            raise StopIteration
-        if self.proxy._pyroConnection is None:
-            raise errors.ConnectionClosedError("the proxy for this stream result has been closed")
-        self.pyroseq += 1
-        try:
-            return self.proxy._pyroInvoke("get_next_stream_item", [self.streamId], {}, objectId=core.DAEMON_NAME)
-        except (StopIteration, GeneratorExit):
-            # when the iterator is exhausted, the proxy is removed to avoid unneeded close_stream calls later
-            # (the server has closed its part of the stream by itself already)
-            self.proxy = None
-            raise
-
-    def __del__(self):
-        try:
-            self.close()
-        except Exception:
-            pass
-
-    def close(self):
-        if self.proxy and self.proxy._pyroConnection is not None:
-            if self.pyroseq == self.proxy._pyroSeq:
-                # we're still in sync, it's okay to use the same proxy to close this stream
-                self.proxy._pyroInvoke("close_stream", [self.streamId], {},
-                                       flags=protocol.FLAGS_ONEWAY, objectId=core.DAEMON_NAME)
-            else:
-                # The proxy's sequence number has diverged.
-                # One of the reasons this can happen is because this call is being done from python's GC where
-                # it decides to gc old iterator objects *during a new call on the proxy*.
-                # If we use the same proxy and do a call in between, the other call on the proxy will get an out of sync seq and crash!
-                # We create a temporary second proxy to call close_stream on. This is inefficient, but avoids the problem.
-                with contextlib.suppress(errors.CommunicationError):
-                    with self.proxy.__copy__() as closingProxy:
-                        closingProxy._pyroInvoke("close_stream", [self.streamId], {},
-                                                 flags=protocol.FLAGS_ONEWAY, objectId=core.DAEMON_NAME)
-        self.proxy = None
-
-
-__all__ = ['ProxyClient']
-
diff --git a/hololinked/server/remote_object.py b/hololinked/server/remote_object.py
deleted file mode 100644
index 6935b63..0000000
--- a/hololinked/server/remote_object.py
+++ /dev/null
@@ -1,1079 +0,0 @@
-import asyncio
-from collections import deque
-import json
-import logging
-import inspect
-import os
-import threading
-import time
-import typing
-import datetime
-from enum import EnumMeta, Enum
-from dataclasses import asdict, dataclass
-from sqlalchemy import (Integer as DBInteger, String as DBString, JSON as DB_JSON, LargeBinary as DBBinary)
-from sqlalchemy import select
-from sqlalchemy.orm import Mapped, mapped_column, DeclarativeBase, MappedAsDataclass
-
-from ..param.parameterized import Parameterized, ParameterizedMetaclass
-from ..param.parameters import (String, ClassSelector, TupleSelector, TypedDict, Boolean,
-                        Selector, TypedKeyMappingsConstrainedDict)
-
-from .constants import (EVENT, GET, IMAGE_STREAM, JSONSerializable, instance_name_regex, CallableType, CALLABLE,
-                        ATTRIBUTE, READ, WRITE, log_levels, POST, ZMQ_PROTOCOLS, FILE)
-from .serializers import *
-from .exceptions import BreakInnerLoop
-from .decorators import get, post, remote_method
-from .data_classes import (GUIResources, HTTPServerEventData, HTTPServerResourceData, ProxyResourceData,
-                        HTTPServerResourceData, FileServerData, ScadaInfoData,
-                        ScadaInfoValidator)
-from .api_platform_utils import postman_item, postman_itemgroup
-from .database import BaseAsyncDB, BaseSyncDB
-from .utils import create_default_logger, get_signature, wrap_text
-from .api_platform_utils import *
-from .remote_parameter import FileServer, PlotlyFigure, ReactApp, RemoteParameter, RemoteClassParameters, Image
-from .remote_parameters import (Boolean as RemoteBoolean, ClassSelector as RemoteClassSelector,
-                        Integer as RemoteInteger)
-from .zmq_message_brokers import ServerTypes, EventPublisher, AsyncPollingZMQServer, Event
-
-
-
-class StateMachine:
-    """
-    A container class for state machine related logic, this is intended to be used by the
-    RemoteObject and its descendants.
-
-    Args:
-        initial_state (str): initial state of machine
-        states (Enum): Enum type holding enumeration of states
-        on_enter (Dict[str, Union[Callable, RemoteParameter]]): callbacks to be invoked when a certain state is entered.
-            It is to be specified as a dictionary with the states being the keys
-        on_exit (Dict[str, Union[Callable, RemoteParameter]]): callbacks to be invoked when a certain state is exited.
-            It is to be specified as a dictionary with the states being the keys
-
-    Attributes:
-        exists (bool): internally computed, True if states and initial_states are valid
-    """
-    initial_state = RemoteClassSelector(default=None, allow_None=True, constant=True, class_=(Enum, str))
-    exists = RemoteBoolean(default=False)
-    states = RemoteClassSelector(default=None, allow_None=True, constant=True, class_=(EnumMeta, tuple, list))
-    on_enter = TypedDict(default=None, allow_None=True, key_type=str)
-    on_exit = TypedDict(default=None, allow_None=True, key_type=str)
-    machine = TypedDict(default=None, allow_None=True, key_type=str, item_type=(list, tuple))
-
-    def __init__(self, states : typing.Union[EnumMeta, typing.List[str], typing.Tuple[str]], *,
-                initial_state : typing.Union[Enum, str],
-                on_enter : typing.Dict[str, typing.Union[typing.List[typing.Callable], typing.Callable]] = {},
-                on_exit : typing.Dict[str, typing.Union[typing.List[typing.Callable], typing.Callable]] = {},
-                push_state_change_event : bool = False,
-                **machine : typing.Iterable[typing.Union[typing.Callable, RemoteParameter]]) -> None:
-        self.on_enter = on_enter
-        self.on_exit = on_exit
-        # None cannot be passed in, but constant is necessary.
-        self.states = states
-        self.initial_state = initial_state
-        self.machine = machine
-        self.push_state_change_event = push_state_change_event
-        if push_state_change_event:
-            self.state_change_event = Event('state-change')
-
-    def _prepare(self, owner : 'RemoteObject') -> None:
-        if self.states is None and self.initial_state is None:
-            self.exists = False
-            self._state = None
-            return
-        elif self.initial_state not in self.states: # type: ignore
-            raise AttributeError("specified initial state {} not in Enum of states {}".format(self.initial_state,
-                                self.states))
-
-        self._state = self.initial_state
-        self.owner = owner
-        owner_parameters = owner.parameters.descriptors.values()
-        owner_methods = [obj[0] for obj in inspect.getmembers(owner, inspect.ismethod)]
-
-        if isinstance(self.states, list):
-            self.states = tuple(self.states)
-        if hasattr(self, 'state_change_event'):
-            self.state_change_event.publisher = owner.event_publisher
-
-        # first validate machine
-        for state, objects in self.machine.items():
-            if state in self:
-                for resource in objects:
-                    if hasattr(resource, 'scada_info'):
-                        assert isinstance(resource.scada_info, ScadaInfoValidator) # type: ignore
-                        if resource.scada_info.iscallable and resource.scada_info.obj_name not in owner_methods: # type: ignore
-                            raise AttributeError("Given object {} for state machine does not belong to class {}".format(
-                                resource, owner))
-                        if resource.scada_info.isparameter and resource not in owner_parameters: # type: ignore
-                            raise AttributeError("Given object {} - {} for state machine does not belong to class {}".format(
-                                resource.name, resource, owner))
-                        if resource.scada_info.state is None: # type: ignore
-                            resource.scada_info.state = self._machine_compliant_state(state) # type: ignore
-                        else:
-                            resource.scada_info.state = resource.scada_info.state + (self._machine_compliant_state(state), ) # type: ignore
-                    else:
-                        raise AttributeError(wrap_text(f"""Object {resource} not made remotely accessible.
-                            Use state machine with remote parameters and remote methods only"""))
-            else:
-                raise AttributeError("Given state {} not in states Enum {}".format(state, self.states.__members__))
-
-        # then the callbacks
-        for state, objects in self.on_enter.items():
-            if isinstance(objects, list):
-                self.on_enter[state] = tuple(objects) # type: ignore
-            elif not isinstance(objects, (list, tuple)):
-                self.on_enter[state] = (objects, ) # type: ignore
-            for obj in self.on_enter[state]: # type: ignore
-                if not isinstance(obj, CallableType):
-                    raise TypeError(f"on_enter accepts only methods. Given type {type(obj)}.")
-
-        for state, objects in self.on_exit.items():
-            if isinstance(objects, list):
-                self.on_exit[state] = tuple(objects) # type: ignore
-            elif not isinstance(objects, (list, tuple)):
-                self.on_exit[state] = (objects, ) # type: ignore
-            for obj in self.on_exit[state]: # type: ignore
-                if not isinstance(obj, CallableType):
-                    raise TypeError(f"on_exit accepts only methods. Given type {type(obj)}.")
-        self.exists = True
-
-    def __contains__(self, state : typing.Union[str, Enum]):
-        if isinstance(self.states, EnumMeta) and state not in self.states.__members__ and state not in self.states: # type: ignore
-            return False
-        elif isinstance(self.states, (tuple, list)) and state not in self.states:
-            return False
-        return True
-
-    def _machine_compliant_state(self, state) -> typing.Union[Enum, str]:
-        if isinstance(self.states, EnumMeta):
-            return self.states.__members__[state] # type: ignore
-        return state
-
-    def get_state(self) -> typing.Union[str, Enum, None]:
-        """return the current state. One can also access the property `current_state`.
-
-        Returns:
-            str: current state
-        """
-        return self._state
-
-    def set_state(self, value, push_event : bool = True, skip_callbacks : bool = False) -> None:
-        """set state of state machine. Also triggers state change callbacks
-        if any. One can also set using '=' operator of `current_state` property.
-        """
-        if value in self.states:
-            previous_state = self._state
-            self._state = value
-            if push_event and self.push_state_change_event:
-                self.state_change_event.push({self.owner.instance_name : value})
-            if isinstance(previous_state, Enum):
-                previous_state = previous_state.name
-            if previous_state in self.on_exit:
-                for func in self.on_exit[previous_state]: # type: ignore
-                    func(self.owner)
-            if isinstance(value, Enum):
-                value = value.name
-            if value in self.on_enter:
-                for func in self.on_enter[value]: # type: ignore
-                    func(self.owner)
-        else:
-            raise ValueError(wrap_text("""given state '{}' not in set of allowed states : {}.
-                """.format(value, self.states)
-            ))
-
-    current_state = property(get_state, set_state, None, doc="""
-        read and write current state of the state machine""")
-
-    def query(self, info : typing.Union[str, typing.List[str]]) -> typing.Any:
-        raise NotImplementedError("arbitrary querying of {} not possible".format(self.__class__.__name__))
-
-
-
-class RemoteObjectDB(BaseSyncDB):
-
-    class TableBase(DeclarativeBase):
-        pass
-
-    class RemoteObjectInfo(MappedAsDataclass, TableBase):
-        __tablename__ = "remote_objects"
-
-        instance_name : Mapped[str] = mapped_column(DBString, primary_key = True)
-        class_name : Mapped[str] = mapped_column(DBString)
-        http_server : Mapped[str] = mapped_column(DBString)
-        script : Mapped[str] = mapped_column(DBString)
-        args : Mapped[JSONSerializable] = mapped_column(DB_JSON)
-        kwargs : Mapped[JSONSerializable] = mapped_column(DB_JSON)
-        eventloop_name : Mapped[str] = mapped_column(DBString)
-        level : Mapped[int] = mapped_column(DBInteger)
-        level_type : Mapped[str] = mapped_column(DBString)
-
-        def json(self):
-            return asdict(self)
-
-    class Parameter(TableBase):
-        __tablename__ = "parameters"
-
-        id : Mapped[int] = mapped_column(DBInteger, primary_key = True, autoincrement = True)
-        instance_name : Mapped[str] = mapped_column(DBString)
-        name : Mapped[str] = mapped_column(DBString)
-        serialized_value : Mapped[bytes] = mapped_column(DBBinary)
-
-    @dataclass
-    class ParameterData:
-        name : str
-        value : typing.Any
-
-    def __init__(self, instance_name : str, serializer : BaseSerializer,
-                config_file : typing.Union[str, None] = None) -> None:
-        super().__init__(database = 'scadapyserver', serializer = serializer, config_file = config_file)
-        self.instance_name = instance_name
-
-    def fetch_own_info(self) -> RemoteObjectInfo:
-        with self.sync_session() as session:
-            stmt = select(self.RemoteObjectInfo).filter_by(instance_name = self.instance_name)
-            data = session.execute(stmt)
-            data = data.scalars().all()
-            if len(data) == 0:
-                return None
-            return data[0]
-
-    def read_all_parameters(self, deserialized : bool = True) -> typing.Sequence[typing.Union[Parameter,
-                                                                                ParameterData]]:
-        with self.sync_session() as session:
-            stmt = select(self.Parameter).filter_by(instance_name = self.instance_name)
-            data = session.execute(stmt)
-            existing_params = data.scalars().all()
-            if not deserialized:
-                return existing_params
-            else:
-                params_data = []
-                for param in existing_params:
-                    params_data.append(self.ParameterData(
-                        name = param.name,
-                        value = self.serializer.loads(param.serialized_value)
-                    ))
-                return params_data
-
-    def edit_parameter(self, parameter : RemoteParameter, value : typing.Any) -> None:
-        with self.sync_session() as session:
-            stmt = select(self.Parameter).filter_by(instance_name = self.instance_name, name = parameter.name)
-            data = session.execute(stmt)
-            param = data.scalar()
-            param.serialized_value = self.serializer.dumps(value)
-            session.commit()
-
-    def create_missing_db_parameters(self, parameters : typing.Dict[str, RemoteParameter]) -> None:
-        with self.sync_session() as session:
-            existing_params = self.read_all_parameters()
-            existing_names = [p.name for p in existing_params]
-            for name, new_param in parameters.items():
-                if name not in existing_names:
-                    param = self.Parameter(
-                        instance_name = self.instance_name,
-                        name = new_param.name,
-                        serialized_value = self.serializer.dumps(new_param.default)
-                    )
-                    session.add(param)
-            session.commit()
-
-
-
-ConfigInfo = Enum('LevelTypes', 'USER_MANAGED PRIMARY_HOST_WIDE PC_HOST_WIDE')
-
-
-class RemoteObjectMetaclass(ParameterizedMetaclass):
-
-    @classmethod
-    def __prepare__(cls, name, bases):
-        return TypedKeyMappingsConstrainedDict({},
-            type_mapping = dict(
-                state_machine = (StateMachine, type(None)),
-                instance_name = String,
-                log_level = Selector,
-                logger = ClassSelector,
-                logfile = String,
-                db_config_file = String,
-                object_info = RemoteParameter, # it should not be set by the user
-            ),
-            allow_unspecified_keys = True
-        )
-
-    def __new__(cls, __name, __bases, __dict : TypedKeyMappingsConstrainedDict):
-        return super().__new__(cls, __name, __bases, __dict._inner)
-
-    def __call__(mcls, *args, **kwargs):
-        instance = super().__call__(*args, **kwargs)
-        instance.__post_init__()
-        return instance
-
-    def _create_param_container(mcs, mcs_members : dict) -> None:
-        mcs._param_container = RemoteClassParameters(mcs, mcs_members)
-
-    @property
-    def parameters(mcs) -> RemoteClassParameters:
-        return mcs._param_container
-
-
-
-class RemoteSubobject(Parameterized, metaclass=RemoteObjectMetaclass):
-    """
-    Subclass from here for remote capable sub objects composed within remote object instance. Does not support
-    state machine, logger, serializers, dedicated message brokers etc.
-    """
-
-    instance_name = String(default=None, regex=instance_name_regex, constant=True,
-                        doc="""Unique string identifier of the instance used for many operations,
                        for example - creating zmq socket address, tables in databases, and to identify the instance
                        in the HTTP Server & scadapy.webdashboard client -
                        (http(s)://{domain and sub domain}/{instance name}). It is suggested to use
                        the class name along with a unique name {class name}/{some name}. Instance names must be unique
                        in your entire system.""") # type: ignore
-    events = RemoteParameter(readonly=True, URL_path='/events',
-                        doc="returns a dictionary with two fields ") # type: ignore
-    httpserver_resources = RemoteParameter(readonly=True, URL_path='/resources/http',
-                        doc="""""") # type: ignore
-    proxy_resources = RemoteParameter(readonly=True, URL_path='/resources/proxy',
-                        doc="""object's resources exposed to ProxyClient, similar to http_resources but differs
                        in details.""") # type: ignore
-
-
-    def __new__(cls, **kwargs):
-        """
-        custom defined __new__ method to assign some important attributes at instance creation time directly instead of
-        super().__init__(instance_name = val1, users_own_kw_argument1 = users_val1, ..., users_own_kw_argumentn = users_valn)
-        method. The lowest child's __init__ is always called first and then the code reaches the __init__ of RemoteObject.
-        Therefore, when the user passes arguments to his own RemoteObject descendant, they have to again pass some required
-        information (like instance_name) to the __init__ of super() a second time with proper keywords.
-        To avoid this hassle, we create this __new__. super().__init__() in a descendant is still not optional though.
- """ - obj = super().__new__(cls) - # objects created by us that require no validation but cannot be modified are called _internal_fixed_attributes - obj._internal_fixed_attributes = ['_internal_fixed_attributes', 'instance_resources', '_owner'] - # objects given by user which we need to validate (mostly descriptors) - obj.instance_name = kwargs.get('instance_name', None) - return obj - - def __post_init__(self): - self.instance_name : str - self.httpserver_resources : typing.Dict - self.proxy_resources : typing.Dict - self.events : typing.Dict - self._owner : typing.Optional[typing.Union[RemoteSubobject, RemoteObject]] - self._internal_fixed_attributes : typing.List[str] - - @property - def _event_publisher(self) -> EventPublisher: - try: - return self.event_publisher - except AttributeError: - top_owner = self._owner - while True: - if isinstance(top_owner, RemoteObject): - self.event_publisher = top_owner.event_publisher - return self.event_publisher - elif isinstance(top_owner, RemoteSubobject): - top_owner = top_owner._owner - else: - raise RuntimeError(wrap_text("""Error while finding owner of RemoteSubobject, - RemoteSubobject must be composed only within RemoteObject or RemoteSubobject, - otherwise there can be problems.""")) - - def __setattr__(self, __name: str, __value: typing.Any) -> None: - if __name == '_internal_fixed_attributes' or __name in self._internal_fixed_attributes: - # order of 'or' operation for above 'if' matters - if not hasattr(self, __name): - # allow setting of fixed attributes once - super().__setattr__(__name, __value) - else: - raise AttributeError( - wrap_text( - f""" - Attempted to set {__name} more than once. cannot assign a value to this variable after creation. 
- """ - )) - else: - super().__setattr__(__name, __value) - - - def _prepare_resources(self): - """ - this function analyses the members of the class which have 'scadapy' variable declared - and extracts information - """ - # The following dict is to be given to the HTTP server - httpserver_resources = dict( - GET = dict(), - POST = dict(), - PUT = dict(), - DELETE = dict(), - PATCH = dict(), - OPTIONS = dict() - ) - # The following dict will be given to the proxy client - proxy_resources = dict() - # The following dict will be used by the event loop - instance_resources : typing.Dict[str, ScadaInfoData] = dict() - # create URL prefix - self.full_URL_path_prefix = f'{self._owner.full_URL_path_prefix}/{self.instance_name}' if self._owner is not None else f'/{self.instance_name}' - - # First add methods and callables - for name, resource in inspect.getmembers(self, inspect.ismethod): - if hasattr(resource, 'scada_info'): - if not isinstance(resource.scada_info, ScadaInfoValidator): # type: ignore - raise TypeError("instance member {} has unknown sub-member 'scada_info' of type {}.".format( - resource, type(resource.scada_info))) # type: ignore - scada_info = resource.scada_info.create_dataclass(obj=resource, bound_obj=self) # type: ignore - # methods are already bound though - fullpath = "{}{}".format(self.full_URL_path_prefix, scada_info.URL_path) - if scada_info.iscallable: - for http_method in scada_info.http_method: - httpserver_resources[http_method][fullpath] = HTTPServerResourceData( - what=CALLABLE, - instance_name=self._owner.instance_name if self._owner is not None else self.instance_name, - fullpath=fullpath, - instruction=fullpath, - http_request_as_argument=scada_info.http_request_as_argument - ) - proxy_resources[fullpath] = ProxyResourceData( - what=CALLABLE, - instruction=fullpath, - module=getattr(resource, '__module__'), - name=getattr(resource, '__name__'), - qualname=getattr(resource, '__qualname__'), - doc=getattr(resource, '__doc__'), - 
kwdefaults=getattr(resource, '__kwdefaults__'), - defaults=getattr(resource, '__defaults__'), - ) - instance_resources[fullpath] = scada_info - # Other remote objects - for name, resource in inspect.getmembers(self, lambda o : isinstance(o, RemoteSubobject)): - if name == '_owner': - continue - elif isinstance(resource, RemoteSubobject): - resource._owner = self - resource._prepare_instance() - for http_method, resources in resource.httpserver_resources.items(): - httpserver_resources[http_method].update(resources) - proxy_resources.update(resource.proxy_resources) - instance_resources.update(resource.instance_resources) - # Events - for name, resource in inspect.getmembers(self, lambda o : isinstance(o, Event)): - assert isinstance(resource, Event) - resource._owner = self - resource.full_URL_path_prefix = self.full_URL_path_prefix - resource.publisher = self._event_publisher - httpserver_resources[GET]['{}/event{}'.format( - self.full_URL_path_prefix, resource.URL_path)] = HTTPServerEventData( - # event URL_path has '/' prefix - what=EVENT, - event_name=resource.name, - socket_address=self._event_publisher.socket_address - ) - # Parameters - for parameter in self.parameters.descriptors.values(): - if hasattr(parameter, 'scada_info'): - if not isinstance(parameter.scada_info, ScadaInfoValidator): # type: ignore - raise TypeError("instance member {} has unknown sub-member 'scada_info' of type {}.".format( - parameter, type(parameter.scada_info))) # type: ignore - # above condition is just a gaurd in case user does some unpredictable patching activities - scada_info = parameter.scada_info.create_dataclass(obj=parameter, bound_obj=self) # type: ignore - fullpath = "{}{}".format(self.full_URL_path_prefix, scada_info.URL_path) - if scada_info.isparameter: - read_http_method, write_http_method = scada_info.http_method - - httpserver_resources[read_http_method][fullpath] = HTTPServerResourceData( - what=ATTRIBUTE, - instance_name=self._owner.instance_name if self._owner 
is not None else self.instance_name, - fullpath=fullpath, - instruction=fullpath + '/' + READ - ) - if isinstance(parameter, Image) and parameter.streamable: - parameter.event._owner = self - parameter.event.full_URL_path_prefix = self.full_URL_path_prefix - parameter.event.publisher = self._event_publisher - httpserver_resources[GET]['{}/event{}'.format( - self.full_URL_path_prefix, parameter.event.URL_path)] = HTTPServerEventData( - what=EVENT, - event_name=parameter.event.name, - socket_address=self._event_publisher.socket_address, - ) - httpserver_resources[write_http_method][fullpath] = HTTPServerResourceData( - what=ATTRIBUTE, - instance_name=self._owner.instance_name if self._owner is not None else self.instance_name, - fullpath=fullpath, - instruction=fullpath + '/' + WRITE - ) - - proxy_resources[fullpath] = ProxyResourceData( - what=ATTRIBUTE, - instruction=fullpath, - module=__file__, - doc=parameter.__doc__, - name=scada_info.obj_name, - qualname=self.__class__.__name__ + '.' 
+ scada_info.obj_name, - # qualname is not correct probably, does not respect inheritance - kwdefaults=None, - defaults=None, - ) - if isinstance(parameter, FileServer): - read_http_method, _ = scada_info.http_method - fileserverpath = "{}/files{}".format(self.full_URL_path_prefix, scada_info.URL_path) - httpserver_resources[read_http_method][fileserverpath] = FileServerData( - what=FILE, - directory=parameter.directory, - fullpath=fileserverpath - ) - instance_resources[fullpath+'/'+READ] = scada_info - instance_resources[fullpath+'/'+WRITE] = scada_info - # The above for-loops can be used only once, the division is only for readability - # _internal_fixed_attributes - allowed to set only once - self._proxy_resources = proxy_resources - self._httpserver_resources = httpserver_resources - self.instance_resources = instance_resources - - - def _prepare_instance(self): - """ - iterates through the members of the Remote Object to identify the information that requires to be supplied - to the HTTPServer, ProxyClient and EventLoop. Called by default in the __init__ of RemoteObject. - """ - self._prepare_resources() - - @httpserver_resources.getter - def _get_httpserver_resources(self) -> typing.Dict[str, typing.Dict[str, typing.Any]]: - return self._httpserver_resources - - @proxy_resources.getter - def _get_proxy_resources(self) -> typing.Dict[str, typing.Dict[str, typing.Any]]: - return self._proxy_resources - - - -class RemoteObject(RemoteSubobject): - """ - Expose your python classes for HTTP methods by subclassing from here. - """ - __server_type__ = ServerTypes.USER_REMOTE_OBJECT - state_machine : StateMachine - - # objects given by user which we need to validate: - eventloop_name = String(default=None, constant=True, - doc = """internally managed, this value is the instance name of the eventloop where the object - is running. 
Multiple objects can accept requests in a single event loop.""") # type: ignore - log_level = Selector(objects=[logging.DEBUG, logging.INFO, logging.ERROR, - logging.CRITICAL, logging.ERROR], default = logging.INFO, allow_None = False, - doc="""One can either supply a logger or simply set this parameter to to create an internal logger - with specified level.""") # type: ignore - logger = ClassSelector(class_=logging.Logger, default=None, allow_None=True, - doc = """Logger object to print log messages, should be instance of logging.Logger(). default - logger is created if none is supplied.""") # type: ignore - logfile = String (default=None, allow_None=True, - doc="""Logs can be also be stored in a file when a valid filename is passed.""") # type: ignore - logger_remote_access = Boolean(default=False, - doc="""Set it to true to add a default RemoteAccessHandler to the logger""" ) - db_config_file = String (default=None, allow_None=True, - doc="""logs can be also be stored in a file when a valid filename is passed.""") # type: ignore - server_protocols = TupleSelector(default=None, allow_None=True, accept_list=True, - objects=[ZMQ_PROTOCOLS.IPC, ZMQ_PROTOCOLS.TCP], constant=True, - doc="""Protocols to be supported by the ZMQ Server that accepts requests for the RemoteObject - instance. Options are TCP, IPC or both, represented by the Enum ZMQ_PROTOCOLS. - Either pass one or both as list or tuple""") # type: ignore - proxy_serializer = ClassSelector(class_=(SerpentSerializer, JSONSerializer, PickleSerializer, str), # DillSerializer, - default='json', - doc="""The serializer that will be used for passing messages in zmq. For custom data - types which have serialization problems, you can subclass the serializers and implement - your own serialization options. 
Recommended serializer for exchange messages between - Proxy clients and server is Serpent and for HTTP serializer and server is JSON.""") # type: ignore - json_serializer = ClassSelector(class_=JSONSerializer, default=None, allow_None=True, - doc = """Serializer used for sending messages between HTTP server and remote object, - subclass JSONSerializer to implement undealt serialization options.""") # type: ignore - - # remote paramaters - object_info = RemoteParameter(readonly=True, URL_path='/object-info', - doc="obtained information about this object like the class name, script location etc.") # type: ignore - events : typing.Dict = RemoteParameter(readonly=True, URL_path='/events', - doc="returns a dictionary with two fields " ) # type: ignore - gui_resources : typing.Dict = RemoteParameter(readonly=True, URL_path='/resources/gui', - doc= """object's data read by scadapy webdashboard GUI client, similar to http_resources but differs - in details.""") # type: ignore - GUI = RemoteClassSelector(class_=ReactApp, default=None, allow_None=True, - doc= """GUI applied here will become visible at GUI tab of dashboard tool""") - - - def __new__(cls, **kwargs): - """ - custom defined __new__ method to assign some important attributes at instance creation time directly instead of - super().__init__(instance_name = val1 , users_own_kw_argument1 = users_val1, ..., users_own_kw_argumentn = users_valn) - method. The lowest child's __init__ is always called first and then the code reaches the __init__ of RemoteObject. - Therefore, when the user passes arguments to his own RemoteObject descendent, they have to again pass some required - information (like instance_name) to the __init__ of super() a second time with proper keywords. - To avoid this hassle, we create this __new__. super().__init__() in a descendent is still not optional though. 
- """ - obj = super().__new__(cls, **kwargs) - # objects given by user which we need to validate (descriptors) - obj.logfile = kwargs.get('logfile', None) - obj.log_level = kwargs.get('log_level', logging.INFO) - obj.logger_remote_access = obj.__class__.logger_remote_access if isinstance(obj.__class__.logger_remote_access, bool) else kwargs.get('logger_remote_access', False) - obj.logger = kwargs.get('logger', None) - obj.db_config_file = kwargs.get('db_config_file', None) - obj.eventloop_name = kwargs.get('eventloop_name', None) - obj.server_protocols = kwargs.get('server_protocols', (ZMQ_PROTOCOLS.IPC, ZMQ_PROTOCOLS.TCP)) - obj.json_serializer = kwargs.get('json_serializer', None) - obj.proxy_serializer = kwargs.get('proxy_serializer', 'json') - return obj - - - def __init__(self, **params) -> None: - # Signature of __new__ and __init__ is generally the same, however one reaches this __init__ - # through the child class. Currently it is not expected to pass the instance_name, log_level etc. 
- once through instantiation and once again through child class __init__ - for attr in ['instance_name', 'logger', 'log_level', 'logfile', 'db_config_file', 'eventloop_name', - 'server_protocols', 'json_serializer', 'proxy_serializer']: - params.pop(attr, None) - super().__init__(**params) - - # missing type definitions - self.eventloop_name : str - self.logfile : str - self.log_level : int - self.logger : logging.Logger - self.db_engine : RemoteObjectDB - self.server_protocols : typing.Tuple[Enum] - self.json_serializer : JSONSerializer - self.proxy_serializer : BaseSerializer - self.object_info : RemoteObjectDB.RemoteObjectInfo - - self._prepare_message_brokers() - self._prepare_state_machine() - - def __post_init__(self): - super().__post_init__() - # Never create events before _prepare_instance(), no checks in place - self._owner = None - self._prepare_instance() - self._prepare_DB() - self.logger.info("initialised RemoteObject of class {} with instance name {}".format( - self.__class__.__name__, self.instance_name)) - - def _prepare_message_brokers(self): - self.message_broker = AsyncPollingZMQServer( - instance_name=self.instance_name, - server_type=self.__server_type__, - protocols=self.server_protocols, json_serializer=self.json_serializer, - proxy_serializer=self.proxy_serializer - ) - self.json_serializer = self.message_broker.json_serializer - self.proxy_serializer = self.message_broker.proxy_serializer - self.event_publisher = EventPublisher(identity=self.instance_name, proxy_serializer=self.proxy_serializer, - json_serializer=self.json_serializer) - - def _create_object_info(self, script_path : typing.Optional[str] = None): - if not script_path: - try: - script_path = os.path.dirname(os.path.abspath(inspect.getfile(self.__class__))) - except: - script_path = '' - return RemoteObjectDB.RemoteObjectInfo( - instance_name = self.instance_name, - class_name = self.__class__.__name__, - script = script_path, - http_server = ConfigInfo.USER_MANAGED.name, - 
args = ConfigInfo.USER_MANAGED.name, - kwargs = ConfigInfo.USER_MANAGED.name, - eventloop_name = self.eventloop_name, - level = 0, - level_type = ConfigInfo.USER_MANAGED.name, - ) - - def _prepare_DB(self): - if not self.db_config_file: - self._object_info = self._create_object_info() - return - # 1. create engine - self.db_engine = RemoteObjectDB(instance_name = self.instance_name, serializer = self.proxy_serializer, - config_file = self.db_config_file) - # 2. create object metadata to be used by different types of clients - object_info = self.db_engine.fetch_own_info() - if object_info is None: - object_info = self._create_object_info() - self._object_info = object_info - # 3. enter parameters to DB if not already present - if self.object_info.class_name != self.__class__.__name__: - raise ValueError(wrap_text(f""" - Instance name and class name fetched from the database do not match the current RemoteObject class/subclass. - You might be reusing an instance name of another subclass and did not remove the old data from the database. - Please clean the database using database tools to start fresh. - """)) - self.db_engine.create_missing_db_parameters(self.__class__.parameters.db_init_objects) - # 4. 
read db_init and db_persist objects - for db_param in self.db_engine.read_all_parameters(): - try: - setattr(self, db_param.name, self.proxy_serializer.loads(db_param.value)) # type: ignore - except Exception as E: - self.logger.error(f"could not set attribute {db_param.name} due to error {E}") - - def _prepare_state_machine(self): - if hasattr(self, 'state_machine'): - self.state_machine._prepare(self) - self.logger.debug("setup state machine") - - @logger.getter - def _get_logger(self) -> logging.Logger: - return self._logger - - @logger.setter # type: ignore - def _set_logger(self, value : logging.Logger): - if value is None: - self._logger = create_default_logger('{}|{}'.format(self.__class__.__name__, self.instance_name), - self.log_level, self.logfile) - if self.logger_remote_access and not any(isinstance(handler, RemoteAccessHandler) - for handler in self.logger.handlers): - self.remote_access_handler = RemoteAccessHandler(instance_name='logger', maxlen=500, emit_interval=1) - self.logger.addHandler(self.remote_access_handler) - else: - for handler in self.logger.handlers: - if isinstance(handler, RemoteAccessHandler): - self.remote_access_handler = handler - else: - self._logger = value - - @object_info.getter - def _get_object_info(self): - try: - return self._object_info - except AttributeError: - return None - - @events.getter - def _get_events(self) -> typing.Dict[str, typing.Any]: - return { - event._event_unique_str.decode() : dict( - name = event.name, - instruction = event._event_unique_str.decode(), - owner = event.owner.__class__.__name__, - owner_instance_name = event.owner.instance_name, - address = self.event_publisher.socket_address - ) for event in self.event_publisher.events - } - - @gui_resources.getter - def _get_gui_resources(self): - gui_resources = GUIResources( - instance_name=self.instance_name, - events = self.events, - classdoc = self.__class__.__doc__.splitlines() if self.__class__.__doc__ is not None else None, - inheritance = 
[class_.__name__ for class_ in self.__class__.mro()], - GUI = self.GUI, - ) - for instruction, scada_info in self.instance_resources.items(): - if scada_info.iscallable: - gui_resources.methods[instruction] = self.proxy_resources[instruction].json() - gui_resources.methods[instruction]["scada_info"] = scada_info.json() - # to check - apparently the recursive json() calling does not reach inner depths of a dict, - # therefore we call json ourselves - gui_resources.methods[instruction]["owner"] = self.proxy_resources[instruction].qualname.split('.')[0] - gui_resources.methods[instruction]["owner_instance_name"] = scada_info.bound_obj.instance_name - gui_resources.methods[instruction]["type"] = 'classmethod' if isinstance(scada_info.obj, classmethod) else '' - gui_resources.methods[instruction]["signature"] = get_signature(scada_info.obj)[0] - elif scada_info.isparameter: - path_without_RW = instruction.rsplit('/', 1)[0] - if path_without_RW not in gui_resources.parameters: - gui_resources.parameters[path_without_RW] = self.__class__.parameters.webgui_info(scada_info.obj)[scada_info.obj.name] - gui_resources.parameters[path_without_RW]["instruction"] = path_without_RW - """ - The instruction part has to be cleaned up to be called as fullpath. Setting the full path back into - scada_info is not correct because the unbound method is used by multiple instances. 
- """ - gui_resources.parameters[path_without_RW]["owner_instance_name"] = scada_info.bound_obj.instance_name - if isinstance(scada_info.obj, PlotlyFigure): - gui_resources.parameters[path_without_RW]['default'] = None - gui_resources.parameters[path_without_RW]['visualization'] = { - 'type' : 'plotly', - 'plot' : scada_info.obj.__get__(self, type(self)), - 'sources' : scada_info.obj.data_sources, - 'actions' : { - scada_info.obj._action_stub.id : scada_info.obj._action_stub - }, - } - elif isinstance(scada_info.obj, Image): - gui_resources.parameters[path_without_RW]['default'] = None - gui_resources.parameters[path_without_RW]['visualization'] = { - 'type' : 'sse-video', - 'sources' : scada_info.obj.data_sources, - 'actions' : { - scada_info.obj._action_stub.id : scada_info.obj._action_stub - }, - } - return gui_resources - - @get(URL_path='/resources/postman-collection') - def postman_collection(self, domain_prefix : str = 'https://localhost:8080') -> postman_collection: - try: - return self._postman_collection - except AttributeError: - pass - parameters_folder = postman_itemgroup(name = 'parameters') - methods_folder = postman_itemgroup(name = 'methods') - events_folder = postman_itemgroup(name = 'events') - - collection = postman_collection( - info = postman_collection_info( - name = self.__class__.__name__, - description = "API endpoints available for Remote Object", - ), - item = [ - parameters_folder, - methods_folder - ] - ) - - for http_method, resource in self.httpserver_resources.items(): - # i.e. this information is generated only on the httpserver accessible resrouces... 
- for URL_path, httpserver_data in resource.items(): - if isinstance(httpserver_data, HTTPServerResourceData): - scada_info : ScadaInfoData - try: - scada_info = self.instance_resources[httpserver_data.instruction] - except KeyError: - parameter_path_without_RW = httpserver_data.instruction.rsplit('/', 1)[0] - scada_info = self.instance_resources[parameter_path_without_RW] - item = postman_item( - name = scada_info.obj_name, - request = postman_http_request( - description=scada_info.obj.__doc__, - url=domain_prefix + URL_path, - method=http_method, - ) - ) - if scada_info.isparameter: - parameters_folder.add_item(item) - elif scada_info.iscallable: - methods_folder.add_item(item) - - self._postman_collection = collection - return collection - - @post(URL_path='/resources/postman-collection/save') - def save_postman_collection(self, filename : typing.Optional[str] = None) -> None: - if filename is None: - filename = f'{self.__class__.__name__}_postman_collection.json' - with open(filename, 'w') as file: - json.dump(self.postman_collection().json(), file, indent = 4) - - @get('/parameters/names') - def _parameters(self): - return self.parameters.descriptors.keys() - - @get('/parameters/values') - def parameter_values(self, **kwargs) -> typing.Dict[str, typing.Any]: - """ - returns requested parameter values in a dict - """ - data = {} - if 'filter_by' in kwargs: - names = kwargs['filter_by'] - for requested_parameter in names.split(','): - data[requested_parameter] = self.parameters[requested_parameter].__get__(self, type(self)) - elif len(kwargs.keys()) != 0: - for rename, requested_parameter in kwargs.items(): - data[rename] = self.parameters[requested_parameter].__get__(self, type(self)) - else: - for parameter in self.parameters.descriptors.keys(): - data[parameter] = self.parameters[parameter].__get__(self, type(self)) - return data - - @get('/state') - def state(self): - if hasattr(self, 'state_machine'): - return self.state_machine.current_state - else: - 
return None - - # Example of get and post decorator - @post('/exit') - def exit(self) -> None: - raise BreakInnerLoop - - @get('/ping') - def ping(self) -> bool: - return True - - @get('/test/speed') - def _test_speed(self, value : typing.Any): - """ - This method returns whatever you give allowing speed test of different data types. - The message sent is first serialized by the client, deserialized by the server and in return direction - again serialized by server and deserialized by the client. oneway speed is twice the measured value. - """ - return value - - @get('/test/args/{int_arg:int}/{float_arg:float}/{str_arg:str}') - def _test_path_arguments(self, int_arg, float_arg, str_arg): - self.logger.info(f"passed arguments : int - {int_arg}, float - {float_arg}, str - {str_arg}") - - # example of remote_method decorator - @remote_method(URL_path='/log/console', http_method = POST) - def log_to_console(self, data : typing.Any = None, level : typing.Any = 'DEBUG') -> None: - if level not in log_levels.keys(): - self.logger.error("log level {} invalid. 
logging with level INFO.".format(level)) - if data is None: - self.logger.log(log_levels.get(level, logging.INFO), "{} is alive.".format(self.instance_name)) - else: - self.logger.log(log_levels.get(level, logging.INFO), "{}".format(data)) - - def query(self, info : typing.Union[str, typing.List[str]]) -> typing.Any: - raise NotImplementedError("arbitrary quering of {} currently not possible".format(self.__class__.__name__)) - - def run(self): - from .eventloop import EventLoop - _eventloop = asyncio.get_event_loop() - _eventloop.run_until_complete(EventLoop.run_single_target(self)) - - - -class ListHandler(logging.Handler): - - def __init__(self, log_list : typing.Optional[typing.List] = None): - super().__init__() - self.log_list : typing.List[typing.Dict] = [] if not log_list else log_list - - def emit(self, record : logging.LogRecord): - log_entry = self.format(record) - self.log_list.insert(0, { - 'level' : record.levelname, - 'timestamp' : datetime.datetime.fromtimestamp(record.created).strftime("%Y-%m-%dT%H:%M:%S.%f"), - 'message' : record.msg - }) - - - -class RemoteAccessHandler(logging.Handler, RemoteSubobject): - - def __init__(self, maxlen : int = 100, emit_interval : float = 1.0, **kwargs) -> None: - logging.Handler.__init__(self) - # self._last_time = datetime.datetime.now() - if not isinstance(emit_interval, (float, int)) or emit_interval < 1.0: - raise TypeError("Specify log emit interval as number greater than 1.0") - else: - self.emit_interval = emit_interval # datetime.timedelta(seconds=1.0) if not emit_interval else datetime.timedelta(seconds=emit_interval) - self.event = Event('log-events') - self.diff_logs = [] - self.maxlen = maxlen - self._push_events = False - - def get_maxlen(self): - return self._maxlen - - def set_maxlen(self, value): - self._maxlen = value - self._debug_logs = deque(maxlen=value) - self._info_logs = deque(maxlen=value) - self._warn_logs = deque(maxlen=value) - self._error_logs = deque(maxlen=value) - self._critical_logs 
= deque(maxlen=value) - self._execution_logs = deque(maxlen=value) - - maxlen = RemoteInteger(default=100, bounds=(1, None), crop_to_bounds=True, URL_path='/maxlen', - fget=get_maxlen, fset=set_maxlen ) - - - @post('/events/start/{type:str}') - def push_events(self, type : str, interval : float): - self.emit_interval = interval # datetime.timedelta(seconds=interval) - self._push_events = True - print("log event type", type) - if type == 'asyncio': - asyncio.get_event_loop().call_soon(lambda : asyncio.create_task(self.async_push_diff_logs())) - else: - self._events_thread = threading.Thread(target=self.push_diff_logs) - self._events_thread.start() - - @post('/events/stop') - def stop_events(self): - self._push_events = False - if self._events_thread: # No need to cancel asyncio event separately - self._events_thread.join() - self._owner.logger.debug(f"joined log event source with thread-id {self._events_thread.ident}") - self._events_thread = None - - def emit(self, record : logging.LogRecord): - log_entry = self.format(record) - info = { - 'level' : record.levelname, - 'timestamp' : datetime.datetime.fromtimestamp(record.created).strftime("%Y-%m-%dT%H:%M:%S.%f"), - 'message' : record.msg - } - if record.levelno < logging.INFO: - self._debug_logs.appendleft(info) - elif record.levelno >= logging.INFO and record.levelno < logging.WARN: - self._info_logs.appendleft(info) - elif record.levelno >= logging.WARN and record.levelno < logging.ERROR: - self._warn_logs.appendleft(info) - elif record.levelno >= logging.ERROR and record.levelno < logging.CRITICAL: - self._error_logs.appendleft(info) - elif record.levelno >= logging.CRITICAL: - self._critical_logs.appendleft(info) - self._execution_logs.appendleft(info) - - if self._push_events: - self.diff_logs.insert(0, info) - - def push_diff_logs(self): - while self._push_events: - # if datetime.datetime.now() - self._last_time > self.emit_interval and len(self.diff_logs) > 0: - time.sleep(self.emit_interval) - 
self.event.push(self.diff_logs) - self.diff_logs.clear() - # give time to collect final logs with certainty - self._owner.logger.info(f"ending log event source with thread-id {threading.get_ident()}") - time.sleep(self.emit_interval) - if self.diff_logs: - self.event.push(self.diff_logs) - self.diff_logs.clear() - # self._last_time = datetime.datetime.now() - - async def async_push_diff_logs(self): - while self._push_events: - # if datetime.datetime.now() - self._last_time > self.emit_interval and len(self.diff_logs) > 0: - await asyncio.sleep(self.emit_interval) - self.event.push(self.diff_logs) - self.diff_logs.clear() - # give time to collect final logs with certainty - await asyncio.sleep(self.emit_interval) - if self.diff_logs: - self.event.push(self.diff_logs) - self.diff_logs.clear() - # self._last_time = datetime.datetime.now() - - debug_logs = RemoteParameter(readonly=True, URL_path='/logs/debug') - @debug_logs.getter - def get_debug_logs(self): - return self._debug_logs - - warn_logs = RemoteParameter(readonly=True, URL_path='/logs/warn') - @warn_logs.getter - def get_warn_logs(self): - return self._warn_logs - - info_logs = RemoteParameter(readonly=True, URL_path='/logs/info') - @info_logs.getter - def get_info_logs(self): - return self._info_logs - - error_logs = RemoteParameter(readonly=True, URL_path='/logs/error') - @error_logs.getter - def get_error_logs(self): - return self._error_logs - - critical_logs = RemoteParameter(readonly=True, URL_path='/logs/critical') - @critical_logs.getter - def get_critical_logs(self): - return self._critical_logs - - execution_logs = RemoteParameter(readonly=True, URL_path='/logs/execution') - @execution_logs.getter - def get_execution_logs(self): - return self._execution_logs - - - -__all__ = ['RemoteObject', 'StateMachine', 'RemoteObjectDB', 'RemoteSubobject', 'ListHandler', 'RemoteAccessHandler'] diff --git a/hololinked/server/remote_parameter.py b/hololinked/server/remote_parameter.py deleted file mode 100644 
index 47781cd..0000000 --- a/hololinked/server/remote_parameter.py +++ /dev/null @@ -1,438 +0,0 @@ -import typing -import os -from enum import Enum - -from ..param.parameterized import Parameter, Parameterized, ClassParameters, raise_TypeError -from ..param.exceptions import raise_ValueError -from .decorators import ScadaInfoValidator -from .constants import GET, PUT, USE_OBJECT_NAME -from .zmq_message_brokers import Event - -try: - import plotly.graph_objects as go -except: - go = None - -__default_parameter_write_method__ = PUT - -__parameter_info__ = [ - 'allow_None' , 'class_member', 'constant', 'db_commit', - 'db_first_load', 'db_memorized', 'deepcopy_default', 'per_instance_descriptor', - 'default', 'doc', 'metadata', 'name', 'readonly' - # 'scada_info', 'parameter_type' # descriptor related info is also necessary - ] - - - -class RemoteParameter(Parameter): - - __slots__ = ['db_persist', 'db_init', 'db_commit', 'scada_info'] - - def __init__(self, default: typing.Any = None, *, doc : typing.Optional[str] = None, constant : bool = False, - readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, - http_method : typing.Tuple[typing.Optional[str], typing.Optional[str]] = (GET, PUT), - state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, - db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - class_member : bool = False, fget : typing.Optional[typing.Callable] = None, - fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, - deepcopy_default : bool = False, per_instance_descriptor : bool = False, - precedence : typing.Optional[float] = None, - ) -> None: - """Initialize a new Parameter object and store the supplied attributes: - - default: the owning class's value for the attribute represented - by this Parameter, which can be overridden in an instance. - - doc: docstring explaining what this parameter represents. 
- - constant: if true, the Parameter value can be changed only at - the class level or in a Parameterized constructor call. The - value is otherwise constant on the Parameterized instance, - once it has been constructed. - - readonly: if true, the Parameter value cannot ordinarily be - changed by setting the attribute at the class or instance - levels at all. The value can still be changed in code by - temporarily overriding the value of this slot and then - restoring it, which is useful for reporting values that the - _user_ should never change but which do change during code - execution. - - allow_None: if True, None is accepted as a valid value for - this Parameter, in addition to any other values that are - allowed. If the default value is defined as None, allow_None - is set to True automatically. - - db_memorized: if True, every read and write is stored in the database - and persists across instance destruction and creation. - - db_firstload: if True, only the first read is loaded from the database. - further reads and writes are not written to the database. if db_memorized - is True, this value is ignored. - - optional: some doc - - remote: set False to avoid exposing the variable for remote read - and write - - URL: resource locator under which the attribute is accessible through - HTTP. when remote is True and no value is supplied, the variable name - is used and underscores are replaced with dashes - - read_method: HTTP method for attribute read, default is GET - - write_method: HTTP method for attribute write, default is PUT - - read_time : some doc - - metadata: store your own JSON compatible metadata for the parameter - which gives useful (and modifiable) information about the parameter. - - label: optional text label to be used when this Parameter is - shown in a listing. If no label is supplied, the attribute name - for this parameter in the owning Parameterized object is used. 
- - per_instance: whether a separate Parameter instance will be - created for every Parameterized instance. True by default. - If False, all instances of a Parameterized class will share - the same Parameter object, including all validation - attributes (bounds, etc.). See also deep_copy, which is - conceptually similar but affects the Parameter value rather - than the Parameter object. - - deep_copy: controls whether the value of this Parameter will - be deepcopied when a Parameterized object is instantiated (if - True), or if the single default value will be shared by all - Parameterized instances (if False). For an immutable Parameter - value, it is best to leave deep_copy at the default of - False, so that a user can choose to change the value at the - Parameterized instance level (affecting only that instance) or - at the Parameterized class or superclass level (affecting all - existing and future instances of that class or superclass). For - a mutable Parameter value, the default of False is also appropriate - if you want all instances to share the same value state, e.g. if - they are each simply referring to a single global object like - a singleton. If instead each Parameterized should have its own - independently mutable value, deep_copy should be set to - True, but note that there is then no simple way to change the - value of this Parameter at the class or superclass level, - because each instance, once created, will then have an - independently deepcopied value. - - class_member : To make a ... - - pickle_default_value: whether the default value should be - pickled. Usually, you would want the default value to be pickled, - but there are rare cases where that would not be the case (e.g. - for file search paths that are specific to a certain system). - - precedence: a numeric value, usually in the range 0.0 to 1.0, - which allows the order of Parameters in a class to be defined in - a listing or e.g. in GUI menus. 
A negative precedence indicates - a parameter that should be hidden in such listings. - - default, doc, and precedence all default to None, which allows - inheritance of Parameter slots (attributes) from the owning-class' - class hierarchy (see ParameterizedMetaclass). - """ - - super().__init__(default=default, doc=doc, constant=constant, readonly=readonly, allow_None=allow_None, - per_instance_descriptor=per_instance_descriptor, deepcopy_default=deepcopy_default, - class_member=class_member, fget=fget, fset=fset, precedence=precedence) - self.db_persist = db_persist - self.db_init = db_init - self.db_commit = db_commit - if URL_path is not USE_OBJECT_NAME: - assert URL_path.startswith('/'), "URL path should start with a leading '/'" - self.scada_info = ScadaInfoValidator( - http_method = http_method, - URL_path = URL_path, - state = state, - isparameter = True - ) - - def _post_slot_set(self, slot : str, old : typing.Any, value : typing.Any) -> None: - if slot == 'owner' and self.owner is not None: - if self.scada_info.URL_path == USE_OBJECT_NAME: - self.scada_info.URL_path = '/' + self.name - self.scada_info.obj_name = self.name - # In principle the above could be done when setting name itself however to simplify - # we do it with owner. 
So we should always remember order of __set_name__ -> 1) attrib_name, - # 2) name and then 3) owner - super()._post_slot_set(slot, old, value) - - def _post_value_set(self, obj : Parameterized, value : typing.Any) -> None: - if (self.db_persist or self.db_commit) and hasattr(obj, 'db_engine') and hasattr(obj.db_engine, 'edit_parameter'): - obj.db_engine.edit_parameter(self, value) - return super()._post_value_set(obj, value) - - def query(self, info : typing.Union[str, typing.List[str]]) -> typing.Any: - if info == 'info': - state = self.__getstate__() - overloads = state.pop('overloads') - state["overloads"] = {"custom fset" : repr(overloads["fset"]) , "custom fget" : repr(overloads["fget"])} - owner_cls = state.pop('owner') - state["owner"] = repr(owner_cls) - return state - elif info in self.__slots__ or info in self.__parent_slots__: - if info == 'overloads': - overloads = getattr(self, info) - return {"custom fset" : repr(overloads["fset"]) , "custom fget" : repr(overloads["fget"])} - elif info == 'owner': - return repr(getattr(self, info)) - else: - return getattr(self, info) - elif isinstance(info, list): - requested_info = {} - for info_ in info: - if not isinstance(info_, str): - raise AttributeError("Invalid format for information : {} found in list of requested information. 
Only string is allowed".format(type(info_))) - requested_info[info_] = getattr(self, info_) - return requested_info - else: - raise AttributeError("requested information {} not found in parameter {}".format(info, self.name)) - - - -class VisualizationParameter(RemoteParameter): - # type shield from RemoteParameter - pass - - - -class PlotlyFigure(VisualizationParameter): - - __slots__ = ['data_sources', 'update_event_name', 'refresh_interval', 'polled', - '_action_stub'] - - def __init__(self, default_figure, *, - data_sources : typing.Dict[str, typing.Union[RemoteParameter, typing.Any]], - polled : bool = False, refresh_interval : typing.Optional[int] = None, - update_event_name : typing.Optional[str] = None, doc: typing.Union[str, None] = None, - URL_path : str = USE_OBJECT_NAME) -> None: - super().__init__(default=default_figure, doc=doc, constant=True, readonly=True, URL_path=URL_path, - http_method=(GET, PUT)) - self.data_sources = data_sources - self.refresh_interval = refresh_interval - self.update_event_name = update_event_name - self.polled = polled - - def _post_slot_set(self, slot : str, old : typing.Any, value : typing.Any) -> None: - if slot == 'owner' and self.owner is not None: - from ..webdashboard import RepeatedRequests, AxiosRequestConfig, EventSource - if self.polled: - if self.refresh_interval is None: - raise ValueError(f'for PlotlyFigure {self.name}, set refresh interval (ms) since its polled') - request = AxiosRequestConfig( - url=f'/parameters?{"&".join(f"{key}={value}" for key, value in self.data_sources.items())}', - # Here is where graphQL is very useful - method='get' - ) - self._action_stub = RepeatedRequests( - requests=request, - interval=self.refresh_interval, - ) - elif self.update_event_name: - if not isinstance(self.update_event_name, str): - raise ValueError(f'update_event_name for PlotlyFigure {self.name} must be a string') - request = EventSource(f'/event/{self.update_event_name}') - self._action_stub = request - else: - pass 
- - for field, source in self.data_sources.items(): - if isinstance(source, RemoteParameter): - if isinstance(source, EventSource): - raise RuntimeError("Parameter field not supported for event source, give str") - self.data_sources[field] = request.response[source.name] - elif isinstance(source, str): - if isinstance(source, RepeatedRequests) and source not in self.owner.parameters: # should be in remote parameters, not just parameter - raise ValueError(f'data_sources must be a string or RemoteParameter, type {type(source)} has been found') - self.data_sources[field] = request.response[source] - else: - raise ValueError(f'given source {source} invalid. Specify str for events or Parameter') - - return super()._post_slot_set(slot, old, value) - - def validate_and_adapt(self, value : typing.Any) -> typing.Any: - if self.allow_None and value is None: - return - if not go: - raise ImportError("plotly was not found/imported, install plotly to support PlotlyFigure parameter") - if not isinstance(value, go.Figure): - raise_TypeError(f"figure argument accepts only plotly.graph_objects.Figure, not type {type(value)}", - self) - return value - - @classmethod - def serialize(cls, value): - return value.to_json() - - - -class Image(VisualizationParameter): - - __slots__ = ['event', 'streamable', '_action_stub', 'data_sources'] - - def __init__(self, default : typing.Any = None, *, streamable : bool = True, doc : typing.Optional[str] = None, - constant : bool = False, readonly : bool = False, allow_None : bool = False, - URL_path : str = USE_OBJECT_NAME, - http_method : typing.Tuple[typing.Optional[str], typing.Optional[str]] = (GET, PUT), - state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None, - db_persist : bool = False, db_init : bool = False, db_commit : bool = False, - class_member : bool = False, fget : typing.Optional[typing.Callable] = None, - fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None, - 
deepcopy_default : bool = False, per_instance_descriptor : bool = False, - precedence : typing.Optional[float] = None) -> None: - super().__init__(default, doc=doc, constant=constant, readonly=readonly, allow_None=allow_None, - URL_path=URL_path, http_method=http_method, state=state, - db_persist=db_persist, db_init=db_init, db_commit=db_commit, class_member=class_member, - fget=fget, fset=fset, fdel=fdel, deepcopy_default=deepcopy_default, - per_instance_descriptor=per_instance_descriptor, precedence=precedence) - self.streamable = streamable - - def __set_name__(self, owner : typing.Any, attrib_name : str) -> None: - super().__set_name__(owner, attrib_name) - self.event = Event(attrib_name) - - def _post_value_set(self, obj : Parameterized, value : typing.Any) -> None: - super()._post_value_set(obj, value) - if value is not None: - print(f"pushing event {value[0:100]}") - self.event.push(value, serialize=False) - - def _post_slot_set(self, slot : str, old : typing.Any, value : typing.Any) -> None: - if slot == 'owner' and self.owner is not None: - from ..webdashboard import SSEVideoSource - request = SSEVideoSource(f'/event/image') - self._action_stub = request - self.data_sources = request.response - return super()._post_slot_set(slot, old, value) - - - - -class FileServer(RemoteParameter): - - __slots__ = ['directory'] - - def __init__(self, directory : str, *, doc : typing.Optional[str] = None, URL_path : str = USE_OBJECT_NAME, - class_member: bool = False, per_instance_descriptor: bool = False) -> None: - self.directory = self.validate_and_adapt_directory(directory) - super().__init__(default=self.load_files(self.directory), doc=doc, URL_path=URL_path, constant=True, - class_member=class_member, per_instance_descriptor=per_instance_descriptor) - - def validate_and_adapt_directory(self, value : str): - if not isinstance(value, str): - raise_TypeError(f"FileServer parameter not a string, but type {type(value)}", self) - if not os.path.isdir(value): - 
raise_ValueError(f"FileServer parameter directory '{value}' not a valid directory", self) - if not value.endswith('\\'): - value += '\\' - return value - - def load_files(self, directory : str): - return [f for f in os.listdir(directory) if os.path.isfile(os.path.join(directory, f))] - -class DocumentationFolder(FileServer): - - def __init__(self, directory : str, *, doc : typing.Optional[str] = None, URL_path : str = '/documentation', - class_member: bool = False, per_instance_descriptor: bool = False) -> None: - super().__init__(directory=directory, doc=doc, URL_path=URL_path, - class_member=class_member, per_instance_descriptor=per_instance_descriptor) - - - -class RemoteClassParameters(ClassParameters): - - @property - def db_persisting_objects(self): - try: - return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_persisting_remote_params') - except AttributeError: - paramdict = self.remote_objects - db_persisting_remote_params = {} - for name, desc in paramdict.items(): - if desc.db_persist: - db_persisting_remote_params[name] = desc - setattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_persisting_remote_params', db_persisting_remote_params) - return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_persisting_remote_params') - - @property - def db_init_objects(self) -> typing.Dict[str, RemoteParameter]: - try: - return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_init_remote_params') - except AttributeError: - paramdict = self.remote_objects - init_load_params = {} - for name, desc in paramdict.items(): - if desc.db_init or desc.db_persist: - init_load_params[name] = desc - setattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_init_remote_params', init_load_params) - return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_db_init_remote_params') - - @property - def remote_objects(self) -> typing.Dict[str, RemoteParameter]: - try: - return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_remote_params') - except 
AttributeError: - paramdict = super().descriptors - remote_params = {} - for name, desc in paramdict.items(): - if isinstance(desc, RemoteParameter): - remote_params[name] = desc - setattr(self.owner_cls, f'_{self.owner_cls.__name__}_remote_params', remote_params) - return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_remote_params') - - def webgui_info(self, for_remote_params : typing.Union[RemoteParameter, typing.Dict[str, RemoteParameter], None] = None): - info = {} - if isinstance(for_remote_params, dict): - objects = for_remote_params - elif isinstance(for_remote_params, RemoteParameter): - objects = { for_remote_params.name : for_remote_params } - else: - objects = self.remote_objects - for param in objects.values(): - state = param.__getstate__() - info[param.name] = dict( - scada_info = state.get("scada_info", None).create_dataclass(), - type = param.__class__.__name__, - owner = param.owner.__name__ - ) - for field in __parameter_info__: - info[param.name][field] = state.get(field, None) - return info - - @property - def visualization_parameters(self): - try: - return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_visualization_params') - except AttributeError: - paramdict = super().descriptors - visual_params = {} - for name, desc in paramdict.items(): - if isinstance(desc, VisualizationParameter): - visual_params[name] = desc - setattr(self.owner_cls, f'_{self.owner_cls.__name__}_visualization_params', visual_params) - return getattr(self.owner_cls, f'_{self.owner_cls.__name__}_visualization_params') - - - - -class batch_db_commit: - - def __enter__(self): - pass - - def __exit__(self): - pass - - - -class ReactApp: - pass - - - -__all__ = ['RemoteParameter'] \ No newline at end of file diff --git a/hololinked/server/serializers.py b/hololinked/server/serializers.py index e184f7f..b27a045 100644 --- a/hololinked/server/serializers.py +++ b/hololinked/server/serializers.py @@ -22,152 +22,84 @@ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE 
USE OR OTHER DEALINGS IN THE SOFTWARE. """ - -from collections import deque import json import pickle -import traceback -import serpent -# import dill +from msgspec import json, msgpack +import json as pythonjson import inspect import array import datetime import uuid import decimal import typing +import warnings from enum import Enum +from collections import deque from ..param.parameters import TypeConstrainedList, TypeConstrainedDict, TypedKeyMappingsConstrainedDict - - -dict_keys = type(dict().keys()) +from .constants import JSONSerializable, Serializers +from .utils import format_exception_as_json class BaseSerializer(object): - """Base class for (de)serializer implementations (which must be thread safe)""" - serializer_id = 0 # define uniquely in subclass + """ + Base class for (de)serializer implementations. All serializers must inherit this class + and overload dumps() and loads() to be usable by the ZMQ message brokers. Any serializer + that returns bytes on serialization and a python object on deserialization will be accepted. + Serialization and deserialization errors are passed as an invalid message type + (see ZMQ messaging contract) from the server side and an exception is raised on the client.
+ """ + + def __init__(self) -> None: + super().__init__() + self.type = None - def loads(self, data): + def loads(self, data) -> typing.Any: + "method called by ZMQ message brokers to deserialize data" raise NotImplementedError("implement in subclass") - def dumps(self, data): + def dumps(self, data) -> bytes: + "method called by ZMQ message brokers to serialize data" raise NotImplementedError("implement in subclass") - def _convertToBytes(self, data): - if type(data) is bytearray: + def convert_to_bytes(self, data) -> bytes: + if isinstance(data, bytes): + return data + if isinstance(data, bytearray): return bytes(data) - if type(data) is memoryview: + if isinstance(data, memoryview): return data.tobytes() - return data - - @classmethod - def register_class_to_dict(cls, clazz, converter, serpent_too=True): - """Registers a custom function that returns a dict representation of objects of the given class. - The function is called with a single parameter; the object to be converted to a dict.""" - raise NotImplementedError("Function register_class_to_dict has to be implemented") - - @classmethod - def unregister_class_to_dict(cls, clazz): - """Removes the to-dict conversion function registered for the given class. Objects of the class - will be serialized by the default mechanism again.""" - raise NotImplementedError("Function unregister_class_to_dict has to be implemented") - - @classmethod - def register_dict_to_class(cls, classname, converter): - """ - Registers a custom converter function that creates objects from a dict with the given classname tag in it. - The function is called with two parameters: the classname and the dictionary to convert to an instance of the class. - """ - raise NotImplementedError("Function register_dict_to_class has to be implemented") - - @classmethod - def unregister_dict_to_class(cls, classname): - """ - Removes the converter registered for the given classname. 
Dicts with that classname tag - will be deserialized by the default mechanism again. - """ - raise NotImplementedError("Function unregister_dict_to_class has to be implemented") - - @classmethod - def class_to_dict(cls, obj): - """ - Convert a non-serializable object to a dict. Partly borrowed from serpent. - """ - raise NotImplementedError("Function class_to_dict has to be implemented") - - @classmethod - def dict_to_class(cls, data): - """ - Recreate an object out of a dict containing the class name and the attributes. - Only a fixed set of classes are recognized. - """ - raise NotImplementedError("Function dict_to_class has to be implemented") - - def recreate_classes(self, literal): - raise NotImplementedError("Function class_to_dict has to be implemented") - + raise TypeError("serializer convert_to_bytes accepts only bytes, bytearray or memoryview") + +dict_keys = type(dict().keys()) class JSONSerializer(BaseSerializer): - """(de)serializer that wraps the json serialization protocol.""" - serializer_id = 1 # never change this + "(de)serializer that wraps the msgspec JSON serialization protocol, default serializer for all clients." 
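Reviewer note: the fallback-hook pattern that the new `JSONSerializer` delegates to `default()` (handed to msgspec as `enc_hook`) can be sketched with the stdlib `json` module — this is an illustration of the pattern, not the hololinked API itself:

```python
import json
from enum import Enum
from collections import deque

def default(obj):
    # fallback hook mirroring JSONSerializer.default above: objects exposing a
    # json() method win, Enums serialize by name, set-like containers become lists
    if hasattr(obj, 'json'):
        return obj.json()
    if isinstance(obj, Enum):
        return obj.name
    if isinstance(obj, (set, frozenset, deque)):
        return list(obj)
    raise TypeError(f"type {type(obj)} is not JSON serializable")

class Mode(Enum):
    IDLE = 1
    RUNNING = 2

# the hook is only consulted for types the encoder cannot handle natively
wire = json.dumps({"state": Mode.RUNNING, "tags": {"cam"}}, default=default).encode("utf-8")
print(wire)  # b'{"state": "RUNNING", "tags": ["cam"]}'
```

msgspec's `enc_hook` plays the same role as `json.dumps(default=...)`, which is why the class can keep a single `default()` implementation for both backends.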
+ _type_replacements = {} - def __init__(self, use_json_method = True, split_multiline_str = True) -> None: - self.use_json_method = use_json_method - self.prettify_strings = split_multiline_str + def __init__(self) -> None: super().__init__() + self.type = json - def dumps(self, data) -> bytes: - data = json.dumps(data, ensure_ascii=False, allow_nan=True, default=self.default) - return data.encode("utf-8") - - def dump(self, data : typing.Dict[str, typing.Any], file_desc) -> None: - json.dump(data, file_desc, ensure_ascii=False, allow_nan=True, default=self.default) - - def loads(self, data : typing.Union[bytearray, memoryview, bytes]) -> typing.Any: - data : str = self._convertToBytes(data).decode("utf-8") - try: - return json.loads(data) - except: - if len(data) == 0: - raise - elif data[0].isalpha(): - return data - else: - raise - - def load(cls, file_desc) -> typing.Dict[str, typing.Any]: - return json.load(file_desc) - - @classmethod - def general_dumps(cls, data) -> bytes: - data = json.dumps(data, ensure_ascii=False, allow_nan = True) - return data.encode("utf-8") - - @classmethod - def general_dump(cls, data : typing.Dict[str, typing.Any], file_desc) -> None: - json.dump(data, file_desc, ensure_ascii = False, allow_nan = True) - - @classmethod - def general_loads(cls, data : typing.Union[bytearray, memoryview, bytes]) -> typing.Dict[str, typing.Any]: - data = cls._convertToBytes(data).decode("utf-8") # type: ignore - return json.loads(data) + def loads(self, data : typing.Union[bytearray, memoryview, bytes]) -> JSONSerializable: + "method called by ZMQ message brokers to deserialize data" + return json.decode(self.convert_to_bytes(data)) + def dumps(self, data) -> bytes: + "method called by ZMQ message brokers to serialize data" + return json.encode(data, enc_hook=self.default) + @classmethod - def general_load(cls, file_desc) -> typing.Dict[str, typing.Any]: - return json.load(file_desc) - - def default(self, obj): - replacer = 
self._type_replacements.get(type(obj), None) - if replacer: - obj = replacer(obj) - if isinstance(obj, Enum): - return obj.name - if self.use_json_method and hasattr(obj, 'json'): + def default(cls, obj) -> JSONSerializable: + "method called if no serialization option was found." + if hasattr(obj, 'json'): # alternative to type replacement return obj.json() + if isinstance(obj, Enum): + return obj.name if isinstance(obj, (set, dict_keys, deque, tuple)): # json module can't deal with sets so we make a tuple out of it return list(obj) @@ -180,85 +112,155 @@ def default(self, obj): if isinstance(obj, decimal.Decimal): return str(obj) if isinstance(obj, Exception): - return { - "message" : str(obj), - "type" : repr(obj).split('(', 1)[0], - "traceback" : traceback.format_exc().splitlines(), - "notes" : obj.__notes__ if hasattr(obj, "__notes__") else None - }, - if hasattr(obj, 'to_json'): - return obj.to_json() + return format_exception_as_json(obj) if isinstance(obj, array.array): if obj.typecode == 'c': return obj.tostring() if obj.typecode == 'u': return obj.tounicode() return obj.tolist() + replacer = cls._type_replacements.get(type(obj), None) + if replacer: + return replacer(obj) raise TypeError("Given type cannot be converted to JSON : {}".format(type(obj))) - # return self.class_to_dict(obj) - + @classmethod - def register_type_replacement(cls, object_type, replacement_function): + def register_type_replacement(cls, object_type, replacement_function) -> None: + "register custom serialization function for a particular type" if object_type is type or not inspect.isclass(object_type): raise ValueError("refusing to register replacement for a non-type or the type 'type' itself") cls._type_replacements[object_type] = replacement_function +class PythonBuiltinJSONSerializer(JSONSerializer): + "(de)serializer that wraps the python builtin JSON serialization protocol." 
-class PickleSerializer(BaseSerializer): + def __init__(self) -> None: + super().__init__() + self.type = pythonjson + + def loads(self, data : typing.Union[bytearray, memoryview, bytes]) -> typing.Any: + "method called by ZMQ message brokers to deserialize data" + return pythonjson.loads(self.convert_to_bytes(data)) + + def dumps(self, data) -> bytes: + "method called by ZMQ message brokers to serialize data" + data = pythonjson.dumps(data, ensure_ascii=False, allow_nan=True, default=self.default) + return data.encode("utf-8") + + def dump(self, data : typing.Dict[str, typing.Any], file_desc) -> None: + "write JSON to file" + pythonjson.dump(data, file_desc, ensure_ascii=False, allow_nan=True, default=self.default) - serializer_id = 2 # never change this + def load(self, file_desc) -> JSONSerializable: + "load JSON from file" + return pythonjson.load(file_desc) - def dumps(self, data): - return pickle.dumps(data) - - def loads(self, data): - return pickle.loads(data) - -# class DillSerializer(BaseSerializer): +class PickleSerializer(BaseSerializer): + "(de)serializer that wraps the pickle serialization protocol, use with encryption for safety."
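Reviewer note: a short sketch of why the docstring warns about pickle — it round-trips arbitrary Python object graphs, but `pickle.loads()` on untrusted bytes can execute arbitrary code during deserialization, so it should only travel over encrypted/authenticated transports:

```python
import pickle

# pickle preserves the full object graph, including types the JSON serializer
# would have to flatten; that power is exactly why deserializing untrusted
# bytes is unsafe (pickle can invoke arbitrary callables while loading).
payload = {"position": (1.5, 2.5), "ids": [1, 2, 3]}
wire = pickle.dumps(payload)          # bytes, as the ZMQ broker expects
restored = pickle.loads(wire)         # identical object graph on the other side
print(restored == payload)            # True
```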
-# serializer_id = 4 + def __init__(self) -> None: + super().__init__() + self.type = pickle -# def dumps(self, data): -# return dill.dumps(data) + def dumps(self, data) -> bytes: + "method called by ZMQ message brokers to serialize data" + return pickle.dumps(data) -# def loads(self, data): -# return dill.loads(data) + def loads(self, data) -> typing.Any: + "method called by ZMQ message brokers to deserialize data" + return pickle.loads(self.convert_to_bytes(data)) -class SerpentSerializer(BaseSerializer): - """(de)serializer that wraps the serpent serialization protocol.""" - serializer_id = 3 # never change this - - def dumps(self, data): - return serpent.dumps(data, module_in_classname=True) +class MsgpackSerializer(BaseSerializer): + """ + (de)serializer that wraps the msgspec MessagePack serialization protocol, recommended serializer for ZMQ based + high speed applications. Set an instance of this serializer to both ``Thing.rpc_serializer`` and + ``hololinked.client.ObjectProxy``. 
+ """ - def loads(self, data): - return serpent.loads(data) - - @classmethod - def register_type_replacement(cls, object_type, replacement_function): - def custom_serializer(obj, serpent_serializer, outputstream, indentlevel): - replaced = replacement_function(obj) - if replaced is obj: - serpent_serializer.ser_default_class(replaced, outputstream, indentlevel) - else: - serpent_serializer._serialize(replaced, outputstream, indentlevel) - - if object_type is type or not inspect.isclass(object_type): - raise ValueError("refusing to register replacement for a non-type or the type 'type' itself") - serpent.register_class(object_type, custom_serializer) + def __init__(self) -> None: + super().__init__() + self.type = msgpack + def dumps(self, value) -> bytes: + return msgpack.encode(value) + def loads(self, value) -> typing.Any: + return msgpack.decode(self.convert_to_bytes(value)) + serializers = { + None : JSONSerializer, + 'json' : JSONSerializer, 'pickle' : PickleSerializer, - # 'dill' : DillSerializer, - 'JSON' : JSONSerializer, - 'Serpent' : SerpentSerializer, - None : SerpentSerializer + 'msgpack' : MsgpackSerializer, } + -__all__ = [ 'JSONSerializer', 'SerpentSerializer', 'PickleSerializer', # 'DillSerializer', - 'serializers', 'BaseSerializer'] \ No newline at end of file +try: + import serpent + + class SerpentSerializer(BaseSerializer): + """(de)serializer that wraps the serpent serialization protocol.""" + + def __init__(self) -> None: + super().__init__() + self.type = serpent + + def dumps(self, data) -> bytes: + "method called by ZMQ message brokers to serialize data" + return serpent.dumps(data, module_in_classname=True) + + def loads(self, data) -> typing.Any: + "method called by ZMQ message brokers to deserialize data" + return serpent.loads(self.convert_to_bytes(data)) + + @classmethod + def register_type_replacement(cls, object_type, replacement_function) -> None: + "register custom serialization function for a particular type" + def 
custom_serializer(obj, serpent_serializer, outputstream, indentlevel): + replaced = replacement_function(obj) + if replaced is obj: + serpent_serializer.ser_default_class(replaced, outputstream, indentlevel) + else: + serpent_serializer._serialize(replaced, outputstream, indentlevel) + + if object_type is type or not inspect.isclass(object_type): + raise ValueError("refusing to register replacement for a non-type or the type 'type' itself") + serpent.register_class(object_type, custom_serializer) + + serializers['serpent'] = SerpentSerializer +except ImportError: + pass + + + +def _get_serializer_from_user_given_options( + rpc_serializer : typing.Union[str, BaseSerializer], + json_serializer : typing.Union[str, JSONSerializer] + ) -> typing.Tuple[BaseSerializer, JSONSerializer]: + """ + Resolve serializers specified either as a string shorthand or as a serializer instance. + """ + if json_serializer in [None, 'json'] or isinstance(json_serializer, JSONSerializer): + json_serializer = json_serializer if isinstance(json_serializer, JSONSerializer) else JSONSerializer() + else: + raise ValueError("invalid JSON serializer option : {}".format(json_serializer)) + if isinstance(rpc_serializer, BaseSerializer): + if isinstance(rpc_serializer, PickleSerializer) or rpc_serializer.type == pickle: + warnings.warn("using pickle serializer which is unsafe, consider another like msgpack.", UserWarning) + elif rpc_serializer == 'json' or rpc_serializer is None: + rpc_serializer = json_serializer + elif isinstance(rpc_serializer, str): + rpc_serializer = serializers.get(rpc_serializer, JSONSerializer)() + else: + raise ValueError("invalid rpc serializer option : {}".format(rpc_serializer)) + return rpc_serializer, json_serializer + + + +__all__ = ['JSONSerializer', 'PickleSerializer', 'MsgpackSerializer', + 'serializers', 'BaseSerializer'] \ No newline at end of file diff --git a/hololinked/server/state_machine.py new file mode 100644
index 0000000..ecda160 --- /dev/null +++ b/hololinked/server/state_machine.py @@ -0,0 +1,205 @@ +import typing +import inspect +from types import FunctionType, MethodType +from enum import EnumMeta, Enum, StrEnum + +from ..param.parameterized import Parameterized +from .utils import getattr_without_descriptor_read +from .data_classes import RemoteResourceInfoValidator +from .property import Property +from .properties import ClassSelector, TypedDict, Boolean +from .events import Event + + + +class StateMachine: + """ + A container class for state machine related information. Each ``Thing`` class can only have one state machine + instantiated in a reserved class-level attribute ``state_machine``. The ``state`` attribute defined at the ``Thing`` + can be subscribed to for state change events from this state machine. + """ + initial_state = ClassSelector(default=None, allow_None=True, constant=True, class_=(Enum, str), + doc="initial state of the machine") # type: typing.Union[Enum, str] + states = ClassSelector(default=None, allow_None=True, constant=True, class_=(EnumMeta, tuple, list), + doc="list/enum of allowed states") # type: typing.Union[EnumMeta, tuple, list] + on_enter = TypedDict(default=None, allow_None=True, key_type=str, + doc="""callbacks to execute when a certain state is entered; + specified as a map with states as keys and lists of callbacks as values""") # typing.Dict[str, typing.List[typing.Callable]] + on_exit = TypedDict(default=None, allow_None=True, key_type=str, + doc="""callbacks to execute when a certain state is exited; + specified as a map with states as keys and lists of callbacks as values""") # typing.Dict[str, typing.List[typing.Callable]] + machine = TypedDict(default=None, allow_None=True, key_type=str, item_type=(list, tuple), + doc="the machine specification with state as key and objects as list") # typing.Dict[str, typing.List[typing.Callable, Property]] + valid = Boolean(default=False, readonly=True, fget=lambda self: self._valid, + doc="internally computed, True
if states, initial_state and the machine are valid") + + def __init__(self, + states : typing.Union[EnumMeta, typing.List[str], typing.Tuple[str]], *, + initial_state : typing.Union[StrEnum, str], push_state_change_event : bool = True, + on_enter : typing.Dict[str, typing.Union[typing.List[typing.Callable], typing.Callable]] = {}, + on_exit : typing.Dict[str, typing.Union[typing.List[typing.Callable], typing.Callable]] = {}, + **machine : typing.Dict[str, typing.Union[typing.Callable, Property]] + ) -> None: + """ + Parameters + ---------- + states: Enum + enumeration of states + initial_state: str + initial state of machine + push_state_change_event : bool, default True + when the state changes, an event is pushed with the new state + on_enter: Dict[str, Callable | Property] + callbacks to be invoked when a certain state is entered. It is to be specified + as a dictionary with the states being the keys + on_exit: Dict[str, Callable | Property] + callbacks to be invoked when a certain state is exited. + It is to be specified as a dictionary with the states being the keys + **machine: + state name: List[Callable, Property] + directly pass the state name as an argument along with the methods/properties which are allowed to execute + in that state + """ + self._valid = False + self.on_enter = on_enter + self.on_exit = on_exit + # None cannot be passed in, but constant is necessary.
+ self.states = states + self.initial_state = initial_state + self.machine = machine + self.state_change_event = None + if push_state_change_event: + self.state_change_event = Event('state-change') + + def _prepare(self, owner : Parameterized) -> None: + if self.states is None and self.initial_state is None: + self._valid = False + self._state = None + return + elif self.initial_state not in self.states: + raise AttributeError(f"specified initial state {self.initial_state} not in Enum of states {self.states}.") + + self._state = self._get_machine_compliant_state(self.initial_state) + self.owner = owner + owner_properties = owner.properties.descriptors.values() + owner_methods = [obj[0] for obj in inspect._getmembers(owner, inspect.ismethod, getattr_without_descriptor_read)] + + if isinstance(self.states, list): + self.states = tuple(self.states) # freeze the list of states + + # first validate machine + for state, objects in self.machine.items(): + if state in self: + for resource in objects: + if hasattr(resource, '_remote_info'): + assert isinstance(resource._remote_info, RemoteResourceInfoValidator) # type definition + if resource._remote_info.isaction and resource._remote_info.obj_name not in owner_methods: + raise AttributeError("Given object {} for state machine does not belong to class {}".format( + resource, owner)) + if resource._remote_info.isproperty and resource not in owner_properties: + raise AttributeError("Given object {} - {} for state machine does not belong to class {}".format( + resource.name, resource, owner)) + if resource._remote_info.state is None: + resource._remote_info.state = self._get_machine_compliant_state(state) + else: + resource._remote_info.state = resource._remote_info.state + (self._get_machine_compliant_state(state), ) + else: + raise AttributeError(f"Object {resource} was not made remotely accessible," + + " use state machine with properties and actions only.") + else: + raise AttributeError("Given state {} not in states Enum 
{}".format(state, self.states.__members__)) + + # then the callbacks + for state, objects in self.on_enter.items(): + if isinstance(objects, list): + self.on_enter[state] = tuple(objects) + elif not isinstance(objects, (list, tuple)): + self.on_enter[state] = (objects, ) + for obj in self.on_enter[state]: # type: ignore + if not isinstance(obj, (FunctionType, MethodType)): + raise TypeError(f"on_enter accepts only methods. Given type {type(obj)}.") + + for state, objects in self.on_exit.items(): + if isinstance(objects, list): + self.on_exit[state] = tuple(objects) # type: ignore + elif not isinstance(objects, (list, tuple)): + self.on_exit[state] = (objects, ) # type: ignore + for obj in self.on_exit[state]: # type: ignore + if not isinstance(obj, (FunctionType, MethodType)): + raise TypeError(f"on_exit accepts only methods. Given type {type(obj)}.") + self._valid = True + + def __contains__(self, state : typing.Union[str, StrEnum]): + if isinstance(self.states, EnumMeta) and state in self.states.__members__: + return True + elif isinstance(self.states, tuple) and state in self.states: + return True + return False + + def _get_machine_compliant_state(self, state) -> typing.Union[StrEnum, str]: + """ + In case of not using StrEnum or iterable of str, + this maps the enum of state to the state name. + """ + if isinstance(state, str): + return state + if isinstance(state, Enum): + return state.name + raise TypeError(f"cannot convert state to a string : {state} which is of type {type(state)}.") + + + def get_state(self) -> typing.Union[str, StrEnum, None]: + """ + return the current state. One can also access the property `current_state`. + + Returns + ------- + current state: str + """ + return self._state + + def set_state(self, value : typing.Union[str, StrEnum, Enum], push_event : bool = True, + skip_callbacks : bool = False) -> None: + """ + set state of state machine.
Also triggers state change callbacks if skip_callbacks=False and pushes a state + change event when push_event=True. One can also set state using '=' operator of `current_state` property in which case + callbacks will be called. If originally an enumeration for the list of allowed states was supplied, + then an enumeration member must be used to set state. If a list of strings were supplied, then a string is accepted. + + Raises + ------ + ValueError: + if the state is not found in the allowed states + """ + + if value in self.states: + previous_state = self._state + self._state = self._get_machine_compliant_state(value) + if push_event and self.state_change_event is not None and self.state_change_event.publisher is not None: + self.state_change_event.push(value) + if skip_callbacks: + return + if previous_state in self.on_exit: + for func in self.on_exit[previous_state]: + func(self.owner) + if self._state in self.on_enter: + for func in self.on_enter[self._state]: + func(self.owner) + else: + raise ValueError("given state '{}' not in set of allowed states : {}.".format(value, self.states)) + + current_state = property(get_state, set_state, None, + doc = """read and write current state of the state machine""") + + def has_object(self, object : typing.Union[Property, typing.Callable]) -> bool: + """ + returns True if specified object is found in any of the state machine states. + Supply unbound method for checking methods, as state machine is specified at class level + when the methods are unbound. 
+ """ + for objects in self.machine.values(): + if object in objects: + return True + return False + + diff --git a/hololinked/server/td.py b/hololinked/server/td.py new file mode 100644 index 0000000..6279633 --- /dev/null +++ b/hololinked/server/td.py @@ -0,0 +1,720 @@ +import typing +import socket +from dataclasses import dataclass, field + +from .data_classes import RemoteResourceInfoValidator +from .properties import * +from .constants import JSONSerializable +from .thing import Thing +from .properties import Property +from .events import Event + + + +@dataclass +class Schema: + """ + Base dataclass for all WoT schema; Implements a custom asdict method which replaces dataclasses' asdict + utility function + """ + + skip_keys = [] # override this to skip some dataclass attributes in the schema + + replacement_keys = { + 'context' : '@context', + 'htv_methodName' : 'htv:methodName' + } + + def asdict(self): + """dataclass fields as dictionary skip_keys and replacement_keys accounted""" + schema = dict() + for field, value in self.__dataclass_fields__.items(): + if getattr(self, field, NotImplemented) is NotImplemented or field in self.skip_keys: + continue + if field in self.replacement_keys: + schema[self.replacement_keys[field]] = getattr(self, field) + else: + schema[field] = getattr(self, field) + return schema + + @classmethod + def format_doc(cls, doc : str): + """strip tabs, newlines, whitespaces etc.""" + doc_as_list = doc.split('\n') + final_doc = [] + for line in doc_as_list: + line = line.lstrip('\n').rstrip('\n') + line = line.lstrip('\t').rstrip('\t') + line = line.lstrip('\n').rstrip('\n') + line = line.lstrip().rstrip() + final_doc.append(line) + return ''.join(final_doc) + + + +class JSONSchema: + """type restrictor converting python types to JSON schema types""" + + _allowed_types = ('string', 'object', 'array', 'number', 'integer', 'boolean', None) + + _replacements = { + int : 'integer', + float : 'number', + str : 'string', + bool : 
'boolean', + dict : 'object', + list : 'array', + tuple : 'array', + type(None) : 'null' + } + + _schemas = { + + } + + @classmethod + def is_allowed_type(cls, type : typing.Any) -> bool: + if type in JSONSchema._replacements.keys(): + return True + return False + + @classmethod + def get_type(cls, typ : typing.Any) -> str: + if not JSONSchema.is_allowed_type(typ): + raise TypeError(f"Object for wot-td has invalid type for JSON conversion. Given type - {typ}. " + + "Use JSONSchema.register_type_replacement() to make the type known.") + return JSONSchema._replacements[typ] + + @classmethod + def register_type_replacement(cls, type : typing.Any, json_schema_type : str, + schema : typing.Optional[typing.Dict[str, JSONSerializable]] = None) -> None: + """ + specify a python type as a JSON type. + schema only supported for array and object. + """ + if json_schema_type in JSONSchema._allowed_types: + JSONSchema._replacements[type] = json_schema_type + if schema is not None: + if json_schema_type not in ('array', 'object'): + raise ValueError(f"schema supported only for 'array' and 'object' JSON schema types, given type - {type}.") + JSONSchema._schemas[type] = schema + else: + raise TypeError(f"json schema replacement type must be one of allowed type - 'string', 'object', 'array', " + + f"'number', 'integer', 'boolean', 'null'. Given value {json_schema_type}") + + @classmethod + def is_supported(cls, typ: typing.Any) -> bool: + if typ in JSONSchema._schemas.keys(): + return True + return False + + @classmethod + def get(cls, typ : typing.Any): + """schema for array and object only supported""" + if not JSONSchema.is_supported(typ): + raise ValueError(f"Schema for {typ} not provided. register one with JSONSchema.register_type_replacement()") + return JSONSchema._schemas[typ] + + + +@dataclass +class InteractionAffordance(Schema): + """ + implements schema common to all interaction affordances.
+ concepts - https://www.w3.org/TR/wot-thing-description11/#interactionaffordance + """ + title : str + titles : typing.Optional[typing.Dict[str, str]] + description : str + descriptions : typing.Optional[typing.Dict[str, str]] + forms : typing.List["Form"] + # uri variables + + def __init__(self): + super().__init__() + + + +@dataclass +class DataSchema(Schema): + """ + implementes Dataschema attributes. + https://www.w3.org/TR/wot-thing-description11/#sec-data-schema-vocabulary-definition + """ + title : str + titles : typing.Optional[typing.Dict[str, str]] + description : str + descriptions : typing.Optional[typing.Dict[str, str]] + const : bool + default : typing.Optional[typing.Any] + readOnly : bool + writeOnly : bool # write only are to be considered actions with no return value + format : typing.Optional[str] + unit : typing.Optional[str] + type : str + oneOf : typing.Optional[typing.List[typing.Dict[str, JSONSerializable]]] + enum : typing.Optional[typing.List[typing.Any]] + + def __init__(self): + super().__init__() + + def build(self, property : Property, owner : Thing, authority : str) -> None: + """generates the schema""" + self.title = property.name # or property.label + if property.constant: + self.const = property.constant + if property.readonly: + self.readOnly = property.readonly + if property.fget is None: + self.default = property.default + if property.doc: + self.description = Schema.format_doc(property.doc) + if property.metadata and property.metadata.get("unit", None) is not None: + self.unit = property.metadata["unit"] + # if property.allow_None: + # if not hasattr(self, 'oneOf'): + # self.oneOf = [] + # if hasattr(self, 'type'): + # self.oneOf.append(dict(type=self.type)) + # del self.type + # if not any(types["type"] == None for types in self.oneOf): + # self.oneOf.append(dict(type=None)) + + + +@dataclass +class PropertyAffordance(InteractionAffordance, DataSchema): + """ + creates property affordance schema from ``property`` descriptor 
object + schema - https://www.w3.org/TR/wot-thing-description11/#propertyaffordance + """ + observable : bool + + _custom_schema_generators = dict() + + def __init__(self): + super().__init__() + + def build(self, property : Property, owner : Thing, authority : str) -> None: + """generates the schema""" + DataSchema.build(self, property, owner, authority) + + if property.observable: + self.observable = property.observable + + self.forms = [] + for index, method in enumerate(property._remote_info.http_method): + form = Form() + # index is the order for http methods for (get, set, delete), generally (GET, PUT, DELETE) + if (index == 1 and property.readonly) or index >= 2: + continue # delete property is not a part of WoT, we also mostly never use it so ignore. + elif index == 0: + form.op = 'readproperty' + elif index == 1: + form.op = 'writeproperty' + form.href = f"{authority}{owner._full_URL_path_prefix}{property._remote_info.URL_path}" + form.htv_methodName = method.upper() + self.forms.append(form.asdict()) + + @classmethod + def generate_schema(self, property : Property, owner : Thing, authority : str) -> typing.Dict[str, JSONSerializable]: + if not isinstance(property, Property): + raise TypeError(f"Property affordance schema can only be generated for Property. 
" + f"Given type {type(property)}") + if isinstance(property, (String, Filename, Foldername, Path)): + schema = StringSchema() + elif isinstance(property, (Number, Integer)): + schema = NumberSchema() + elif isinstance(property, Boolean): + schema = BooleanSchema() + elif isinstance(property, (List, TypedList, Tuple, TupleSelector)): + schema = ArraySchema() + elif isinstance(property, Selector): + schema = EnumSchema() + elif isinstance(property, (TypedDict, TypedKeyMappingsDict)): + schema = ObjectSchema() + elif isinstance(property, ClassSelector): + schema = OneOfSchema() + elif self._custom_schema_generators.get(property, NotImplemented) is not NotImplemented: + schema = self._custom_schema_generators[property]() + else: + raise TypeError(f"WoT schema generator for this descriptor/property is not implemented. type {type(property)}") + schema.build(property=property, owner=owner, authority=authority) + return schema.asdict() + + @classmethod + def register_descriptor(cls, descriptor : Property, schema_generator : "PropertyAffordance") -> None: + if not isinstance(descriptor, Property): + raise TypeError("custom schema generator can also be registered for Property." + + f" Given type {type(descriptor)}") + if not isinstance(schema_generator, PropertyAffordance): + raise TypeError("schema generator for Property must be subclass of PropertyAfforance. 
" + + f"Given type {type(schema_generator)}" ) + PropertyAffordance._custom_schema_generators[descriptor] = schema_generator + + +@dataclass +class BooleanSchema(PropertyAffordance): + """ + boolean schema - https://www.w3.org/TR/wot-thing-description11/#booleanschema + used by Boolean descriptor + """ + def __init__(self): + super().__init__() + + def build(self, property: Property, owner: Thing, authority: str) -> None: + """generates the schema""" + self.type = 'boolean' + PropertyAffordance.build(self, property, owner, authority) + + +@dataclass +class StringSchema(PropertyAffordance): + """ + string schema - https://www.w3.org/TR/wot-thing-description11/#stringschema + used by String, Filename, Foldername, Path descriptors + """ + pattern : typing.Optional[str] + + def __init__(self): + super().__init__() + + def build(self, property: Property, owner: Thing, authority: str) -> None: + """generates the schema""" + self.type = 'string' + PropertyAffordance.build(self, property, owner, authority) + if isinstance(property, String): + if property.regex is not None: + self.pattern = property.regex + + +@dataclass +class NumberSchema(PropertyAffordance): + """ + number schema - https://www.w3.org/TR/wot-thing-description11/#numberschema + used by String, Filename, Foldername, Path descriptors + """ + minimum : typing.Optional[typing.Union[int, float]] + maximum : typing.Optional[typing.Union[int, float]] + exclusiveMinimum : typing.Optional[bool] + exclusiveMaximum : typing.Optional[bool] + step : typing.Optional[typing.Union[int, float]] + + def __init__(self): + super().__init__() + + def build(self, property: Property, owner: Thing, authority: str) -> None: + """generates the schema""" + if isinstance(property, Integer): + self.type = 'integer' + elif isinstance(property, Number): # dont change order - one is subclass of other + self.type = 'number' + PropertyAffordance.build(self, property, owner, authority) + if property.bounds is not None: + if 
isinstance(property.bounds[0], (int, float)): # i.e. value is not None which is allowed by param + if not property.inclusive_bounds[0]: + self.exclusiveMinimum = property.bounds[0] + else: + self.minimum = property.bounds[0] + if isinstance(property.bounds[1], (int, float)): + if not property.inclusive_bounds[1]: + self.exclusiveMaximum = property.bounds[1] + else: + self.maximum = property.bounds[1] + if property.step: + self.multipleOf = property.step + + +@dataclass +class ArraySchema(PropertyAffordance): + """ + array schema - https://www.w3.org/TR/wot-thing-description11/#arrayschema + Used by List, Tuple, TypedList and TupleSelector + """ + + items : typing.Optional[typing.Dict[str, JSONSerializable]] + minItems : typing.Optional[int] + maxItems : typing.Optional[int] + + def __init__(self): + super().__init__() + + def build(self, property: Property, owner: Thing, authority: str) -> None: + """generates the schema""" + self.type = 'array' + PropertyAffordance.build(self, property, owner, authority) + self.items = [] + if isinstance(property, (List, Tuple, TypedList)) and property.item_type is not None: + if property.bounds: + if property.bounds[0]: + self.minItems = property.bounds[0] + if property.bounds[1]: + self.maxItems = property.bounds[1] + if isinstance(property.item_type, (list, tuple)): + for typ in property.item_type: + self.items.append(dict(type=JSONSchema.get_type(typ))) + else: + self.items.append(dict(type=JSONSchema.get_type(property.item_type))) + elif isinstance(property, TupleSelector): + objects = list(property.objects) + for obj in objects: + if any(types["type"] == JSONSchema._replacements.get(type(obj), None) for types in self.items): + continue + self.items.append(dict(type=JSONSchema.get_type(type(obj)))) + if len(self.items) > 1: + self.items = dict(oneOf=self.items) + + +@dataclass +class ObjectSchema(PropertyAffordance): + """ + object schema - https://www.w3.org/TR/wot-thing-description11/#objectschema + Used by TypedDict + """ 
+ properties : typing.Optional[typing.Dict[str, JSONSerializable]] + required : typing.Optional[typing.List[str]] + + def __init__(self): + super().__init__() + + def build(self, property: Property, owner: Thing, authority: str) -> None: + """generates the schema""" + PropertyAffordance.build(self, property, owner, authority) + properties = None + required = None + if hasattr(property, 'json_schema'): + # Code will not reach here for now as schema for typed dictionaries is not implemented yet. + properties = property.json_schema["properties"] + if property.json_schema.get("required", NotImplemented) is not NotImplemented: + required = property.json_schema["required"] + if not property.allow_None: + self.type = 'object' + if properties: + self.properties = properties + if required: + self.required = required + else: + schema = dict(type='object') + if properties: + schema['properties'] = properties + if required: + schema['required'] = required + if not hasattr(self, 'oneOf'): + self.oneOf = [] # oneOf is not initialised by the parent build, create it before appending + self.oneOf.append(schema) + + +@dataclass +class OneOfSchema(PropertyAffordance): + """ + custom schema to deal with ClassSelector to fill oneOf field correctly + https://www.w3.org/TR/wot-thing-description11/#dataschema + """ + properties : typing.Optional[typing.Dict[str, JSONSerializable]] + required : typing.Optional[typing.List[str]] + items : typing.Optional[typing.Dict[str, JSONSerializable]] + minItems : typing.Optional[int] + maxItems : typing.Optional[int] + # ClassSelector can technically have a JSON serializable as a class_ + + def __init__(self): + super().__init__() + + def build(self, property: Property, owner: Thing, authority: str) -> None: + """generates the schema""" + self.oneOf = [] + if isinstance(property, ClassSelector): + if not property.isinstance: + raise NotImplementedError("WoT TD for ClassSelector with isinstance set to False is not supported yet. 
" + + "Consider user this property in a different way.") + if isinstance(property.class_, (list, tuple)): + objects = list(property.class_) + else: + objects = [property.class_] + elif isinstance(property, Selector): + objects = list(property.objects) + else: + raise TypeError(f"EnumSchema and OneOfSchema supported only for Selector and ClassSelector. Given Type - {property}") + for obj in objects: + if any(types["type"] == JSONSchema._replacements.get(type(obj), None) for types in self.oneOf): + continue + if isinstance(property, ClassSelector): + if not JSONSchema.is_allowed_type(obj): + raise TypeError(f"Object for wot-td has invalid type for JSON conversion. Given type - {obj}. " + + "Use JSONSchema.register_replacements on hololinked.wot.td.JSONSchema object to recognise the type.") + subschema = dict(type=JSONSchema.get_type(obj)) + if JSONSchema.is_supported(obj): + subschema.update(JSONSchema.get(obj)) + self.oneOf.append(subschema) + elif isinstance(property, Selector): + if JSONSchema.get_type(type(obj)) == "null": + continue + self.oneOf.append(dict(type=JSONSchema.get_type(type(obj)))) + PropertyAffordance.build(self, property, owner, authority) + self.cleanup() + + def cleanup(self): + if len(self.oneOf) == 1: + oneOf = self.oneOf[0] + self.type = oneOf["type"] + if oneOf["type"] == 'object': + if oneOf.get("properites", NotImplemented) is not NotImplemented: + self.properties = oneOf["properties"] + if oneOf.get("required", NotImplemented) is not NotImplemented: + self.required = oneOf["required"] + elif oneOf["type"] == 'array': + if oneOf.get("items", NotImplemented) is not NotImplemented: + self.items = oneOf["items"] + if oneOf.get("maxItems", NotImplemented) is not NotImplemented: + self.minItems = oneOf["minItems"] + if oneOf.get("maxItems", NotImplemented) is not NotImplemented: + self.maxItems = oneOf["maxItems"] + del self.oneOf + + +class EnumSchema(OneOfSchema): + """ + custom schema to fill enum field of property affordance correctly + 
https://www.w3.org/TR/wot-thing-description11/#dataschema + """ + def __init__(self): + super().__init__() + + def build(self, property: Property, owner: Thing, authority: str) -> None: + """generates the schema""" + assert isinstance(property, Selector), f"EnumSchema compatible property is only Selector, not {property.__class__}" + self.enum = list(property.objects) + OneOfSchema.build(self, property, owner, authority) + + + + + +class Link: + pass + + +@dataclass +class ExpectedResponse(Schema): + """ + Form property. + schema - https://www.w3.org/TR/wot-thing-description11/#expectedresponse + """ + contentType : str + + def __init__(self): + super().__init__() + +@dataclass +class AdditionalExpectedResponse(Schema): + """ + Form property. + schema - https://www.w3.org/TR/wot-thing-description11/#additionalexpectedresponse + """ + success : bool + contentType : str + schema : typing.Optional[typing.Dict[str, typing.Any]] + + def __init__(self): + super().__init__() + + +@dataclass +class Form(Schema): + """ + Form hypermedia. + schema - https://www.w3.org/TR/wot-thing-description11/#form + """ + href : str + contentEncoding : typing.Optional[str] + security : typing.Optional[str] + scopes : typing.Optional[str] + response : typing.Optional[ExpectedResponse] + additionalResponses : typing.Optional[typing.List[AdditionalExpectedResponse]] + subprotocol : typing.Optional[str] + op : str + htv_methodName : str + subprotocol : str + contentType : typing.Optional[str] = field(default='application/json') + + def __init__(self): + super().__init__() + + + +@dataclass +class ActionAffordance(InteractionAffordance): + """ + creates action affordance schema from actions (or methods). 
+ schema - https://www.w3.org/TR/wot-thing-description11/#actionaffordance + """ + input : typing.Dict[str, JSONSerializable] + output : typing.Dict[str, JSONSerializable] + safe : bool + idempotent : bool + synchronous : bool + + def __init__(self): + super(InteractionAffordance, self).__init__() + + def build(self, action : typing.Callable, owner : Thing, authority : str) -> None: + assert isinstance(action._remote_info, RemoteResourceInfoValidator) + if action._remote_info.argument_schema: + self.input = action._remote_info.argument_schema + if action._remote_info.return_value_schema: + self.output = action._remote_info.return_value_schema + self.title = action.__name__ + if action.__doc__: + self.description = self.format_doc(action.__doc__) + self.safe = True + if (hasattr(owner, 'state_machine') and owner.state_machine is not None and + owner.state_machine.has_object(action._remote_info.obj)): + self.idempotent = False + else: + self.idempotent = True + self.synchronous = True + self.forms = [] + for method in action._remote_info.http_method: + form = Form() + form.op = 'invokeaction' + form.href = f'{authority}{owner._full_URL_path_prefix}{action._remote_info.URL_path}' + form.htv_methodName = method.upper() + self.forms.append(form.asdict()) + + @classmethod + def generate_schema(cls, action : typing.Callable, owner : Thing, authority : str) -> typing.Dict[str, JSONSerializable]: + schema = ActionAffordance() + schema.build(action=action, owner=owner, authority=authority) + return schema.asdict() + + +@dataclass +class EventAffordance(InteractionAffordance): + """ + creates event affordance schema from events. 
+ schema - https://www.w3.org/TR/wot-thing-description11/#eventaffordance + """ + subscription : str + data : typing.Dict[str, JSONSerializable] + + def __init__(self): + super().__init__() + + def build(self, event : Event, owner : Thing, authority : str) -> None: + form = Form() + form.op = "subscribeevent" + form.href = f"{authority}{owner._full_URL_path_prefix}{event.URL_path}" + form.contentType = "text/event-stream" + form.htv_methodName = "GET" + form.subprotocol = "sse" + self.forms = [form.asdict()] + + @classmethod + def generate_schema(cls, event : Event, owner : Thing, authority : str) -> typing.Dict[str, JSONSerializable]: + schema = EventAffordance() + schema.build(event=event, owner=owner, authority=authority) + return schema.asdict() + + +@dataclass +class VersionInfo: + """ + create version info. + schema - https://www.w3.org/TR/wot-thing-description11/#versioninfo + """ + instance : str + model : str + + +@dataclass +class SecurityScheme(Schema): + """ + create security scheme. + schema - https://www.w3.org/TR/wot-thing-description11/#sec-security-vocabulary-definition + """ + scheme: str + description : str + descriptions : typing.Optional[typing.Dict[str, str]] + proxy : typing.Optional[str] + + def __init__(self): + super().__init__() + + def build(self, name : str, instance): + self.scheme = 'nosec' + self.description = 'currently no security scheme supported - use cookie auth directly on hololinked.server.HTTPServer object' + return { name : self.asdict() } + + + +@dataclass +class ThingDescription(Schema): + """ + generate Thing Description schema of W3 Web of Things standard. 
+ Refer standard - https://www.w3.org/TR/wot-thing-description11 + Refer schema - https://www.w3.org/TR/wot-thing-description11/#thing + """ + context : typing.Union[typing.List[str], str, typing.Dict[str, str]] + type : typing.Optional[typing.Union[str, typing.List[str]]] + id : str + title : str + titles : typing.Optional[typing.Dict[str, str]] + description : str + descriptions : typing.Optional[typing.Dict[str, str]] + version : typing.Optional[VersionInfo] + created : typing.Optional[str] + modified : typing.Optional[str] + support : typing.Optional[str] + base : typing.Optional[str] + properties : typing.List[PropertyAffordance] + actions : typing.List[ActionAffordance] + events : typing.List[EventAffordance] + links : typing.Optional[typing.List[Link]] + forms : typing.Optional[typing.List[Form]] + security : typing.Union[str, typing.List[str]] + securityDefinitions : SecurityScheme + + skip_properties = ['expose', 'httpserver_resources', 'rpc_resources', 'gui_resources', + 'events', 'debug_logs', 'warn_logs', 'info_logs', 'error_logs', 'critical_logs', + 'thing_description', 'maxlen', 'execution_logs', 'GUI', 'object_info' ] + + skip_actions = ['_set_properties', '_get_properties', 'push_events', 'stop_events', + 'postman_collection'] + + def __init__(self): + super().__init__() + + def build(self, instance : Thing, authority = f"https://{socket.gethostname()}:8080", + allow_loose_schema : typing.Optional[bool] = False) -> typing.Dict[str, typing.Any]: + self.context = "https://www.w3.org/2022/wot/td/v1.1" + self.id = f"{authority}/{instance.instance_name}" + self.title = instance.__class__.__name__ + self.description = Schema.format_doc(instance.__doc__) if instance.__doc__ else "no class doc provided" + self.properties = dict() + self.actions = dict() + self.events = dict() + + # properties and actions + for resource in instance.instance_resources.values(): + if (resource.isproperty and resource.obj_name not in self.properties and + resource.obj_name not 
in self.skip_properties and hasattr(resource.obj, "_remote_info") and + resource.obj._remote_info is not None): + self.properties[resource.obj_name] = PropertyAffordance.generate_schema(resource.obj, instance, authority) + elif (resource.isaction and resource.obj_name not in self.actions and + resource.obj_name not in self.skip_actions and hasattr(resource.obj, '_remote_info')): + self.actions[resource.obj_name] = ActionAffordance.generate_schema(resource.obj, instance, authority) + # Events + for name, resource in vars(instance).items(): + if not isinstance(resource, Event): + continue + self.events[name] = EventAffordance.generate_schema(resource, instance, authority) + + self.security = 'unimplemented' + self.securityDefinitions = SecurityScheme().build('unimplemented', instance) + + return self.asdict() + + + +__all__ = [ + ThingDescription.__name__, + JSONSchema.__name__ +] \ No newline at end of file diff --git a/hololinked/server/thing.py b/hololinked/server/thing.py new file mode 100644 index 0000000..e137ceb --- /dev/null +++ b/hololinked/server/thing.py @@ -0,0 +1,575 @@ +import logging +import inspect +import os +import ssl +import typing +import warnings +import zmq + +from ..param.parameterized import Parameterized, ParameterizedMetaclass +from .constants import (LOGLEVEL, ZMQ_PROTOCOLS, HTTP_METHODS) +from .database import ThingDB, ThingInformation +from .serializers import _get_serializer_from_user_given_options, BaseSerializer, JSONSerializer +from .exceptions import BreakInnerLoop +from .action import action +from .data_classes import GUIResources, HTTPResource, RPCResource, get_organised_resources +from .utils import get_default_logger, getattr_without_descriptor_read +from .property import Property, ClassProperties +from .properties import String, ClassSelector, Selector, TypedKeyMappingsConstrainedDict +from .zmq_message_brokers import RPCServer, ServerTypes, AsyncPollingZMQServer, EventPublisher +from .state_machine import StateMachine +from 
.events import Event + + + +class ThingMeta(ParameterizedMetaclass): + """ + Metaclass for Thing, implements a ``__post_init__()`` call and instantiation of a container for properties' descriptor + objects. During instantiation of ``Thing``, first serializers, loggers and database connection are created, after which + the user ``__init__`` is called. In ``__post_init__()``, that runs after user's ``__init__()``, the exposed resources + are segregated while accounting for any ``Event`` objects or instance specific properties created during init. Properties + are also loaded from database at this time. One can overload ``__post_init__()`` for any operations that rely on properties + values loaded from database. + """ + + @classmethod + def __prepare__(cls, name, bases): + return TypedKeyMappingsConstrainedDict({}, + type_mapping = dict( + state_machine = (StateMachine, type(None)), + instance_name = String, + log_level = Selector, + logger = ClassSelector, + logfile = String, + db_config_file = String, + object_info = Property, # it should not be set by the user + ), + allow_unspecified_keys = True + ) + + def __new__(cls, __name, __bases, __dict : TypedKeyMappingsConstrainedDict): + return super().__new__(cls, __name, __bases, __dict._inner) + + def __call__(mcls, *args, **kwargs): + instance = super().__call__(*args, **kwargs) + instance.__post_init__() + return instance + + def _create_param_container(mcs, mcs_members : dict) -> None: + """ + creates ``ClassProperties`` instead of ``param``'s own ``Parameters`` + as the default container for descriptors. All properties have definitions + copied from ``param``. + """ + mcs._param_container = ClassProperties(mcs, mcs_members) + + @property + def properties(mcs) -> ClassProperties: + """ + returns ``ClassProperties`` instance instead of ``param``'s own + ``Parameters`` instance. See code of ``param``. 
+ """ + return mcs._param_container + + + +class Thing(Parameterized, metaclass=ThingMeta): + """ + Subclass from here to expose python objects on the network (with HTTP/TCP) or to other processes (ZeroMQ) + """ + + __server_type__ = ServerTypes.THING + + # local properties + instance_name = String(default=None, regex=r'[A-Za-z]+[A-Za-z_0-9\-\/]*', constant=True, remote=False, + doc="""Unique string identifier of the instance. This value is used for many operations, + for example - creating zmq socket address, tables in databases, and to identify the instance + in the HTTP Server - (http(s)://{domain and sub domain}/{instance name}). + If creating a big system, instance names are recommended to be unique.""") # type: str + logger = ClassSelector(class_=logging.Logger, default=None, allow_None=True, remote=False, + doc="""logging.Logger instance to print log messages. Default + logger with a IO-stream handler and network accessible handler is created + if none supplied.""") # type: logging.Logger + rpc_serializer = ClassSelector(class_=(BaseSerializer, str), + allow_None=True, default='json', remote=False, + doc="""Serializer used for exchanging messages with python RPC clients. Subclass the base serializer + or one of the available serializers to implement your own serialization requirements; or, register + type replacements. Default is JSON. Some serializers like MessagePack improve performance many times + compared to JSON and can be useful for data intensive applications within python.""") # type: BaseSerializer + json_serializer = ClassSelector(class_=(JSONSerializer, str), default=None, allow_None=True, remote=False, + doc="""Serializer used for exchanging messages with a HTTP clients, + subclass JSONSerializer to implement your own JSON serialization requirements; or, + register type replacements. 
Other types of serializers are currently not allowed for HTTP clients.""") # type: JSONSerializer + + # remote parameters + state = String(default=None, allow_None=True, URL_path='/state', readonly=True, + fget= lambda self : self.state_machine.current_state if hasattr(self, 'state_machine') else None, + doc="current state of the state machine if one is present; None indicates absence of a state machine.") #type: typing.Optional[str] + + httpserver_resources = Property(readonly=True, URL_path='/resources/http-server', + doc="object's resources exposed to HTTP client (through ``hololinked.server.HTTPServer.HTTPServer``)", + fget=lambda self: self._httpserver_resources ) # type: typing.Dict[str, HTTPResource] + rpc_resources = Property(readonly=True, URL_path='/resources/object-proxy', + doc="object's resources exposed to RPC client, similar to HTTP resources but differing in details.", + fget=lambda self: self._rpc_resources) # type: typing.Dict[str, RPCResource] + gui_resources = Property(readonly=True, URL_path='/resources/portal-app', + doc="""object's data read by hololinked-portal GUI client, similar to http_resources but differing + in details.""", + fget=lambda self: GUIResources().build(self)) # type: typing.Dict[str, typing.Any] + GUI = Property(default=None, allow_None=True, URL_path='/resources/web-gui', fget = lambda self : self._gui, + doc="GUI specified here will become visible at the GUI tab of the hololinked-portal dashboard tool") + object_info = Property(doc="contains information about this object like the class name, script location etc.", + URL_path='/object-info') # type: ThingInformation + + + def __new__(cls, *args, **kwargs): + obj = super().__new__(cls) + # defines some internal fixed attributes. 
attributes created by us that require no validation but + # cannot be modified are called _internal_fixed_attributes + obj._internal_fixed_attributes = ['_internal_fixed_attributes', 'instance_resources', + '_httpserver_resources', '_rpc_resources', '_owner'] + return obj + + + def __init__(self, *, instance_name : str, logger : typing.Optional[logging.Logger] = None, + serializer : typing.Optional[JSONSerializer] = None, **kwargs) -> None: + """ + Parameters + ---------- + instance_name: str + Unique string identifier of the instance. This value is used for many operations, + for example - creating zmq socket address, tables in databases, and to identify the instance in the HTTP Server - + (http(s)://{domain and sub domain}/{instance name}). + If creating a big system, instance names are recommended to be unique. + logger: logging.Logger, optional + logging.Logger instance to print log messages. Default logger with a IO-stream handler and network + accessible handler is created if none supplied. + serializer: JSONSerializer, optional + custom JSON serializer. To use separate serializer for python RPC clients and cross-platform + HTTP clients, use keyword arguments rpc_serializer and json_serializer and leave this argument at None. + **kwargs: + rpc_serializer: BaseSerializer | str, optional + Serializer used for exchanging messages with python RPC clients. If string value is supplied, + supported are 'msgpack', 'pickle', 'serpent', 'json'. Subclass the base serializer + ``hololinked.server.serializer.BaseSerializer`` or one of the available serializers to implement your + own serialization requirements; or, register type replacements. Default is JSON. Some serializers like + MessagePack improve performance many times compared to JSON and can be useful for data intensive + applications within python. The serializer supplied here must also be supplied to object proxy from + ``hololinked.client``. 
+ json_serializer: JSONSerializer, optional + serializer used for cross-platform HTTP clients. + use_default_db: bool, Default False + if True, default SQLite database is created where properties can be stored and loaded. There is no need to supply + any database credentials. This value can also be set as a class attribute, see docs. + logger_remote_access: bool, Default True + if False, network accessible handler is not attached to the logger. This value can also be set as a + class attribute, see docs. + db_config_file: str, optional + if not using a default database, supply a JSON configuration file to create a connection. Check documentation + of ``hololinked.server.database``. + + """ + if instance_name.startswith('/'): + instance_name = instance_name[1:] + if not isinstance(serializer, JSONSerializer) and serializer != 'json' and serializer is not None: + raise TypeError("serializer keyword argument must be a JSONSerializer. If one wishes to use separate serializers " + + "for python clients and HTTP clients, use rpc_serializer and json_serializer keyword arguments.") + rpc_serializer = serializer or kwargs.pop('rpc_serializer', 'json') + json_serializer = serializer if isinstance(serializer, JSONSerializer) else kwargs.pop('json_serializer', 'json') + rpc_serializer, json_serializer = _get_serializer_from_user_given_options( + rpc_serializer=rpc_serializer, + json_serializer=json_serializer + ) + super().__init__(instance_name=instance_name, logger=logger, + rpc_serializer=rpc_serializer, json_serializer=json_serializer, **kwargs) + + self._prepare_logger( + log_level=kwargs.get('log_level', None), + log_file=kwargs.get('log_file', None), + remote_access=kwargs.get('logger_remote_access', self.__class__.logger_remote_access if hasattr( + self.__class__, 'logger_remote_access') else False) + ) + self._prepare_state_machine() + self._prepare_DB(kwargs.get('use_default_db', False), kwargs.get('db_config_file', None)) + + + def __post_init__(self): + self._owner : 
typing.Optional[Thing] = None + self._internal_fixed_attributes : typing.List[str] + self._full_URL_path_prefix : str + self.rpc_server : typing.Optional[RPCServer] + self.message_broker : typing.Optional[AsyncPollingZMQServer] + self._event_publisher : typing.Optional[EventPublisher] + self._gui = None # placeholder for a future feature + self._prepare_resources() + self.load_properties_from_DB() + self.logger.info(f"initialised Thing class {self.__class__.__name__} with instance name {self.instance_name}") + + + @property + def properties(self): + return self.parameters + + + def __setattr__(self, __name: str, __value: typing.Any) -> None: + if __name == '_internal_fixed_attributes' or __name in self._internal_fixed_attributes: + # order of 'or' operation for above 'if' matters + if not hasattr(self, __name) or getattr(self, __name, None) is None: + # allow setting of fixed attributes once + super().__setattr__(__name, __value) + else: + raise AttributeError(f"Attempted to set {__name} more than once. " + + "Cannot assign a value to this variable after creation.") + else: + super().__setattr__(__name, __value) + + + def _prepare_resources(self): + """ + this method analyses the members of the class which have '_remote_info' variable declared + and extracts information necessary to make RPC functionality work. 
+ """ + # The following dict is to be given to the HTTP server + self._rpc_resources, self._httpserver_resources, self.instance_resources = get_organised_resources(self) + + + def _prepare_logger(self, log_level : int, log_file : str, remote_access : bool = False): + from .logger import RemoteAccessHandler + if self.logger is None: + self.logger = get_default_logger(self.instance_name, + logging.INFO if not log_level else log_level, + None if not log_file else log_file) + if remote_access: + if not any(isinstance(handler, RemoteAccessHandler) for handler in self.logger.handlers): + self._remote_access_loghandler = RemoteAccessHandler(instance_name='logger', + maxlen=500, emit_interval=1, logger=self.logger) + # thing has its own logger so we dont recreate one for + # remote access handler + self.logger.addHandler(self._remote_access_loghandler) + + if not isinstance(self, logging.Logger): + for handler in self.logger.handlers: + # if remote access is True or not, if a default handler is found make a variable for it anyway + if isinstance(handler, RemoteAccessHandler): + self._remote_access_loghandler = handler + + + def _prepare_state_machine(self): + if hasattr(self, 'state_machine'): + self.state_machine._prepare(self) + self.logger.debug("setup state machine") + + + def _prepare_DB(self, default_db : bool = False, config_file : str = None): + if not default_db and not config_file: + self.object_info + return + # 1. create engine + self.db_engine = ThingDB(instance=self, config_file=None if default_db else config_file, + serializer=self.rpc_serializer) # type: ThingDB + # 2. create an object metadata to be used by different types of clients + object_info = self.db_engine.fetch_own_info() + if object_info is not None: + self._object_info = object_info + # 3. 
enter properties to DB if not already present + if self.object_info.class_name != self.__class__.__name__: + raise ValueError("Fetched class name from database not matching with the " + + "current Thing class/subclass. You might be reusing an instance name of another subclass " + + "and did not remove the old data from database. Please clean the database using database tools to " + + "start fresh.") + + + @object_info.getter + def _get_object_info(self): + if not hasattr(self, '_object_info'): + self._object_info = ThingInformation( + instance_name = self.instance_name, + class_name = self.__class__.__name__, + script = os.path.dirname(os.path.abspath(inspect.getfile(self.__class__))), + http_server = "USER_MANAGED", + kwargs = "USER_MANAGED", + eventloop_instance_name = "USER_MANAGED", + level = "USER_MANAGED", + level_type = "USER_MANAGED" + ) + return self._object_info + + @object_info.setter + def _set_object_info(self, value): + self._object_info = ThingInformation(**value) + + + @action(URL_path='/properties', http_method=HTTP_METHODS.GET) + def _get_properties(self, **kwargs) -> typing.Dict[str, typing.Any]: + """ + fetch all properties, or only those specified by the ``names`` keyword argument + """ + data = {} + if len(kwargs) == 0: + for parameter in self.properties.descriptors.keys(): + data[parameter] = self.properties[parameter].__get__(self, type(self)) + elif 'names' in kwargs: + names = kwargs.get('names') + if not isinstance(names, (list, tuple, str)): + raise TypeError(f"Specify properties to be fetched as a list, tuple or comma-separated names. Given type {type(names)}") + if isinstance(names, str): + names = names.split(',') + for requested_parameter in names: + if not isinstance(requested_parameter, str): + raise TypeError(f"parameter name must be a string. 
Given type {type(requested_parameter)}")
+ data[requested_parameter] = self.properties[requested_parameter].__get__(self, type(self))
+ elif len(kwargs.keys()) != 0:
+ for rename, requested_parameter in kwargs.items():
+ data[rename] = self.properties[requested_parameter].__get__(self, type(self))
+ return data
+
+ @action(URL_path='/properties', http_method=HTTP_METHODS.PATCH)
+ def _set_properties(self, **values : typing.Dict[str, typing.Any]) -> None:
+ """
+ set properties whose names are specified by the keys of a dictionary
+
+ Parameters
+ ----------
+ values: Dict[str, Any]
+ dictionary of property names and their values
+ """
+ for name, value in values.items():
+ setattr(self, name, value)
+
+
+ @property
+ def event_publisher(self) -> EventPublisher:
+ """
+ object that owns the event publishing PUB socket, valid only after
+ ``run()`` is called, otherwise raises AttributeError.
+ """
+ try:
+ return self._event_publisher
+ except AttributeError:
+ raise AttributeError("event publisher not yet created.") from None
+
+ @event_publisher.setter
+ def event_publisher(self, value : EventPublisher) -> None:
+ if hasattr(self, '_event_publisher'):
+ raise AttributeError("Can set event publisher only once.")
+
+ def recusively_set_event_publisher(obj : Thing, publisher : EventPublisher) -> None:
+ for name, evt in inspect._getmembers(obj, lambda o: isinstance(o, Event), getattr_without_descriptor_read):
+ assert isinstance(evt, Event), "object is not an event"
+ # assertion above also serves as type narrowing
+ evt.publisher = publisher
+ evt._remote_info.socket_address = publisher.socket_address
+ for prop in obj.properties.descriptors.values():
+ if prop.observable:
+ assert isinstance(prop._observable_event, Event), "observable event logic error in event_publisher set"
+ prop._observable_event.publisher = publisher
+ prop._observable_event._remote_info.socket_address = publisher.socket_address
+ if (hasattr(obj, 'state_machine') and isinstance(obj.state_machine, StateMachine) and
obj.state_machine.state_change_event is not None):
+ obj.state_machine.state_change_event.publisher = publisher
+ obj.state_machine.state_change_event._remote_info.socket_address = publisher.socket_address
+ for name, subthing in inspect._getmembers(obj, lambda o: isinstance(o, Thing), getattr_without_descriptor_read):
+ if name == '_owner':
+ continue
+ recusively_set_event_publisher(subthing, publisher)
+ # do not reuse the loop variable here - obj is the function argument
+ obj._event_publisher = publisher
+
+ recusively_set_event_publisher(self, value)
+
+
+ @action(URL_path='/properties/db-reload', http_method=HTTP_METHODS.POST)
+ def load_properties_from_DB(self):
+ """
+ Load and apply property values which have ``db_init`` or ``db_persist``
+ set to ``True`` from the database
+ """
+ if not hasattr(self, 'db_engine'):
+ return
+ for name, resource in inspect._getmembers(self, lambda o : isinstance(o, Thing), getattr_without_descriptor_read):
+ if name == '_owner':
+ continue
+ missing_properties = self.db_engine.create_missing_properties(self.__class__.properties.db_init_objects,
+ get_missing_properties=True)
+ # 4.
read db_init and db_persist objects
+ for db_param in self.db_engine.get_all_properties():
+ try:
+ if db_param.name not in missing_properties:
+ setattr(self, db_param.name, db_param.value) # type: ignore
+ except Exception as ex:
+ self.logger.error(f"could not set attribute {db_param.name} due to error {str(ex)}")
+
+
+ @action(URL_path='/resources/postman-collection', http_method=HTTP_METHODS.GET)
+ def get_postman_collection(self, domain_prefix : str = None):
+ """
+ organised postman collection for this object
+ """
+ from .api_platform_utils import postman_collection
+ return postman_collection.build(instance=self,
+ domain_prefix=domain_prefix if domain_prefix is not None else self._object_info.http_server)
+
+ @action(URL_path='/resources/wot-td', http_method=HTTP_METHODS.GET)
+ def get_thing_description(self, authority : typing.Optional[str] = None):
+ # allow_loose_schema : typing.Optional[bool] = False):
+ """
+ generate thing description schema of Web of Things https://www.w3.org/TR/wot-thing-description11/.
+ one can use node-wot as a client for the object with the generated schema
+ (https://github.com/eclipse-thingweb/node-wot). Other WoT related tools based on TD will be compatible.
+ Composed Things that are not the top level object are currently not supported.
+
+ Parameters
+ ----------
+ authority: str, optional
+ protocol with DNS or protocol with hostname+port, for example 'https://my-pc:8080' or
+ 'http://my-pc:9090' or 'https://IT-given-domain-name'. If absent, a value will be automatically
+ given using ``socket.gethostname()`` and the port at which the last HTTPServer (``hololinked.server.HTTPServer``)
+ attached to this object was running.
+
+ Returns
+ -------
+ hololinked.wot.td.ThingDescription
+ represented as an object in python, gets automatically serialized to JSON when pushed out of the socket.
+ """
+ # allow_loose_schema: bool, optional, Default False
+ # Experimental properties, actions or events for which schema was not given will be supplied with a suitable
+ # value for node-wot to ignore validation or claim the accessed value as compliant with the schema.
+ # In other words, schema validation will always pass.
+ from .td import ThingDescription
+ return ThingDescription().build(self, authority or self._object_info.http_server,
+ allow_loose_schema=False) #allow_loose_schema)
+
+
+ @action(URL_path='/exit', http_method=HTTP_METHODS.POST)
+ def exit(self) -> None:
+ """
+ Exit the object without killing the eventloop that runs this object. If Thing was
+ started using the run() method, the eventloop is also killed. This method can
+ only be called remotely.
+ """
+ if self._owner is None:
+ raise BreakInnerLoop
+ else:
+ warnings.warn("call exit on the top object, composed objects cannot exit the loop.", RuntimeWarning)
+
+
+ def run(self,
+ zmq_protocols : typing.Union[typing.Sequence[ZMQ_PROTOCOLS],
+ ZMQ_PROTOCOLS] = ZMQ_PROTOCOLS.IPC,
+ # expose_eventloop : bool = False,
+ **kwargs
+ ) -> None:
+ """
+ Quick-start ``Thing`` server by creating a default eventloop & ZMQ servers. This
+ method is blocking until exit() is called.
+
+ Parameters
+ ----------
+ zmq_protocols: Sequence[ZMQ_PROTOCOLS] | ZMQ_PROTOCOLS, Default ZMQ_PROTOCOLS.IPC or "IPC"
+ zmq transport layers at which the object is exposed.
+ TCP - provides network access apart from HTTP - please supply a socket address additionally.
+ IPC - inter process communication - connection can be made from other processes running
+ locally within the same computer. No client on the network will be able to contact the object using
+ this transport. INPROC - one main python process spawns several threads, in one of which the ``Thing``
+ runs. The object can be contacted by a client on another thread but not from other processes
+ or the network. One may use more than one form of transport.
All requests made will be anyway queued internally + irrespective of origin. + + **kwargs + tcp_socket_address: str, optional + socket_address for TCP access, for example: tcp://0.0.0.0:61234 + """ + # expose_eventloop: bool, False + # expose the associated Eventloop which executes the object. This is generally useful for remotely + # adding more objects to the same event loop. + # dont specify http server as a kwarg, as the other method run_with_http_server has to be used + + context = zmq.asyncio.Context() + self.rpc_server = RPCServer( + instance_name=self.instance_name, + server_type=self.__server_type__.value, + context=context, + protocols=zmq_protocols, + rpc_serializer=self.rpc_serializer, + json_serializer=self.json_serializer, + socket_address=kwargs.get('tcp_socket_address', None), + logger=self.logger + ) + self.message_broker = self.rpc_server.inner_inproc_server + self.event_publisher = self.rpc_server.event_publisher + + from .eventloop import EventLoop + self.event_loop = EventLoop( + instance_name=f'{self.instance_name}/eventloop', + things=[self], + logger=self.logger, + rpc_serializer=self.rpc_serializer, + json_serializer=self.json_serializer, + expose=False, # expose_eventloop + ) + + if kwargs.get('http_server', None): + from .HTTPServer import HTTPServer + httpserver = kwargs.pop('http_server') + assert isinstance(httpserver, HTTPServer) + httpserver._zmq_protocol = ZMQ_PROTOCOLS.INPROC + httpserver._zmq_socket_context = context + httpserver._zmq_event_context = self.event_publisher.context + assert httpserver.all_ok + httpserver.tornado_instance.listen(port=httpserver.port, address=httpserver.address) + self.event_loop.run() + + + def run_with_http_server(self, port : int = 8080, address : str = '0.0.0.0', + # host : str = None, + allowed_clients : typing.Union[str, typing.Iterable[str]] = None, + ssl_context : ssl.SSLContext = None, # protocol_version : int = 1, + # network_interface : str = 'Ethernet', + **kwargs): + """ + Quick-start 
``Thing`` server by creating a default eventloop & servers. This
+ method is fully blocking.
+
+ Parameters
+ ----------
+ port: int
+ the port at which the HTTP server should be run (unique)
+ address: str
+ set custom IP address, default is 0.0.0.0 (all interfaces)
+ ssl_context: ssl.SSLContext | None
+ use it for a highly customized SSL context to provide encrypted communication
+ allowed_clients
+ serves requests and sets CORS only for these clients, other clients are rejected with 403. Unlike the pure CORS
+ feature, the server resource is not even executed if the client is not an allowed client.
+ **kwargs
+ certfile: str
+ alternative to SSL context, provide a certificate file to allow the server to create an SSL connection on its own
+ keyfile: str
+ alternative to SSL context, provide a key file to allow the server to create an SSL connection on its own
+ request_handler: RPCHandler
+ custom web request handler of your choice
+ event_handler: BaseHandler | EventHandler
+ custom event handler of your choice for handling events
+ """
+ # network_interface: str
+ # Currently there is no logic to detect the IP address (as externally visible) correctly, therefore please
+ # send the network interface name to retrieve the IP.
If a DNS server is present, you may leave this field + # host: str + # Host Server to subscribe to coordinate starting sequence of things & web GUI + + from .HTTPServer import HTTPServer + + http_server = HTTPServer( + [self.instance_name], logger=self.logger, serializer=self.json_serializer, + port=port, address=address, ssl_context=ssl_context, + allowed_clients=allowed_clients, + # network_interface=network_interface, + **kwargs, + ) + + self.run( + zmq_protocols=ZMQ_PROTOCOLS.INPROC, + http_server=http_server + ) + + + + + diff --git a/hololinked/server/utils.py b/hololinked/server/utils.py index 052cd5b..e57fdf4 100644 --- a/hololinked/server/utils.py +++ b/hololinked/server/utils.py @@ -1,204 +1,156 @@ -import datetime import sys -import uuid import logging import re import asyncio import inspect import typing -from typing import List -from ..param.exceptions import wrap_error_text as wrap_text - - - -def copy_parameters(src : str = 'D:/onedrive/desktop/dashboard/scada/scadapy/scadapy/param/parameters.py', - dst : str = 'D:/onedrive/desktop/dashboard/scada/scadapy/scadapy/server/remote_parameters.py') -> None: - - skip_classes = ['Infinity', 'resolve_path', 'normalize_path', 'BaseConstrainedList', 'TypeConstrainedList', - 'TypeConstrainedDict', 'TypedKeyMappingsConstrainedDict', 'Event'] - end_line = 'def hashable' - additional_imports = ['from ..param.parameters import (TypeConstrainedList, TypeConstrainedDict, abbreviate_paths,\n', - ' TypedKeyMappingsConstrainedDict, resolve_path, concrete_descendents, named_objs)\n' - 'from .remote_parameter import RemoteParameter\n', - 'from .constants import HTTP, PROXY, USE_OBJECT_NAME, GET, PUT'] - - def fetch_line() -> str: - with open(src, 'r') as file: - oldlines = file.readlines() - for line in oldlines: - yield line - - remote_init_kwargs = [ - '\t\t\tURL_path : str = USE_OBJECT_NAME, http_method : typing.Tuple[str, str] = (GET, PUT),\n', - '\t\t\tstate : typing.Optional[typing.Union[typing.List, typing.Tuple, 
str, Enum]] = None,\n', - '\t\t\tdb_persist : bool = False, db_init : bool = False, db_commit : bool = False,\n' - '\t\t\taccess_type : str = (HTTP, PROXY),\n', - ] - - remote_super_init = [ - '\t\t\tURL_path=URL_path, http_method=http_method, state=state, db_persist=db_persist,\n', - '\t\t\tdb_init=db_init, db_commit=db_commit, access_type=access_type)\n' - ] - - common_linegen = fetch_line() - newlines = [] - - def skip_to_init_doc(): - for line in common_linegen: - if 'doc : typing.Optional[str] = None' in line: - return line - else: - newlines.append(line) - - def skip_to_super_init_end(): - for line in common_linegen: - if 'precedence=precedence)' in line: - return line - else: - newlines.append(line) - - def is_function(line : str) -> bool: - if 'def ' in line and 'self' not in line and 'cls' not in line and 'obj' not in line: - return True - return False - - def next_line_after_skip_class_or_function() -> str: - for line_ in common_linegen: - if ('class ' in line_ and ':' in line_) or is_function(line_): - return line_ - - def process_current_line(line : str): - newline = line - if 'import ' in line and 'parameterized ' in line: - newlines_ = [line.replace('from .parameterized', 'from ..param.parameterized').replace('ParamOverrides,', ''), - next(common_linegen).replace('ParameterizedFunction, descendents,', ''), - *additional_imports] - newlines.extend(newlines_) - return - elif 'from collections import OrderedDict' in line: - newlines.append('from enum import Enum\n') - elif 'from .utils' in line or 'from .exceptions' in line: - newline = line.replace('from .', 'from ..param.') - elif 'class ' in line and ':' in line and line.startswith('class'): - if '(Parameter):' in line: - newline = line.replace('(Parameter):', '(RemoteParameter):') - newlines.append(newline) - else: - classname_with_inheritance = line.split(' ', 1)[1][:-2] # [:-2] for removing colon - classname_without_inheritance = classname_with_inheritance.split('(', 1)[0] - if 
classname_without_inheritance in skip_classes: - newline = next_line_after_skip_class_or_function() - process_current_line(newline) - return - else: - newlines.append(line) - newline = skip_to_init_doc() - newlines.append(newline) - newlines.extend(remote_init_kwargs) - newline = skip_to_super_init_end() - if newline: - newline = newline.replace('precedence=precedence)', 'precedence=precedence,') - newlines.append(newline) - newlines.extend(remote_super_init) - return - elif 'Parameter.__init__' in line: - newline = line.replace('Parameter.__init__', 'RemoteParameter.__init__') - elif is_function(line): - newline = next_line_after_skip_class_or_function() - process_current_line(newline) - return - newlines.append(newline) - - - for line in common_linegen: - process_current_line(line) - if end_line in line: - newlines.pop() - break - - with open(dst, 'w') as file: - file.writelines(newlines) - - -def unique_id() -> bytes: - """ - uuid.uuid4() in bytes - """ - return bytes(str(uuid.uuid4()), encoding = 'utf-8') - - -def current_datetime_ms_str(): - """ - default: YYYY-mm-dd_HH:MM:SS.fff e.g. 2022-12-01 13:00:00.123 - This default is (almost) ISO8601 compatible and is natively recognized by Pandas - """ - curtime = datetime.datetime.now() - return curtime.strftime('%Y%m%d_%H%M%S') + '{:03d}'.format(int(curtime.microsecond /1000)) - - -def dashed_URL(word : str) -> str: +import asyncio +import types +import traceback +import typing +import ifaddr + + + +def get_IP_from_interface(interface_name : str = 'Ethernet', adapter_name = None) -> str: + """ + Get IP address of specified interface. Generally necessary when connected to the network + through multiple adapters and a server binds to only one adapter at a time. + + Parameters + ---------- + interface_name: str + Ethernet, Wifi etc. 
+ adapter_name: optional, str + name of the adapter if available + + Returns + ------- + str: + IP address of the interface + """ + adapters = ifaddr.get_adapters(include_unconfigured=True) + for adapter in adapters: + if not adapter_name: + for ip in adapter.ips: + if interface_name == ip.nice_name: + if ip.is_IPv4: + return ip.ip + elif adapter_name == adapter.nice_name: + for ip in adapter.ips: + if interface_name == ip.nice_name: + if ip.is_IPv4: + return ip.ip + raise ValueError(f"interface name {interface_name} not found in system interfaces.") + + +def format_exception_as_json(exc : Exception) -> typing.Dict[str, typing.Any]: + """ + return exception as a JSON serializable dictionary + """ + return { + "message" : str(exc), + "type" : repr(exc).split('(', 1)[0], + "traceback" : traceback.format_exc().splitlines(), + "notes" : exc.__notes__ if hasattr(exc, "__notes__") else None + } + + +def pep8_to_dashed_URL(word : str) -> str: """ Make an underscored, lowercase form from the expression in the string. Example:: - >>> underscore("DeviceType") - 'device_type' - As a rule of thumb you can think of :func:`underscore` as the inverse of - :func:`camelize`, though there are cases where that does not hold:: - >>> camelize(underscore("IOError")) - 'IoError' + >>> pep8_to_dashed_URL("device_type") + 'device-type' """ - word = re.sub(r"([A-Z]+)([A-Z][a-z])", r'\1_\2', word) - word = re.sub(r"([a-z\d])([A-Z])", r'\1_\2', word) - return word.lower().replace('_', '-') + return re.sub(r'_+', '-', word.lstrip('_').rstrip('_')) -def create_default_logger(name : str, log_level : int = logging.INFO, logfile = None, +def get_default_logger(name : str, log_level : int = logging.INFO, log_file = None, format : str = '%(levelname)-8s - %(asctime)s:%(msecs)03d - %(name)s - %(message)s' ) -> logging.Logger: """ - the default logger used by most of scadapy package. StreamHandler is always created, pass logfile for a FileHandler - as well. 
+ the default logger used by most of hololinked package, when arguments are not modified. + StreamHandler is always created, pass log_file for a FileHandler as well. + + Parameters + ---------- + name: str + name of logger + log_level: int + log level + log_file: str + valid path to file + format: str + log format + + Returns + ------- + logging.Logger: + created logger """ logger = logging.getLogger(name) logger.setLevel(log_level) default_handler = logging.StreamHandler(sys.stdout) default_handler.setFormatter(logging.Formatter(format, datefmt='%Y-%m-%dT%H:%M:%S')) logger.addHandler(default_handler) - if logfile: - file_handler = logging.FileHandler(logfile) + if log_file: + file_handler = logging.FileHandler(log_file) file_handler.setFormatter(logging.Formatter(format, datefmt='%Y-%m-%dT%H:%M:%S')) logger.addHandler(file_handler) return logger -def run_coro_sync(coro): +def run_coro_sync(coro : typing.Coroutine): """ run coroutine synchronously """ - - eventloop = asyncio.get_event_loop() + try: + eventloop = asyncio.get_event_loop() + except RuntimeError: + eventloop = asyncio.new_event_loop() if eventloop.is_running(): - raise RuntimeError("asyncio event loop is already running, cannot setup coroutine to run sync, please await it or set schedule = True.") + raise RuntimeError(f"asyncio event loop is already running, cannot setup coroutine {coro.__name__} to run sync, please await it.") + # not the same as RuntimeError catch above. 
else: - eventloop.run_until_complete(coro) + return eventloop.run_until_complete(coro) -def run_coro_somehow(coro): +def run_callable_somehow(method : typing.Union[typing.Callable, typing.Coroutine]) -> typing.Any: """ - either schedule the coroutine or run it until its complete + run method if synchronous, or when async, either schedule a coroutine or run it until its complete """ - eventloop = asyncio.get_event_loop() + if not (asyncio.iscoroutinefunction(method) or asyncio.iscoroutine(method)): + return method() + try: + eventloop = asyncio.get_event_loop() + except RuntimeError: + eventloop = asyncio.new_event_loop() if eventloop.is_running(): - eventloop.call_soon(lambda : asyncio.create_task(coro)) + task = lambda : asyncio.create_task(method) # check later if lambda is necessary + eventloop.call_soon(task) else: - eventloop.run_until_complete(coro) + task = method + return eventloop.run_until_complete(task) -def get_signature(function : typing.Callable): - # Retrieve the names and types of arguments +def get_signature(callable : typing.Callable) -> typing.Tuple[typing.List[str], typing.List[type]]: + """ + Retrieve the names and types of arguments based on annotations for the given callable. 
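As a stdlib-only sketch of the annotation-based extraction that the `get_signature` helper in this diff performs (the example function `set_voltage` is illustrative and not part of the patch):

```python
import inspect
import typing

def get_signature(callable_obj: typing.Callable) -> typing.Tuple[typing.List[str], typing.List[type]]:
    # mirror the helper: collect argument names and annotated types,
    # mapping un-annotated arguments to None
    arg_names, arg_types = [], []
    for param in inspect.signature(callable_obj).parameters.values():
        arg_names.append(param.name)
        arg_types.append(param.annotation if param.annotation is not inspect.Parameter.empty else None)
    return arg_names, arg_types

# hypothetical action method of a Thing subclass
def set_voltage(value: float, channel: int, safe=True):
    pass

names, types_ = get_signature(set_voltage)
print(names)   # ['value', 'channel', 'safe']
print(types_)  # [<class 'float'>, <class 'int'>, None]
```

Parameters without annotations deliberately yield `None`, so downstream schema generation can skip them instead of failing.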
+
+ Parameters
+ ----------
+ callable: Callable
+ function or method (not tested with __call__)
+
+ Returns
+ -------
+ tuple: List[str], List[type]
+ argument names and types respectively
+ """
arg_names = []
arg_types = []
- for param in inspect.signature(function).parameters.values():
+ for param in inspect.signature(callable).parameters.values():
arg_name = param.name
arg_type = param.annotation if param.annotation != inspect.Parameter.empty else None
@@ -208,8 +160,36 @@ def get_signature(function : typing.Callable):
return arg_names, arg_types
-
-__all__ = ['current_datetime_ms_str', 'wrap_text', 'copy_parameters', 'dashed_URL']
+def getattr_without_descriptor_read(instance, key):
+ """
+ supply to inspect._getmembers (not inspect.getmembers) to avoid calling
+ __get__ on hardware attributes
+ """
+ if key in instance.__dict__:
+ return instance.__dict__[key]
+ mro = (instance.__class__,) + inspect.getmro(instance.__class__)
+ for base in mro:
+ if key in base.__dict__:
+ value = base.__dict__[key]
+ if isinstance(value, types.FunctionType):
+ method = getattr(instance, key, None)
+ if isinstance(method, types.MethodType):
+ return method
+ return value
+ # for descriptor, first try to find it in class dict or instance dict (for instance descriptors (per_instance_descriptor=True))
+ # and then getattr from the instance. For descriptors/property, it will be mostly at above two levels.
+ return getattr(instance, key, None) # we can deal with None where we use this getter, so dont raise AttributeError + + +__all__ = [ + get_IP_from_interface.__name__, + format_exception_as_json.__name__, + pep8_to_dashed_URL.__name__, + get_default_logger.__name__, + run_coro_sync.__name__, + run_callable_somehow.__name__, + get_signature.__name__ +] diff --git a/hololinked/server/webserver_utils.py b/hololinked/server/webserver_utils.py deleted file mode 100644 index 5d22ed0..0000000 --- a/hololinked/server/webserver_utils.py +++ /dev/null @@ -1,143 +0,0 @@ -import logging -import textwrap -import typing -import ifaddr -from typing import Dict, Any, List -# from tabulate import tabulate -from tornado.httputil import HTTPServerRequest - -from .constants import CALLABLE, ATTRIBUTE, EVENT, FILE, IMAGE_STREAM -from .data_classes import HTTPServerEventData, HTTPServerResourceData, FileServerData -from .zmq_message_brokers import AsyncZMQClient, SyncZMQClient - - -def update_resources(resources : Dict[str, Dict[str, Dict[str, Any]]], add : Dict[str, Dict[str, Any]]) -> None: - file_server_routes = dict( - STATIC_ROUTES = dict(), - DYNAMIC_ROUTES = dict() - ) - for http_method, existing_map in resources.items(): - if http_method == 'FILE_SERVER': - continue - for URL_path, info in add[http_method].items(): - if isinstance(info, HTTPServerEventData): - existing_map["STATIC_ROUTES"][URL_path] = info - elif isinstance(info, HTTPServerResourceData): - info.compile_path() - if info.path_regex is None: - existing_map["STATIC_ROUTES"][info.path_format] = info - else: - existing_map["DYNAMIC_ROUTES"][info.path_format] = info - elif info["what"] == ATTRIBUTE or info["what"] == CALLABLE: - data = HTTPServerResourceData(**info) - data.compile_path() - if data.path_regex is None: - existing_map["STATIC_ROUTES"][data.path_format] = data - else: - existing_map["DYNAMIC_ROUTES"][data.path_format] = data - elif info["what"] == EVENT: - existing_map["STATIC_ROUTES"][URL_path] = 
HTTPServerEventData(**info) - elif info["what"] == IMAGE_STREAM: - existing_map["STATIC_ROUTES"][URL_path] = HTTPServerEventData(**info) - elif info["what"] == FILE: - data = FileServerData(**info) - data.compile_path() - if data.path_regex is None: - file_server_routes["STATIC_ROUTES"][data.path_format] = data - else: - file_server_routes["DYNAMIC_ROUTES"][data.path_format] = data - resources["FILE_SERVER"]["STATIC_ROUTES"].update(file_server_routes["STATIC_ROUTES"]) - resources["FILE_SERVER"]["STATIC_ROUTES"].update(file_server_routes["DYNAMIC_ROUTES"]) - - - -async def update_resources_using_client(resources : Dict[str, Dict[str, Any]], remote_object_info : List, - client : typing.Union[AsyncZMQClient, SyncZMQClient]) -> None: - from .remote_object import RemoteObjectDB - _, _, _, _, _, reply = await client.async_execute('/resources/http', raise_client_side_exception = True) - update_resources(resources, reply["returnValue"]) # type: ignore - _, _, _, _, _, reply = await client.read_attribute('/object-info', raise_client_side_exception = True) - remote_object_info.append(RemoteObjectDB.RemoteObjectInfo(**reply["returnValue"])) - - -def log_request(request : HTTPServerRequest, logger : typing.Optional[logging.Logger] = None) -> None: - if logger and logger.level == logging.DEBUG: # - # check log level manually before cooking this long string - text = """ - REQUEST: - path : {}, - host : {}, - host-name : {}, - remote ip : {}, - method : {}, - body type : {}, - body : {}, - arguments : {}, - query arguments : {}, - body arguments : {}, - header : {}" - """.format(request.path, request.host, request.host_name, - request.remote_ip, request.method, type(request.body), request.body, - request.arguments, request.query_arguments, request.body_arguments, - request.headers - ) - logger.debug(textwrap.dedent(text).lstrip()) - else: - text = """ - REQUEST: - path : {}, - host : {}, - host-name : {}, - remote ip : {}, - method : {}, - body type : {}, - body : {}, - arguments 
: {}, - query arguments : {}, - body arguments : {}, - header : {}" - """.format(request.path, request.host, request.host_name, - request.remote_ip, request.method, type(request.body), request.body, - request.arguments, request.query_arguments, request.body_arguments, - request.headers - ) - print(textwrap.dedent(text).lstrip()) - - -resources_table_headers = ["URL", "method"] -def log_resources(logger : logging.Logger, resources : Dict[str, Dict[str, Any]] ) -> None: - if logger.level == logging.DEBUG: - # check log level manually before cooking this long string - text = """ - GET resources : - {} - POST resources : - {} - PUT resources : - {} - """.format( - tabulate([[key] + [values[1]] for key, values in resources["GET"].items()] , headers=resources_table_headers, tablefmt="presto"), - tabulate([[key] + [values[1]] for key, values in resources["POST"].items()], headers=resources_table_headers, tablefmt="presto"), - tabulate([[key] + [values[1]] for key, values in resources["PUT"].items()] , headers=resources_table_headers, tablefmt="presto") - ) - logger.debug(textwrap.dedent(text).lstrip()) - - -def get_IP_from_interface(interface_name : str = 'Ethernet', adapter_name = None): - adapters = ifaddr.get_adapters(include_unconfigured=True) - for adapter in adapters: - if not adapter_name: - for ip in adapter.ips: - if interface_name == ip.nice_name: - if ip.is_IPv4: - return ip.ip - elif adapter_name == adapter.nice_name: - for ip in adapter.ips: - if interface_name == ip.nice_name: - if ip.is_IPv4: - return ip.ip - raise ValueError("interface name {} not found in system interfaces.".format(interface_name)) - raise ValueError("interface name {} not found in system interfaces.".format(interface_name)) - - -__all__ = ['log_request', 'log_resources'] \ No newline at end of file diff --git a/hololinked/server/zmq_message_brokers.py b/hololinked/server/zmq_message_brokers.py index 3ca1a8f..f695b8e 100644 --- a/hololinked/server/zmq_message_brokers.py +++ 
b/hololinked/server/zmq_message_brokers.py
@@ -1,329 +1,709 @@
+import builtins
import os
+import time
import zmq
import zmq.asyncio
import asyncio
import logging
import typing
+# import jsonschema
+from uuid import uuid4
+from collections import deque
from enum import Enum
-from typing import Union, List, Any, Dict, Sequence, Iterator, Set
+from zmq.utils.monitor import parse_monitor_message
-
-from .utils import current_datetime_ms_str, create_default_logger, run_coro_somehow, run_coro_sync, wrap_text
+from .utils import *
from .config import global_config
-from .serializers import (JSONSerializer, PickleSerializer, BaseSerializer, SerpentSerializer, # DillSerializer,
- serializers)
-from ..param.parameterized import Parameterized
+from .constants import JSON, ZMQ_PROTOCOLS, CommonRPC, ServerTypes, ZMQSocketType, ZMQ_EVENT_MAP
+from .serializers import BaseSerializer, JSONSerializer, _get_serializer_from_user_given_options
+# message types
HANDSHAKE = b'HANDSHAKE'
+INVALID_MESSAGE = b'INVALID_MESSAGE'
+TIMEOUT = b'TIMEOUT'
INSTRUCTION = b'INSTRUCTION'
REPLY = b'REPLY'
-INVALID_MESSAGE = b'INVALID_MESSAGE'
+EXCEPTION = b'EXCEPTION'
+INTERRUPT = b'INTERRUPT'
+ONEWAY = b'ONEWAY'
+SERVER_DISCONNECTED = 'EVENT_DISCONNECTED'
+
EVENT = b'EVENT'
EVENT_SUBSCRIPTION = b'EVENT_SUBSCRIPTION'
SUCCESS = b'SUCCESS'
-FAILURE = b'FAILURE'
+
+# empty data
EMPTY_BYTE = b''
EMPTY_DICT = {}
+# client types
HTTP_SERVER = b'HTTP_SERVER'
-PROXY = b'PROXY'
-
-
-
-class ServerTypes(Enum):
- UNKNOWN_TYPE = b'UNKNOWN_TYPE'
- EVENTLOOP = b'EVENTLOOP'
- USER_REMOTE_OBJECT = b'USER_REMOTE_OBJECT'
- POOL = b'POOL'
-
+PROXY = b'PROXY'
+TUNNELER = b'TUNNELER' # message passer from inproc client to inproc server within RPC
+
+
+"""
+Message indices
+
+client's message to server: |br|
+[address, bytes(), client type, message type, message id, timeout, instruction, arguments, execution_context] |br|
+[ 0 , 1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 ] |br|
+
+server's message to client: |br|
+[address, bytes(), server_type, message_type, message
id, data]|br|
+[ 0 , 1 , 2 , 3 , 4 , 5 ]|br|
+"""
+# CM = Client Message
+CM_INDEX_ADDRESS = 0
+CM_INDEX_CLIENT_TYPE = 2
+CM_INDEX_MESSAGE_TYPE = 3
+CM_INDEX_MESSAGE_ID = 4
+CM_INDEX_TIMEOUT = 5
+CM_INDEX_INSTRUCTION = 6
+CM_INDEX_ARGUMENTS = 7
+CM_INDEX_EXECUTION_CONTEXT = 8
+
+# SM = Server Message
+SM_INDEX_ADDRESS = 0
+SM_INDEX_SERVER_TYPE = 2
+SM_INDEX_MESSAGE_TYPE = 3
+SM_INDEX_MESSAGE_ID = 4
+SM_INDEX_DATA = 5
+
+# Server types - currently useless metadata
+
+byte_types = (bytes, bytearray, memoryview)
+
+
+# Function to get the socket type name from the enum
+def get_socket_type_name(socket_type):
+ try:
+ return ZMQSocketType(socket_type).name
+ except ValueError:
+ return "UNKNOWN"
+
class BaseZMQ:
-
+ """
+ Base class for all ZMQ message brokers. Implements socket creation, logger and serializer instantiation,
+ which is common to all server and client implementations. For HTTP clients, json_serializer is necessary and
+ for RPC clients, any of the allowed serializers is possible.
+
+ Parameters
+ ----------
+ instance_name: str
+ instance name of the serving ``Thing``
+ server_type: Enum
+ metadata about the nature of the server
+ json_serializer: hololinked.server.serializers.JSONSerializer
+ serializer used to send messages to HTTP Server
+ rpc_serializer: any of hololinked.server.serializers.serializer, default serpent
+ serializer used to send messages to RPC clients
+ logger: logging.Logger, Optional
+ logger, one will be created automatically while creating a socket if None is supplied
+ """
+ # init of this class must always take empty arguments due to inheritance structure
def __init__(self) -> None:
- # only type definition for logger
+ self.instance_name : str
self.logger : logging.Logger
def exit(self) -> None:
- raise NotImplementedError("implement exit() to gracefully exit ZMQ in {}.".format(self.__class__))
+ """
+ Cleanup method to terminate ZMQ sockets and contexts before quitting. Called by `__del__()`
+ automatically.
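The client-message frame layout documented in the module docstring above can be exercised with a plain list (all values below are illustrative and not produced by the library):

```python
# index constants as documented in the message-indices docstring
CM_INDEX_ADDRESS, CM_INDEX_CLIENT_TYPE, CM_INDEX_MESSAGE_TYPE = 0, 2, 3
CM_INDEX_MESSAGE_ID, CM_INDEX_TIMEOUT, CM_INDEX_INSTRUCTION = 4, 5, 6
CM_INDEX_ARGUMENTS, CM_INDEX_EXECUTION_CONTEXT = 7, 8

# a mock client request frame - nine frames, with frame 1 as the empty delimiter
request = [
    b'client-1',            # 0: address
    b'',                    # 1: empty delimiter frame
    b'HTTP_SERVER',         # 2: client type
    b'INSTRUCTION',         # 3: message type
    b'a-unique-uuid',       # 4: message id
    b'5000',                # 5: timeout
    b'/thing/some-action',  # 6: instruction
    b'{}',                  # 7: arguments
    b'{}',                  # 8: execution context
]

assert len(request) == 9
assert request[CM_INDEX_INSTRUCTION] == b'/thing/some-action'
assert request[CM_INDEX_MESSAGE_TYPE] == b'INSTRUCTION'
```

Indexing frames by named constants rather than literals is what lets the server and client code stay in sync with the documented layout.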
Each subclass server/client should implement their version of exiting if necessary. + """ + if self.logger is None: + self.logger = get_default_logger('{}|{}'.format(self.__class__.__name__, self.instance_name)) def __del__(self) -> None: self.exit() - def create_socket(self, context : Union[zmq.asyncio.Context, zmq.Context], instance_name : str, identity : str, - bind : bool = False, protocol : str = "IPC", socket_address : Union[str, None] = None) -> None: + def create_socket(self, *, identity : str, bind : bool, context : typing.Union[zmq.asyncio.Context, zmq.Context], + protocol : ZMQ_PROTOCOLS = ZMQ_PROTOCOLS.IPC, + socket_type : zmq.SocketType = zmq.ROUTER, **kwargs) -> None: + """ + Create a socket with certain specifications. When successful, a logger is also created. Supported ZeroMQ protocols + are TCP, IPC & INPROC. For IPC sockets, a file is created under TEMP_DIR of global configuration. + + Parameters + ---------- + identity: str + especially useful for clients to have a different ID other than the ``instance_name`` of the ``Thing``. + For servers, supplying the ``instance_name`` is sufficient. + bind: bool + whether to bind (server) or connect (client) + context: zmq.Context or zmq.asyncio.Context + ZeroMQ Context object that creates the socket + protocol: Enum + TCP, IPC or INPROC. Message crafting/passing/routing is protocol invariant as suggested by ZeroMQ docs. + socket_type: zmq.SocketType, default zmq.ROUTER + Usually a ROUTER socket is implemented for both client-server and peer-to-peer communication + **kwargs: dict + socket_address: str + applicable only for TCP socket to find the correct socket to connect. 
+ log_level: int + logging.Logger log level + + Returns + ------- + None + + Raises + ------ + NotImplementedError + if protocol other than TCP, IPC or INPROC is used + RuntimeError + if protocol is TCP, a socket connect from client side is requested but a socket address is not supplied + """ self.context = context self.identity = identity - self.socket = self.context.socket(zmq.ROUTER) + self.socket = self.context.socket(socket_type) self.socket.setsockopt_string(zmq.IDENTITY, identity) - if protocol == "IPC": - split_instance_name = instance_name.split('/') - socket_dir = '\\' + '\\'.join(split_instance_name[:-1]) if len(split_instance_name) > 1 else '' - directory = global_config.APPDATA_DIR + socket_dir - if not os.path.exists(directory): - os.makedirs(directory) - # re-compute for IPC - socket_address = "ipc://{}\\{}.ipc".format(directory, split_instance_name[-1]) + socket_address = kwargs.get('socket_address', None) + if protocol == ZMQ_PROTOCOLS.IPC or protocol == "IPC": + if socket_address is None: + split_instance_name = self.instance_name.split('/') + socket_dir = os.sep + os.sep.join(split_instance_name[:-1]) if len(split_instance_name) > 1 else '' + directory = global_config.TEMP_DIR + socket_dir + if not os.path.exists(directory): + os.makedirs(directory) + # re-compute for IPC because it looks for a file in a directory + socket_address = "ipc://{}{}{}.ipc".format(directory, os.sep, split_instance_name[-1]) if bind: self.socket.bind(socket_address) else: self.socket.connect(socket_address) - elif protocol == "TCP": + elif protocol == ZMQ_PROTOCOLS.TCP or protocol == "TCP": if bind: - for i in range(1000, 65535): - socket_address = "tcp://*:{}".format(i) - try: - self.socket.bind(socket_address) - break - except: - pass - elif socket_address: + if not socket_address: + for i in range(global_config.TCP_SOCKET_SEARCH_START_PORT, global_config.TCP_SOCKET_SEARCH_END_PORT): + socket_address = "tcp://0.0.0.0:{}".format(i) + try: + 
self.socket.bind(socket_address) + break + except zmq.error.ZMQError as ex: + if not ex.strerror.startswith('Address in use'): + raise ex from None + else: + self.socket.bind(socket_address) + elif socket_address: + self.socket.connect(socket_address) + else: + raise RuntimeError(f"Socket address not supplied for TCP connection to identity - {identity}") + elif protocol == ZMQ_PROTOCOLS.INPROC or protocol == "INPROC": + # inproc_instance_name = instance_name.replace('/', '_').replace('-', '_') + if socket_address is None: + socket_address = f'inproc://{self.instance_name}' + if bind: + self.socket.bind(socket_address) + else: self.socket.connect(socket_address) else: - raise NotImplementedError("protocols other than IPC & TCP are not implemented now for {}".format(self.__class__)) - self.logger = self.get_logger(self.identity, socket_address, class_ = self.__class__.__name__) # type: ignore - self.logger.info("created socket with address {} and {}".format(socket_address, "bound" if bind else "connected")) - - @classmethod - def get_logger(cls, identity : str, socket_address : str, level = logging.DEBUG, class_ = 'BaseZMQ') -> logging.Logger: - if socket_address.endswith('ipc'): - socket_address = socket_address.split('\\')[-1] - socket_address.strip('.ipc') - class_ = class_.split('.')[-1] - name = '{}|{}|{}'.format(class_, identity, socket_address) - return create_default_logger(name, level) - + raise NotImplementedError("protocols other than IPC, TCP & INPROC are not implemented now for {}".format( + self.__class__) + + f" Given protocol {protocol}.") + self.socket_address = socket_address + if not self.logger: + self.logger = get_default_logger('{}|{}|{}|{}'.format(self.__class__.__name__, + socket_type, protocol, identity), kwargs.get('log_level', logging.INFO)) + self.logger.info("created socket {} with address {} and {}".format(get_socket_type_name(socket_type), socket_address, + "bound" if bind else "connected")) + class BaseAsyncZMQ(BaseZMQ): """ - Creates, 
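The TCP port-scanning pattern used above when binding without an explicit socket address (try each port in a range, skip "Address in use", re-raise anything else) can be sketched independently of pyzmq; the `try_bind` callable and port range here are stand-ins for `socket.bind` and the global configuration values:

```python
class AddressInUse(Exception):
    """Stand-in for zmq.error.ZMQError whose strerror starts with 'Address in use'."""

def find_free_port(try_bind, start=60000, end=60010):
    # scan the range and keep the first address that binds successfully
    for port in range(start, end):
        address = "tcp://0.0.0.0:{}".format(port)
        try:
            try_bind(address)
            return address
        except AddressInUse:
            continue  # this port is occupied - try the next one
    raise RuntimeError("no free port found in range")

# simulate the first two ports being occupied
occupied = {"tcp://0.0.0.0:60000", "tcp://0.0.0.0:60001"}

def fake_bind(address):
    if address in occupied:
        raise AddressInUse(address)

chosen = find_free_port(fake_bind)
```

The real code additionally re-raises ZMQ errors that are not "Address in use", so genuine failures (bad interface, permissions) are not silently swallowed as they were by the old bare `except:`.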
binds/connects am async router socket with either TCP or ICP protocol. A context is create if none is supplied. - For IPC sockets, a file is created under TEMP_DIR of global configuration. + Base class for all async ZMQ servers and clients. """ + # init of this class must always take empty arguments due to inheritance structure - def create_socket(self, context : Union[zmq.asyncio.Context, None], instance_name : str, identity : str, - bind : bool = False, protocol : str = "IPC", socket_address : Union[str, None] = None) -> None: + def create_socket(self, *, identity : str, bind : bool = False, context : typing.Optional[zmq.asyncio.Context] = None, + protocol : str = "IPC", socket_type : zmq.SocketType = zmq.ROUTER, **kwargs) -> None: + """ + Overloads ``create_socket()`` to create, bind/connect an async socket. An async context is created if none is supplied. + """ + if context and not isinstance(context, zmq.asyncio.Context): + raise TypeError("async ZMQ message broker accepts only async ZMQ context. supplied type {}".format(type(context))) context = context or zmq.asyncio.Context() - super().create_socket(context, instance_name, identity, bind, protocol, socket_address) + super().create_socket(identity=identity, bind=bind, context=context, protocol=protocol, + socket_type=socket_type, **kwargs) class BaseSyncZMQ(BaseZMQ): + """ + Base class for all sync ZMQ servers and clients. + """ + # init of this class must always take empty arguments due to inheritance structure - def create_socket(self, context : Union[zmq.Context, None], instance_name : str, identity : str, - bind : bool = False, protocol : str = "IPC", socket_address : Union[str, None] = None) -> None: + def create_socket(self, *, identity : str, bind : bool = False, context : typing.Optional[zmq.Context] = None, + protocol : str = "IPC", socket_type : zmq.SocketType = zmq.ROUTER, **kwargs) -> None: + """ + Overloads ``create_socket()`` to create, bind/connect a synchronous socket.
A (synchronous) context is created + if none is supplied. + """ + if context and not isinstance(context, zmq.Context): + raise TypeError("sync ZMQ message broker accepts only sync ZMQ context. supplied type {}".format(type(context))) context = context or zmq.Context() - super().create_socket(context, instance_name, identity, bind, protocol, socket_address) + super().create_socket(identity=identity, bind=bind, context=context, protocol=protocol, + socket_type=socket_type, **kwargs) + class BaseZMQServer(BaseZMQ): """ - This class implements serializer instantiation and message handling for ZMQ servers and can be subclassed by all server instances - irrespective of sync or async. The messaging contract does not depend on sync or async implementation. - For HTTP clients, json_serializer is necessary and for other types of clients, any of the allowed serializer is possible. - """ - - def __init__(self, server_type : Enum, json_serializer : Union[None, JSONSerializer] = None, - proxy_serializer : Union[str, BaseSerializer, None] = None) -> None: - if json_serializer is None or isinstance(json_serializer, JSONSerializer): - self.json_serializer = json_serializer or JSONSerializer() - else: - raise ValueError("invalid JSON serializer option for {}. Given option : {}".format(self.__class__, json_serializer)) - if isinstance(proxy_serializer, (PickleSerializer, SerpentSerializer, JSONSerializer)): # , DillSerializer)): - self.proxy_serializer = proxy_serializer - elif isinstance(proxy_serializer, str) or proxy_serializer is None: - self.proxy_serializer = serializers.get(proxy_serializer, SerpentSerializer)() - else: - raise ValueError("invalid proxy serializer option for {}. 
Given option : {}".format(self.__class__, proxy_serializer)) - self.server_type : Enum = server_type + Implements messaging contract as suggested by ZMQ; this is defined + as follows: + + client's message to server: + :: + [address, bytes(), client type, message type, message id, + [ 0 , 1 , 2 , 3 , 4 , + + timeout, instruction, arguments, execution context] + 5 , 6 , 7 , 8 ] + + server's message to client: + :: + [address, bytes(), server_type, message_type, message id, data, encoded_data] + [ 0 , 1 , 2 , 3 , 4 , 5 , 6 ] + + The messaging contract does not depend on sync or async implementation. + """ + def __init__(self, + instance_name : str, + server_type : typing.Union[bytes, str], + json_serializer : typing.Union[None, JSONSerializer] = None, + rpc_serializer : typing.Union[str, BaseSerializer, None] = None, + logger : typing.Optional[logging.Logger] = None, + **kwargs + ) -> None: super().__init__() + self.rpc_serializer, self.json_serializer = _get_serializer_from_user_given_options( + rpc_serializer=rpc_serializer, + json_serializer=json_serializer + ) + self.instance_name = instance_name + self.server_type = server_type if isinstance(server_type, bytes) else bytes(server_type, encoding='utf-8') + self.logger = logger - def parse_client_message(self, message : List[bytes]) -> Any: + + def parse_client_message(self, message : typing.List[bytes]) -> typing.List[typing.Union[bytes, typing.Any]]: """ - client's message to server looks as follows: - [address, bytes(), client type, message type, msg id, instruction, arguments, execution_context] - [ 0 , 1 , 2 , 3 , 4 , 5 , 6 , 7 ] + deserializes important parts of the client's message, namely instruction, arguments, execution context + based on the client type. For handshake messages, automatically handles handshake. In case of exceptions while + deserializing, automatically sends an invalid message to client informing the nature of exception with the + exception metadata.
+ + client's message to server: + :: + [address, bytes(), client type, message type, message id, + [ 0 , 1 , 2 , 3 , 4 , + + timeout, instruction, arguments, execution context] + 5 , 6 , 7 , 8 ] + + Execution Context Definitions (typing.Dict[str, typing.Any] or JSON): + - "oneway" - does not reply to client after executing the instruction + - "fetch_execution_logs" - fetches logs that were accumulated during execution + + Parameters + ---------- + message: List[bytes] + message received from client + + Returns + ------- + message: List[bytes | Any] + message with instruction, arguments and execution context deserialized + """ try: - message_type = message[3] + message_type = message[CM_INDEX_MESSAGE_TYPE] if message_type == INSTRUCTION: - client_type = message[2] + client_type = message[CM_INDEX_CLIENT_TYPE] if client_type == PROXY: - message[5] = self.proxy_serializer.loads(message[5]) # type: ignore - message[6] = self.proxy_serializer.loads(message[6]) # type: ignore - message[7] = self.proxy_serializer.loads(message[6]) # type: ignore + message[CM_INDEX_INSTRUCTION] = self.rpc_serializer.loads(message[CM_INDEX_INSTRUCTION]) # type: ignore + message[CM_INDEX_ARGUMENTS] = self.rpc_serializer.loads(message[CM_INDEX_ARGUMENTS]) # type: ignore + message[CM_INDEX_EXECUTION_CONTEXT] = self.rpc_serializer.loads(message[CM_INDEX_EXECUTION_CONTEXT]) # type: ignore elif client_type == HTTP_SERVER: - message[5] = self.json_serializer.loads(message[5]) # type: ignore - message[6] = self.json_serializer.loads(message[6]) # type: ignore - message[7] = self.json_serializer.loads(message[7]) # type: ignore + message[CM_INDEX_INSTRUCTION] = self.json_serializer.loads(message[CM_INDEX_INSTRUCTION]) # type: ignore + message[CM_INDEX_ARGUMENTS] = self.json_serializer.loads(message[CM_INDEX_ARGUMENTS]) # type: ignore + message[CM_INDEX_EXECUTION_CONTEXT] = self.json_serializer.loads(message[CM_INDEX_EXECUTION_CONTEXT]) # type: ignore return message elif message_type == HANDSHAKE: -
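A rough illustration of what ``parse_client_message()`` does for an HTTP client — deserializing only the instruction, arguments and execution-context frames in place. This sketch assumes JSON throughout and omits the client-type dispatch and error path of the real method:

```python
import json

# frame indices mirroring the CM_INDEX_* constants of this module
CM_INDEX_INSTRUCTION, CM_INDEX_ARGUMENTS, CM_INDEX_EXECUTION_CONTEXT = 6, 7, 8

def parse_http_client_message(message):
    # deserialize, in place, only the frames that carry JSON payloads;
    # address/delimiter/type/id frames stay as raw bytes for routing
    for index in (CM_INDEX_INSTRUCTION, CM_INDEX_ARGUMENTS, CM_INDEX_EXECUTION_CONTEXT):
        message[index] = json.loads(message[index])
    return message

msg = [b'addr', b'', b'HTTP_SERVER', b'INSTRUCTION', b'msg-1', b'500',
       b'"/thing/prop"', b'{"value": 5}', b'{"oneway": false}']
parsed = parse_http_client_message(msg)
```

Mutating the list in place (rather than building a new one) matches the style of the diff above, where the deserialized values overwrite the raw byte frames.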
self.handshake(message[0]) - except Exception as E: - self.handle_invalid_message(message, E) - - def craft_reply_from_arguments(self, address : bytes, message_type : bytes, message_id : bytes = b'', - data : Any = None) -> List[bytes]: + self.handshake(message) + except Exception as ex: + self.handle_invalid_message(message, ex) + + + def craft_reply_from_arguments(self, address : bytes, client_type: bytes, message_type : bytes, + message_id : bytes = b'', data : typing.Any = None, + pre_encoded_data : typing.Optional[bytes] = EMPTY_BYTE) -> typing.List[bytes]: + """ + call this method to craft an arbitrary reply or message to the client using the method arguments. + + server's message to client: + :: + [address, bytes(), server_type, message_type, message id, data] + [ 0 , 1 , 2 , 3 , 4 , 5 ] + + Parameters + ---------- + address: bytes + the ROUTER address of the client + message_type: bytes + type of the message, possible values are b'REPLY', b'HANDSHAKE' and b'TIMEOUT' + message_id: bytes + message id of the original client message for which the reply is being crafted + data: Any + serializable data + + Returns + ------- + message: List[bytes] + the crafted reply with information in the correct positions within the list + """ + if client_type == HTTP_SERVER: + data = self.json_serializer.dumps(data) + elif client_type == PROXY: + data = self.rpc_serializer.dumps(data) + return [ address, - bytes(), - self.server_type.value, + EMPTY_BYTE, + self.server_type, message_type, message_id, - self.json_serializer.dumps(data) + data, + pre_encoded_data ] - def craft_reply_from_client_message(self, original_client_message : List[bytes], data : Any = None, **overrides) -> List[bytes]: + + def craft_reply_from_client_message(self, original_client_message : typing.List[bytes], data : typing.Any = None, + pre_encoded_data : bytes = EMPTY_BYTE) -> typing.List[bytes]: """ - client's message to server looks as follows: - [address, bytes(), client type, message type, msg id, 
instruction, arguments, execution_context] - [ 0 , 1 , 2 , 3 , 4 , 5 , 6 , 7 ] + craft a reply with certain data automatically from an originating client message. The client's address, the client type required + for serialization, message id etc. are taken automatically from the original message. + + server's message to client: + :: + [address, bytes(), server_type, message_type, message id, data] + [ 0 , 1 , 2 , 3 , 4 , 5 ] + + Parameters + ---------- + original_client_message: List[bytes] + The message originated by the client for which the reply is being crafted + data: Any + serializable data + + Returns + ------- + message: List[bytes] + the crafted reply with information in the correct positions within the list """ - client_type = original_client_message[2] + client_type = original_client_message[CM_INDEX_CLIENT_TYPE] if client_type == HTTP_SERVER: data = self.json_serializer.dumps(data) elif client_type == PROXY: - data = self.proxy_serializer.dumps(data) + data = self.rpc_serializer.dumps(data) else: - raise ValueError("invalid client type given '{}' for preparing message to send from '{}' of type {}".format( - client_type, self.identity, self.__class__)) - reply = [ - original_client_message[0], - bytes(), - self.server_type.value, - REPLY, - original_client_message[4], - data - ] - if len(overrides) == 0: - return reply - for key, value in overrides.items(): - if key == 'message_type': - reply[3] = value - return reply - - def handshake(self, address : bytes) -> None: - run_coro_somehow(self._handshake(address)) - - def handle_invalid_message(self, message : List[bytes], exception : Exception) -> None: - run_coro_somehow(self._handle_invalid_message(message, exception)) - - async def _handshake(self, address : bytes) -> None: - raise NotImplementedError("handshake cannot be completed - implement _handshake in {} to complete handshake.".format(self.__class__)) + raise ValueError(f"invalid client type given '{client_type}' for preparing message to send
from " + + f"'{self.identity}' of type {self.__class__}.") + return [ + original_client_message[CM_INDEX_ADDRESS], + EMPTY_BYTE, + self.server_type, + REPLY, + original_client_message[CM_INDEX_MESSAGE_ID], + data, + pre_encoded_data + ] - async def _handle_invalid_message(self, message : List[bytes], exception : Exception) -> None: - raise NotImplementedError( - wrap_text("invalid message cannot be handled - implement _handle_invalid_message in {} to handle invalid messages.".format( - self.__class__))) + def handshake(self, original_client_message : typing.List[bytes]) -> None: + """ + pass a handshake message to client. Absolutely mandatory to ensure initial messages do not get lost + because of ZMQ's very tiny but significant initial delay after creating socket. + + Parameters + ---------- + original_client_message: List[bytes] + the client message for which the handshake reply is being sent - -class AsyncZMQServer(BaseZMQServer, BaseAsyncZMQ): + Returns + ------- + None + """ + run_callable_somehow(self._handshake(original_client_message)) + + def _handshake(self, original_client_message : typing.List[bytes]) -> None: + raise NotImplementedError(f"handshake cannot be handled - implement _handshake in {self.__class__} to handshake.") + + + def handle_invalid_message(self, original_client_message : typing.List[bytes], exception : Exception) -> None: + """ + pass an invalid message to the client when an exception occurred while parsing the message from the client + (``parse_client_message()``) + + Parameters + ---------- + original_client_message: List[bytes] + the client message parsing which the exception occurred + exception: Exception + exception object raised + + Returns + ------- + None + """ + run_callable_somehow(self._handle_invalid_message(original_client_message, exception)) + + def _handle_invalid_message(self, message : typing.List[bytes], exception : Exception) -> None: + raise NotImplementedError("invalid message cannot be handled" + + f" - implement _handle_invalid_message in
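The reply-crafting logic can be sketched as follows. JSON serialization is assumed, and the byte constants here are simplified stand-ins for the module's `EMPTY_BYTE`/`REPLY` values; the key point is that the client's address and message id are mirrored so the ROUTER socket routes the reply back to the right client:

```python
import json

CM_INDEX_ADDRESS, CM_INDEX_MESSAGE_ID = 0, 4
EMPTY_BYTE = b''
REPLY = b'REPLY'

def craft_reply(original_client_message, server_type, data):
    # copy address and message id from the original message so the
    # ROUTER socket delivers the reply to the originating client
    return [
        original_client_message[CM_INDEX_ADDRESS],
        EMPTY_BYTE,                                 # delimiter frame
        server_type,
        REPLY,
        original_client_message[CM_INDEX_MESSAGE_ID],
        json.dumps(data).encode('utf-8'),           # serialized payload
        EMPTY_BYTE,                                 # pre-encoded data, unused here
    ]

reply = craft_reply([b'client-1', b'', b'HTTP_SERVER', b'INSTRUCTION', b'msg-7'],
                    b'THING', {"status": "ok"})
```

The extra trailing frame corresponds to the `pre_encoded_data` slot the diff adds for payloads that are already serialized.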
{self.__class__} to handle invalid messages.") + + + def handle_timeout(self, original_client_message : typing.List[bytes]) -> None: + """ + pass timeout message to the client when the instruction could not be executed within specified timeout + + Parameters + ---------- + original_client_message: List[bytes] + the client message which could not be executed within the specified timeout. timeout value is + generally specified within the execution context values. + + Returns + ------- + None + """ + run_callable_somehow(self._handle_timeout(original_client_message)) + + def _handle_timeout(self, original_client_message : typing.List[bytes]) -> None: + raise NotImplementedError("timeouts cannot be handled " + + f"- implement _handle_timeout in {self.__class__} to handle timeout.") + + + +class BaseAsyncZMQServer(BaseZMQServer): + """ + Common to all async ZMQ servers """ - ZMQ Server to be used by remote objects, this server will handle handshakes, event subscription notifications, - & instructions. + + async def _handshake(self, original_client_message : typing.List[bytes]) -> None: + """ + Inner method that handles handshake. scheduled by ``handshake()`` method, signature same as ``handshake()``. + """ + await self.socket.send_multipart(self.craft_reply_from_arguments(original_client_message[CM_INDEX_ADDRESS], + original_client_message[CM_INDEX_CLIENT_TYPE], HANDSHAKE, original_client_message[CM_INDEX_MESSAGE_ID], + EMPTY_BYTE)) + self.logger.info(f"sent handshake to client '{original_client_message[CM_INDEX_ADDRESS]}'") + + + async def _handle_timeout(self, original_client_message : typing.List[bytes]) -> None: + """ + Inner method that handles timeout. scheduled by ``handle_timeout()``, signature same as ``handle_timeout``.
+ """ + await self.socket.send_multipart(self.craft_reply_from_arguments(original_client_message[CM_INDEX_ADDRESS], + original_client_message[CM_INDEX_CLIENT_TYPE], TIMEOUT, original_client_message[CM_INDEX_MESSAGE_ID])) + self.logger.info(f"sent timeout to client '{original_client_message[CM_INDEX_ADDRESS]}'") - message from client to server : + + async def _handle_invalid_message(self, original_client_message : typing.List[bytes], exception : Exception) -> None: + """ + Inner method that handles invalid messages. scheduled by ``handle_invalid_message()``, + signature same as ``handle_invalid_message()``. + """ + await self.socket.send_multipart(self.craft_reply_from_arguments(original_client_message[CM_INDEX_ADDRESS], + original_client_message[CM_INDEX_CLIENT_TYPE], INVALID_MESSAGE, + original_client_message[CM_INDEX_MESSAGE_ID], exception)) + self.logger.info(f"sent exception message to client '{original_client_message[CM_INDEX_ADDRESS]}'." + + f" exception - {str(exception)}") - [address, bytes(), client type, message type, msg id, instruction, arguments] - [ 0 , 1 , 2 , 3 , 4 , 5 , 6 ] - Handshake - [client address, bytes, client_type, HANDSHAKE] + +class AsyncZMQServer(BaseAsyncZMQServer, BaseAsyncZMQ): + """ + Implements blocking (non-polled) but async receipt of instructions and sending of replies.
""" - def __init__(self, instance_name : str, server_type : Enum, context : Union[zmq.asyncio.Context, None] = None, - **kwargs) -> None: - BaseZMQServer.__init__(self, server_type, json_serializer = kwargs.get('json_serializer'), - proxy_serializer = kwargs.get('proxy_serializer', None)) + def __init__(self, *, instance_name : str, server_type : Enum, context : typing.Union[zmq.asyncio.Context, None] = None, + protocol : ZMQ_PROTOCOLS = ZMQ_PROTOCOLS.IPC, socket_type : zmq.SocketType = zmq.ROUTER, **kwargs) -> None: + BaseAsyncZMQServer.__init__(self, instance_name=instance_name, server_type=server_type, **kwargs) BaseAsyncZMQ.__init__(self) - self.instance_name = instance_name - self.create_socket(context, instance_name, instance_name, bind = True) + self.create_socket(identity=instance_name, bind=True, context=context, protocol=protocol, + socket_type=socket_type, **kwargs) self._terminate_context = context == None # terminate if it was created by instance - async def _handshake(self, address : bytes) -> None: - await self.socket.send_multipart(self.craft_reply_from_arguments(address, HANDSHAKE)) - self.logger.info("sent handshake to client '{}'".format(address)) - - async def _handle_invalid_message(self, original_client_message : List[bytes], exception : Exception) -> None: - await self.socket.send_multipart(self.craft_reply_from_client_message(original_client_message, exception, - message_type = INVALID_MESSAGE)) - self.logger.info("sent exception message to client '{}' : '{}'".format(original_client_message[0], str(exception))) - - async def async_recv_instruction(self) -> Any: + + async def async_recv_instruction(self) -> typing.Any: + """ + Receive one instruction in a blocking form. Async for multi-server paradigm, each server should schedule + this method in the event loop explicitly. This is taken care of by the ``Eventloop`` & ``RPCServer``.
+ + Returns + ------- + instruction: List[bytes | Any] + received instruction with important content (instruction, arguments, execution context) deserialized. + """ while True: instruction = self.parse_client_message(await self.socket.recv_multipart()) if instruction: - self.logger.debug("received instruction from client '{}' with msg-ID '{}'".format(instruction[0], - instruction[4])) + self.logger.debug(f"received instruction from client '{instruction[CM_INDEX_ADDRESS]}' with msg-ID {instruction[CM_INDEX_MESSAGE_ID]}") return instruction - async def async_recv_instructions(self, strip_delimiter = False) -> List[Any]: + + async def async_recv_instructions(self) -> typing.List[typing.Any]: + """ + Receive all currently available instructions in blocking form. Async for multi-server paradigm, each server should schedule + this method in the event loop explicitly. This is taken care of by the ``Eventloop`` & ``RPCServer``. + + Returns + ------- + instructions: List[List[bytes | Any]] + list of received instructions with important content (instruction, arguments, execution context) deserialized.
+ """ instructions = [await self.async_recv_instruction()] while True: try: instruction = self.parse_client_message(await self.socket.recv_multipart(zmq.NOBLOCK)) if instruction: - self.logger.debug("received instruction from client '{}' with msg-ID '{}'".format(instruction[0], - instruction[4])) + self.logger.debug(f"received instruction from client '{instruction[CM_INDEX_ADDRESS]}' with msg-ID {instruction[CM_INDEX_MESSAGE_ID]}") instructions.append(instruction) except zmq.Again: break return instructions - async def async_send_reply(self, original_client_message : List[bytes], data : Any) -> None: - reply = self.craft_reply_from_client_message(original_client_message, data) - await self.socket.send_multipart(reply) - self.logger.debug("sent reply to client '{}' with msg-ID '{}'".format(reply[0], reply[4])) - - async def async_send_event_subscription(self, consumer : str) -> None: - await self.socket.send_multipart([consumer, bytes(), EVENT_SUBSCRIPTION]) + + async def async_send_reply(self, original_client_message : typing.List[bytes], data : typing.Any) -> None: + """ + Send reply for an instruction. + + Parameters + ---------- + original_client_message: List[bytes] + original message so that the reply can be properly crafted and routed + data: Any + serializable data to be sent as reply + + Returns + ------- + None + """ + await self.socket.send_multipart(self.craft_reply_from_client_message(original_client_message, data)) + self.logger.debug(f"sent reply to client '{original_client_message[CM_INDEX_ADDRESS]}' with msg-ID {original_client_message[CM_INDEX_MESSAGE_ID]}") + + + async def async_send_reply_with_message_type(self, original_client_message : typing.List[bytes], + message_type: bytes, data : typing.Any) -> None: + """ + Send reply for an instruction. 
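The drain pattern in ``async_recv_instructions()`` — block for the first message, then receive with NOBLOCK until the socket raises "Again" — can be sketched with a fake socket; `Again` and `FakeSocket` below are stand-ins for `zmq.Again` and a real ZMQ socket:

```python
from collections import deque

class Again(Exception):
    """Stand-in for zmq.Again, raised on a NOBLOCK receive with nothing pending."""

class FakeSocket:
    def __init__(self, pending):
        self._pending = deque(pending)

    def recv(self, noblock=False):
        if not self._pending:
            raise Again()   # a real socket would block here when noblock=False
        return self._pending.popleft()

def recv_all(socket):
    messages = [socket.recv()]   # first receive blocks until a message arrives
    while True:
        try:
            # subsequent receives are non-blocking: drain whatever is queued
            messages.append(socket.recv(noblock=True))
        except Again:
            break                # queue is empty - stop draining
    return messages

received = recv_all(FakeSocket([b'msg-1', b'msg-2', b'msg-3']))
```

This batching lets a server process every instruction already queued on the socket in one scheduling turn instead of waking the event loop once per message.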
+ + Parameters + ---------- + original_client_message: List[bytes] + original message so that the reply can be properly crafted and routed + data: Any + serializable data to be sent as reply + + Returns + ------- + None + """ + await self.socket.send_multipart(self.craft_reply_from_arguments(original_client_message[CM_INDEX_ADDRESS], + original_client_message[CM_INDEX_CLIENT_TYPE], message_type, + original_client_message[CM_INDEX_MESSAGE_ID], data)) + self.logger.debug(f"sent reply to client '{original_client_message[CM_INDEX_ADDRESS]}' with msg-ID {original_client_message[CM_INDEX_MESSAGE_ID]}") + def exit(self) -> None: + """ + closes socket and context, warns if any error occurs. + """ + super().exit() try: self.socket.close(0) - self.logger.info("terminated socket of server '{}' of type '{}'".format(self.identity, self.__class__)) - except Exception as E: - self.logger.warn("could not properly terminate socket or attempted to terminate an already terminated socket '{}'of type '{}'. Exception message : {}".format( - self.identity, self.__class__, str(E))) + self.logger.info(f"terminated socket of server '{self.identity}' of type {self.__class__}") + except Exception as ex: + self.logger.warn("could not properly terminate socket or attempted to terminate an already terminated " + + f" socket '{self.identity}' of type {self.__class__}. Exception message : {str(ex)}") try: if self._terminate_context: self.context.term() self.logger.info("terminated context of socket '{}' of type '{}'".format(self.identity, self.__class__)) - except Exception as E: - self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context '{}'. Exception message : {}".format( - self.identity, str(E))) + except Exception as ex: + self.logger.warn("could not properly terminate context or attempted to terminate an already terminated " + + f" context '{self.identity}'. 
Exception message : {str(ex)}") class AsyncPollingZMQServer(AsyncZMQServer): """ - The purpose of this server to be identical to AsyncZMQServer except that it can be stopped from server side. - This is achieved by polling the socket instead of waiting indefinitely on the socket. - + Identical to AsyncZMQServer, except that instructions are received in non-blocking/polling form. + This server can be stopped from the server side by calling ``stop_polling()`` unlike ``AsyncZMQServer`` which + cannot be stopped manually unless an instruction arrives. + + Parameters + ---------- + instance_name: str + ``instance_name`` of the Thing which the server serves + server_type: str + server type metadata - currently not useful/important + context: Optional, zmq.asyncio.Context + ZeroMQ Context object to use. All sockets share this context. Automatically created when None is supplied. + socket_type : zmq.SocketType, default zmq.ROUTER + socket type of ZMQ socket, default is ROUTER (enables address based routing of messages) + protocol: Enum, default ZMQ_PROTOCOLS.IPC + Use TCP for network access, IPC for multi-process applications, and INPROC for multi-threaded applications. + poll_timeout: int, default 25 + time in milliseconds to poll the sockets specified under ``protocols``.
Useful for calling ``stop_polling()`` + where the max delay to stop polling will be ``poll_timeout`` + + **kwargs: + json_serializer: hololinked.server.serializers.JSONSerializer + serializer used to send message to HTTP Server + rpc_serializer: any of hololinked.server.serializers.serializer, default serpent + serializer used to send message to RPC clients """ - def __init__(self, instance_name : str, server_type : Enum, context : Union[zmq.asyncio.Context, None] = None, - poll_timeout = 25, **kwargs) -> None: - super().__init__(instance_name, server_type, context, **kwargs) + def __init__(self, *, instance_name : str, server_type : Enum, context : typing.Union[zmq.asyncio.Context, None] = None, + socket_type : zmq.SocketType = zmq.ROUTER, protocol : ZMQ_PROTOCOLS = ZMQ_PROTOCOLS.IPC, + poll_timeout = 25, **kwargs) -> None: + super().__init__(instance_name=instance_name, server_type=server_type, context=context, + socket_type=socket_type, protocol=protocol, **kwargs) self.poller = zmq.asyncio.Poller() self.poller.register(self.socket, zmq.POLLIN) self.poll_timeout = poll_timeout @property def poll_timeout(self) -> int: + """ + socket polling timeout in milliseconds greater than 0. + """ return self._poll_timeout @poll_timeout.setter def poll_timeout(self, value) -> None: if not isinstance(value, int) or value < 0: - raise ValueError("polling period must be an integer greater than 0, not {}. Value is considered in milliseconds.".format(value)) + raise ValueError(f"polling period must be an integer greater than 0, not {value}. Value is considered in milliseconds.") self._poll_timeout = value - async def poll_instructions(self) -> List[List[bytes]]: + async def poll_instructions(self) -> typing.List[typing.List[bytes]]: + """ + poll for instructions with specified timeout (``poll_timeout``) and return if any instructions are available. + This method blocks, so make sure other methods are scheduled which can stop polling. 
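The validated ``poll_timeout`` attribute above follows the standard Python property pattern; a minimal stand-alone equivalent (class name hypothetical) looks like this:

```python
class PollingConfig:
    def __init__(self, poll_timeout=25):
        self.poll_timeout = poll_timeout   # assignment runs the setter below

    @property
    def poll_timeout(self):
        """socket polling timeout in milliseconds"""
        return self._poll_timeout

    @poll_timeout.setter
    def poll_timeout(self, value):
        # reject anything that is not a non-negative integer
        if not isinstance(value, int) or value < 0:
            raise ValueError(
                f"polling period must be a non-negative integer in milliseconds, not {value}")
        self._poll_timeout = value

cfg = PollingConfig()
```

Routing the constructor argument through the setter means the same validation applies at construction time and on later reassignment.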
+ + Returns + ------- + instructions: List[List[bytes]] + list of received instructions with important content (instruction, arguments, execution context) deserialized. + """ self.stop_poll = False instructions = [] while not self.stop_poll: - sockets = await self.poller.poll(self.poll_timeout) # type hints dont work in this line + sockets = await self.poller.poll(self._poll_timeout) # type hints dont work in this line for socket, _ in sockets: while True: try: @@ -332,47 +712,74 @@ async def poll_instructions(self) -> List[List[bytes]]: break else: if instruction: - self.logger.debug("received instruction from client '{}' with msg-ID '{}'".format(instruction[0], - instruction[4])) + self.logger.debug(f"received instruction from client '{instruction[CM_INDEX_ADDRESS]}' with msg-ID {instruction[CM_INDEX_MESSAGE_ID]}") instructions.append(instruction) if len(instructions) > 0: break return instructions def stop_polling(self) -> None: + """ + stop polling and unblock ``poll_instructions()`` method + """ self.stop_poll = True def exit(self) -> None: - self.poller.unregister(self.socket) + """ + unregister socket from poller and terminate socket and context. 
+ """ + try: + BaseZMQ.exit(self) + self.poller.unregister(self.socket) + except Exception as ex: + self.logger.warn(f"could not unregister socket {self.identity} from polling - {str(ex)}") return super().exit() class ZMQServerPool(BaseZMQServer): + """ + Implements pool of async ZMQ servers (& their sockets) + """ - def __init__(self, instance_names : Union[List[str], None] = None, **kwargs) -> None: + def __init__(self, *, instance_names : typing.Union[typing.List[str], None] = None, **kwargs) -> None: self.context = zmq.asyncio.Context() - self.pool : Dict[str, Union[AsyncZMQServer, AsyncPollingZMQServer]] = dict() self.poller = zmq.asyncio.Poller() + self.pool = dict() # type: typing.Dict[str, typing.Union[AsyncZMQServer, AsyncPollingZMQServer]] if instance_names: for instance_name in instance_names: - self.pool[instance_name] = AsyncZMQServer(instance_name, ServerTypes.UNKNOWN_TYPE, self.context, - **kwargs) + self.pool[instance_name] = AsyncZMQServer(instance_name=instance_name, + server_type=ServerTypes.UNKNOWN_TYPE.value, context=self.context, **kwargs) for server in self.pool.values(): self.poller.register(server.socket, zmq.POLLIN) - super().__init__(server_type = ServerTypes.POOL, json_serializer = kwargs.get('json_serializer'), - proxy_serializer = kwargs.get('proxy_serializer', None)) - - def register_server(self, server : Union[AsyncZMQServer, AsyncPollingZMQServer]) -> None: + super().__init__(instance_name="pool", server_type=ServerTypes.POOL.value, **kwargs) + self.identity = "pool" + if self.logger is None: + self.logger = get_default_logger("pool|polling", kwargs.get('log_level',logging.INFO)) + + def create_socket(self, *, identity : str, bind: bool, context : typing.Union[zmq.asyncio.Context, zmq.Context], + protocol : ZMQ_PROTOCOLS = ZMQ_PROTOCOLS.IPC, socket_type : zmq.SocketType = zmq.ROUTER, **kwargs) -> None: + raise NotImplementedError("create socket not supported by ZMQServerPool") + # we override this method to prevent socket creation. 
instance_name set to pool is simply a filler + return super().create_socket(identity=identity, bind=bind, context=context, protocol=protocol, + socket_type=socket_type, **kwargs) + + def register_server(self, server : typing.Union[AsyncZMQServer, AsyncPollingZMQServer]) -> None: + if not isinstance(server, (AsyncZMQServer, AsyncPollingZMQServer)): + raise TypeError("registration possible for servers only subclass of AsyncZMQServer or AsyncPollingZMQServer." + + f" Given type {type(server)}") self.pool[server.instance_name] = server self.poller.register(server.socket, zmq.POLLIN) - def deregister_server(self, server : Union[AsyncZMQServer, AsyncPollingZMQServer]) -> None: + def deregister_server(self, server : typing.Union[AsyncZMQServer, AsyncPollingZMQServer]) -> None: self.poller.unregister(server.socket) self.pool.pop(server.instance_name) @property def poll_timeout(self) -> int: + """ + socket polling timeout in milliseconds greater than 0. + """ return self._poll_timeout @poll_timeout.setter @@ -381,20 +788,53 @@ def poll_timeout(self, value) -> None: raise ValueError("polling period must be an integer greater than 0, not {}. Value is considered in milliseconds.".format(value)) self._poll_timeout = value - async def async_recv_instruction(self, instance_name : str) -> Any: + async def async_recv_instruction(self, instance_name : str) -> typing.List: + """ + receive instruction for instance name + + Parameters + ---------- + instance_name : str + instance name of the ``Thing`` or in this case, the ZMQ server. + """ return await self.pool[instance_name].async_recv_instruction() - async def async_recv_instructions(self, instance_name : str) -> Any: + async def async_recv_instructions(self, instance_name : str) -> typing.List[typing.List]: + """ + receive all available instructions for instance name + + Parameters + ---------- + instance_name : str + instance name of the ``Thing`` or in this case, the ZMQ server. 
+ """ return await self.pool[instance_name].async_recv_instructions() - async def async_send_reply(self, instance_name : str, original_client_message : List[bytes], data : Any) -> None: + async def async_send_reply(self, *, instance_name : str, original_client_message : typing.List[bytes], + data : typing.Any) -> None: + """ + send reply for instance name + + Parameters + ---------- + instance_name : str + instance name of the ``Thing`` or in this case, the ZMQ server. + original_client_message: List[bytes] + instruction for which reply is being given + data: Any + data to be given as reply + """ await self.pool[instance_name].async_send_reply(original_client_message, data) - async def poll(self, strip_delimiter : bool = False) -> List[Any]: + async def poll(self) -> typing.List[typing.List[typing.Any]]: + """ + Pool for instruction in the entire server pool. Map the instruction to the correct instance + using the 0th index of the instruction. + """ self.stop_poll = False instructions = [] while not self.stop_poll: - sockets = await self.poller.poll(1) # type hints dont work in this line + sockets = await self.poller.poll(self._poll_timeout) for socket, _ in sockets: while True: try: @@ -403,231 +843,685 @@ async def poll(self, strip_delimiter : bool = False) -> List[Any]: break else: if instruction: + self.logger.debug(f"received instruction from client '{instruction[CM_INDEX_ADDRESS]}' with msg-ID {instruction[CM_INDEX_MESSAGE_ID]}") instructions.append(instruction) return instructions def stop_polling(self) -> None: + """ + stop polling method ``poll()`` + """ self.stop_poll = True + + def __getitem__(self, key) -> typing.Union[AsyncZMQServer, AsyncPollingZMQServer]: + return self.pool[key] + + def __iter__(self) -> typing.Iterator[str]: + return self.pool.__iter__() + + def __contains__(self, name : str) -> bool: + return name in self.pool.keys() def exit(self) -> None: - for server in self.pool.values(): - self.poller.unregister(server.socket) - server.exit() 
+ for server in self.pool.values(): + try: + self.poller.unregister(server.socket) + server.exit() + except Exception as ex: + self.logger.warn(f"could not unregister poller and exit server {server.identity} - {str(ex)}") try: self.context.term() self.logger.info("context terminated for {}".format(self.__class__)) - except Exception as E: - self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context '{}'. Exception message : {}".format( - self.identity, str(E))) + except Exception as ex: + self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context " + + f"'{self.identity}'. Exception message : {str(ex)}") - def __getitem__(self, key) -> Union[AsyncZMQServer, AsyncPollingZMQServer]: - return self.pool[key] - def __iter__(self) -> Iterator[str]: - return self.pool.__iter__() - def __contains__(self, name : str) -> bool: - return name in self.pool.keys() +class RPCServer(BaseZMQServer): + """ + Top level ZMQ RPC server used by ``Thing`` and ``Eventloop``. + + Parameters + ---------- + instance_name: str + ``instance_name`` of the Thing which the server serves + server_type: str + server type metadata - currently not useful/important + context: Optional, zmq.asyncio.Context + ZeroMQ Context object to use. All sockets share this context. Automatically created when None is supplied. + protocols: List[str, Enum], default [ZMQ_PROTOCOLS.TCP, ZMQ_PROTOCOLS.IPC, ZMQ_PROTOCOLS.INPROC] + all ZeroMQ sockets where instructions can be passed to the RPC server. Use TCP for network access, + IPC for multi-process applications, and INPROC for multi-threaded applications. Use all for complete access. + poll_timeout: int, default 25 + time in milliseconds to poll the sockets specified under ``procotols``. 
Useful for calling ``stop_polling()`` + where the max delay to stop polling will be ``poll_timeout`` + """ + + def __init__(self, instance_name : str, *, server_type : Enum, context : typing.Union[zmq.asyncio.Context, None] = None, + protocols : typing.Union[ZMQ_PROTOCOLS, str, typing.List[ZMQ_PROTOCOLS]] = ZMQ_PROTOCOLS.IPC, + poll_timeout = 25, **kwargs) -> None: + super().__init__(instance_name=instance_name, server_type=server_type, **kwargs) + + self.identity = f"{instance_name}/rpc-server" + if isinstance(protocols, list): + protocols = protocols + elif isinstance(protocols, str): + protocols = [protocols] + else: + raise TypeError(f"unsupported protocols type : {type(protocols)}") + kwargs["json_serializer"] = self.json_serializer + kwargs["rpc_serializer"] = self.rpc_serializer + self.inproc_server = self.ipc_server = self.tcp_server = self.event_publisher = None + event_publisher_protocol = None + if self.logger is None: + self.logger = get_default_logger('{}|{}|{}|{}'.format(self.__class__.__name__, + 'RPC', 'MIXED', instance_name), kwargs.get('log_level', logging.INFO)) + # contexts and poller + self.context = context or zmq.asyncio.Context() + self.poller = zmq.asyncio.Poller() + self.poll_timeout = poll_timeout + # initialise every externally visible protocol + if ZMQ_PROTOCOLS.TCP in protocols or "TCP" in protocols: + self.tcp_server = AsyncPollingZMQServer(instance_name=instance_name, server_type=server_type, + context=self.context, protocol=ZMQ_PROTOCOLS.TCP, poll_timeout=poll_timeout, **kwargs) + self.poller.register(self.tcp_server.socket, zmq.POLLIN) + event_publisher_protocol = ZMQ_PROTOCOLS.TCP + if ZMQ_PROTOCOLS.IPC in protocols or "IPC" in protocols: + self.ipc_server = AsyncPollingZMQServer(instance_name=instance_name, server_type=server_type, + context=self.context, protocol=ZMQ_PROTOCOLS.IPC, poll_timeout=poll_timeout, **kwargs) + self.poller.register(self.ipc_server.socket, zmq.POLLIN) + event_publisher_protocol = ZMQ_PROTOCOLS.IPC if not 
event_publisher_protocol else event_publisher_protocol + if ZMQ_PROTOCOLS.INPROC in protocols or "INPROC" in protocols: + self.inproc_server = AsyncPollingZMQServer(instance_name=instance_name, server_type=server_type, + context=self.context, protocol=ZMQ_PROTOCOLS.INPROC, poll_timeout=poll_timeout, **kwargs) + self.poller.register(self.inproc_server.socket, zmq.POLLIN) + event_publisher_protocol = ZMQ_PROTOCOLS.INPROC if not event_publisher_protocol else event_publisher_protocol + self.event_publisher = EventPublisher( + instance_name=instance_name + '-event-pub', + protocol=event_publisher_protocol, + rpc_serializer=self.rpc_serializer, + json_serializer=self.json_serializer, + logger=self.logger + ) + # instruction serializing broker + self.inner_inproc_client = AsyncZMQClient( + server_instance_name=f'{instance_name}/inner', + identity=f'{instance_name}/tunneler', + client_type=TUNNELER, + context=self.context, + protocol=ZMQ_PROTOCOLS.INPROC, + handshake=False, # handshake manually done later when event loop is run + logger=self.logger + ) + self.inner_inproc_server = AsyncZMQServer( + instance_name=f'{self.instance_name}/inner', # hardcoded be very careful + server_type=server_type, + context=self.context, + protocol=ZMQ_PROTOCOLS.INPROC, + **kwargs + ) + self._instructions = deque() # type: deque[typing.Tuple[typing.List[bytes], asyncio.Event, asyncio.Future, zmq.Socket]] + self._instructions_event = asyncio.Event() + + + async def handshake_complete(self): + """ + handles inproc client's handshake with ``Thing``'s inproc server + """ + await self.inner_inproc_client.handshake_complete() + + def prepare(self): + """ + registers socket polling method and message tunnelling methods to the running + asyncio event loop + """ + eventloop = asyncio.get_event_loop() + eventloop.call_soon(lambda : asyncio.create_task(self.poll())) + eventloop.call_soon(lambda : asyncio.create_task(self.tunnel_message_to_things())) -class BaseZMQClient(BaseZMQ): - """ - Server to 
client:

+    @property
+    def poll_timeout(self) -> int:
+        """
+        socket polling timeout in milliseconds greater than 0.
+        """
+        return self._poll_timeout
+
+    @poll_timeout.setter
+    def poll_timeout(self, value) -> None:
+        if not isinstance(value, int) or value < 0:
+            raise ValueError("polling period must be a non-negative integer, not {}. Value is considered in milliseconds.".format(value))
+        self._poll_timeout = value
+
+
+    def _get_timeout_from_instruction(self, message : typing.Tuple[bytes]) -> float:
+        """
+        Unlike ``parse_client_message()``, this method only retrieves the timeout parameter
+        """
+        client_type = message[CM_INDEX_CLIENT_TYPE]
+        if client_type == PROXY:
+            return self.rpc_serializer.loads(message[CM_INDEX_TIMEOUT])
+        elif client_type == HTTP_SERVER:
+            return self.json_serializer.loads(message[CM_INDEX_TIMEOUT])
+
+
+    async def poll(self):
+        """
+        poll for instructions and append them to the instructions list to pass them to ``Eventloop``/``Thing``'s inproc 
+        server using an inner inproc client. Registers the messages for timeout calculation.
+        """
+        self.stop_poll = False
+        eventloop = asyncio.get_event_loop()
+        self.inner_inproc_client.handshake()
+        await self.inner_inproc_client.handshake_complete()
+        if self.inproc_server:
+            eventloop.call_soon(lambda : asyncio.create_task(self.recv_instruction(self.inproc_server)))
+        if self.tcp_server:
+            eventloop.call_soon(lambda : asyncio.create_task(self.recv_instruction(self.tcp_server)))
+        if self.ipc_server:
+            eventloop.call_soon(lambda : asyncio.create_task(self.recv_instruction(self.ipc_server)))
+
+
+    async def recv_instruction(self, server : AsyncZMQServer):
+        eventloop = asyncio.get_event_loop()
+        socket = server.socket
+        while True:
+            try:
+                original_instruction = await socket.recv_multipart()
+                if original_instruction[CM_INDEX_MESSAGE_TYPE] == HANDSHAKE:
+                    handshake_task = asyncio.create_task(self._handshake(original_instruction, socket))
+                    eventloop.call_soon(lambda : handshake_task)
+                    continue
+                timeout = self._get_timeout_from_instruction(original_instruction)
+                ready_to_process_event = None
+                timeout_task = None
+                if timeout is not None:
+                    ready_to_process_event = asyncio.Event()
+                    timeout_task = asyncio.create_task(self.process_timeouts(original_instruction,
+                                                                ready_to_process_event, timeout, socket))
+                    eventloop.call_soon(lambda : timeout_task)
+            except Exception as ex:
+                # handle invalid message
+                self.logger.error(f"exception occurred for message id {original_instruction[CM_INDEX_MESSAGE_ID]} - {str(ex)}")
+                invalid_message_task = asyncio.create_task(self._handle_invalid_message(original_instruction,
+                                                                            ex, socket))
+                eventloop.call_soon(lambda: invalid_message_task)
+            else:
+                self._instructions.append((original_instruction, ready_to_process_event,
+                                                            timeout_task, socket))
+                self._instructions_event.set()
+
+
+    async def tunnel_message_to_things(self):
+        """
+        message tunneler between external sockets and internal inproc client
+        """
+        while not self.stop_poll:
+            if len(self._instructions) > 0:
+                message, ready_to_process_event, timeout_task, origin_socket = self._instructions.popleft()
+                timeout = True
+                if ready_to_process_event is not None:
+                    ready_to_process_event.set()
+                    timeout = await timeout_task
+                if ready_to_process_event is None or not timeout:
+                    original_address = message[CM_INDEX_ADDRESS]
+                    message[CM_INDEX_ADDRESS] = self.inner_inproc_client.server_address # replace address
+                    await self.inner_inproc_client.socket.send_multipart(message)
+                    reply = await self.inner_inproc_client.socket.recv_multipart()
+                    reply[SM_INDEX_ADDRESS] = original_address
+                    if reply[SM_INDEX_MESSAGE_TYPE] != ONEWAY:
+                        await origin_socket.send_multipart(reply)
+            else:
+                await self._instructions_event.wait()
+                self._instructions_event.clear()
+
+
+    async def process_timeouts(self, original_client_message : typing.List, ready_to_process_event : asyncio.Event,
+                               timeout : typing.Optional[float], origin_socket : zmq.Socket) -> bool:
+        """
+        replies timeout to client if timeout occurred and prevents the instruction from being executed.
+        """
+        try:
+            await asyncio.wait_for(ready_to_process_event.wait(), timeout)
+            return False
+        except asyncio.TimeoutError:
+            await origin_socket.send_multipart(self.craft_reply_from_arguments(original_client_message[CM_INDEX_ADDRESS],
+                    original_client_message[CM_INDEX_CLIENT_TYPE], TIMEOUT, original_client_message[CM_INDEX_MESSAGE_ID]))
+            return True
+
+    async def _handle_invalid_message(self, original_client_message : typing.List[bytes],
+                                      exception : Exception, originating_socket : zmq.Socket) -> None:
+        await originating_socket.send_multipart(self.craft_reply_from_arguments(
+                original_client_message[CM_INDEX_ADDRESS], original_client_message[CM_INDEX_CLIENT_TYPE],
+                INVALID_MESSAGE, original_client_message[CM_INDEX_MESSAGE_ID], exception))
+        self.logger.info(f"sent exception message to client '{original_client_message[CM_INDEX_ADDRESS]}'."
+ + f" exception - {str(exception)}") + + async def _handshake(self, original_client_message: builtins.list[builtins.bytes], + originating_socket : zmq.Socket) -> None: + await originating_socket.send_multipart(self.craft_reply_from_arguments( + original_client_message[CM_INDEX_ADDRESS], + original_client_message[CM_INDEX_CLIENT_TYPE], HANDSHAKE, original_client_message[CM_INDEX_MESSAGE_ID], + EMPTY_DICT)) + self.logger.info("sent handshake to client '{}'".format(original_client_message[CM_INDEX_ADDRESS])) - [address, bytes(), message_type, msg id, content] - Reply - [client address, bytes, REPLY, msg id, content] + def exit(self): + self.stop_poll = True + for socket in list(self.poller._map.keys()): # iterating over keys will cause dictionary size change during iteration + try: + self.poller.unregister(socket) + except Exception as ex: + self.logger.warn(f"could not unregister socket from polling - {str(ex)}") # does not give info about socket + try: + self.inproc_server.exit() + self.ipc_server.exit() + self.tcp_server.exit() + self.inner_inproc_client.exit() + except: + pass + self.context.term() + self.logger.info("terminated context of socket '{}' of type '{}'".format(self.identity, self.__class__)) - Execution Context Definitions: - "plain_reply" - "fetch_execution_logs" + + +class BaseZMQClient(BaseZMQ): + """ + Base class for all ZMQ clients irrespective of sync and async. 
+ + server's reply to client + :: + + [address, bytes(), server type , message_type, message id, content or response or reply] + [ 0 , 1 , 2 , 3 , 4 , 5 ] + + Parameters + ---------- + server_instance_name: str + The instance name of the server (or ``Thing``) + client_type: str + RPC or HTTP Server + **kwargs: + rpc_serializer: BaseSerializer + custom implementation of RPC serializer if necessary + json_serializer: JSONSerializer + custom implementation of JSON serializer if necessary """ - def __init__(self, server_address : Union[bytes, None], server_instance_name : Union[str, None], - client_type : bytes, **kwargs) -> None: - if client_type in [PROXY, HTTP_SERVER]: + def __init__(self, *, + server_instance_name : str, client_type : bytes, + server_type : typing.Union[bytes, str, Enum] = ServerTypes.UNKNOWN_TYPE, + json_serializer : typing.Union[None, JSONSerializer] = None, + rpc_serializer : typing.Union[str, BaseSerializer, None] = None, + logger : typing.Optional[logging.Logger] = None, + **kwargs + ) -> None: + if client_type in [PROXY, HTTP_SERVER, TUNNELER]: self.client_type = client_type else: - raise ValueError("invalid client type for {}. Given option {}".format(self.__class__, client_type)) - self.proxy_serializer = None - self.json_serializer = None - if client_type == HTTP_SERVER: - json_serializer = kwargs.get("json_serializer", None) - if json_serializer is None or isinstance(json_serializer, JSONSerializer): - self.json_serializer = json_serializer or JSONSerializer() - else: - raise ValueError("invalid JSON serializer option for {}. 
Given option {}".format(self.__class__, json_serializer)) - else: - proxy_serializer = kwargs.get("proxy_serializer", None) - if proxy_serializer is None or isinstance(proxy_serializer, (PickleSerializer, SerpentSerializer, - JSONSerializer)):#, DillSerializer)): - self.proxy_serializer = proxy_serializer or SerpentSerializer() - elif isinstance(proxy_serializer, str) and proxy_serializer in serializers.keys(): - self.proxy_serializer = serializers[proxy_serializer]() - else: - raise ValueError("invalid proxy serializer option for {}. Given option {}".format(self.__class__, proxy_serializer)) - self.server_address = server_address - self.server_instance_name = server_instance_name - self.server_type = ServerTypes.UNKNOWN_TYPE + raise ValueError("Invalid client type for {}. Given option {}.".format(self.__class__, client_type)) + if server_instance_name: + self.server_address = bytes(server_instance_name, encoding='utf-8') + self.instance_name = server_instance_name + self.rpc_serializer, self.json_serializer = _get_serializer_from_user_given_options( + rpc_serializer=rpc_serializer, + json_serializer=json_serializer + ) + if isinstance(server_type, bytes): + self.server_type = server_type + elif isinstance(server_type, Enum): + self.server_type = server_type.value + else: + self.server_type = bytes(server_type, encoding='utf-8') + self.logger = logger + self._monitor_socket = None + self._reply_cache = dict() super().__init__() - def parse_server_message(self, message : List[bytes]) -> Any: + def raise_local_exception(self, exception : typing.Dict[str, typing.Any]) -> None: + """ + raises an exception on client side using an exception from server by mapping it to the correct one based on type. 
+ + Parameters + ---------- + exception: Dict[str, Any] + exception dictionary made by server with following keys - type, message, traceback, notes + """ - Server to client: + if isinstance(exception, Exception): + raise exception from None + exc = getattr(builtins, exception["type"], None) + message = exception["message"] + if exc is None: + ex = Exception(message) + else: + ex = exc(message) + exception["traceback"][0] = f"Server {exception['traceback'][0]}" + ex.__notes__ = exception["traceback"][0:-1] + raise ex from None + - [address, bytes(), message_type, msg id, content or response or reply] - [0 1 , 2 , 3 , 4 ] + def parse_server_message(self, message : typing.List[bytes], raise_client_side_exception : bool = False, + deserialize : bool = True) -> typing.Any: """ - message_type = message[3] + server's message to client: + + :: + [address, bytes(), server type , message_type, message id, content or response or reply] + [ 0 , 1 , 2 , 3 , 4 , 5 ] + + Parameters + ---------- + message: List[bytes] + message sent be server + raise_client_side_exception: bool + raises exception from server on client + + Raises + ------ + NotImplementedError: + if message type is not reply, handshake or invalid + """ + if len(message) == 2: # socket monitor message, not our message + try: + if ZMQ_EVENT_MAP[parse_monitor_message(message)['event']] == SERVER_DISCONNECTED: + raise ConnectionAbortedError(f"server disconnected for {self.instance_name}") + return None # None should simply continue the message receive logic + except RuntimeError as ex: + raise RuntimeError(f'message received from monitor socket cannot be deserialized for {self.instance_name}') from None + message_type = message[SM_INDEX_MESSAGE_TYPE] if message_type == REPLY: - if self.client_type == HTTP_SERVER: - message[5] = self.json_serializer.loads(message[5]) # type: ignore - elif self.client_type == PROXY: - message[5] = self.proxy_serializer.loads(message[5]) # type: ignore + if deserialize: + if 
self.client_type == HTTP_SERVER:
+                    message[SM_INDEX_DATA] = self.json_serializer.loads(message[SM_INDEX_DATA]) # type: ignore
+                elif self.client_type == PROXY:
+                    message[SM_INDEX_DATA] = self.rpc_serializer.loads(message[SM_INDEX_DATA]) # type: ignore
            return message
        elif message_type == HANDSHAKE:
            self.logger.debug("""handshake messages arriving out of order are silently dropped as receiving this message
-                means handshake was successful before. Received hanshake from {}""".format(message[0]))
-        elif message_type == INVALID_MESSAGE:
-            raise Exception("Invalid message sent")
-        elif message_type == EVENT_SUBSCRIPTION:
-            return message[0], message[3], message[4]
+                means handshake was successful before. Received handshake from {}""".format(message[0]))
+        elif message_type == EXCEPTION or message_type == INVALID_MESSAGE:
+            if self.client_type == HTTP_SERVER:
+                message[SM_INDEX_DATA] = self.json_serializer.loads(message[SM_INDEX_DATA]) # type: ignore
+            elif self.client_type == PROXY:
+                message[SM_INDEX_DATA] = self.rpc_serializer.loads(message[SM_INDEX_DATA]) # type: ignore
+            if not raise_client_side_exception:
+                return message
+            if message[SM_INDEX_DATA].get('exception', None) is not None:
+                self.raise_local_exception(message[SM_INDEX_DATA]['exception'])
+            else:
+                raise NotImplementedError("message type {} received. No exception field found, exception field mandatory.".format(
+                    message_type))
+        elif message_type == TIMEOUT:
+            exception = TimeoutError("message timed out.")
+            if raise_client_side_exception:
+                raise exception from None
+            message[SM_INDEX_DATA] = format_exception_as_json(exception)
+            return message
        else:
            raise NotImplementedError("Unknown message type {} received. This message cannot be handled.".format(message_type))

-    def craft_instruction_from_arguments(self, instruction : str, arguments : Dict[str, Any], # type: ignore
-                                         context : Dict[str, Any]) -> List[bytes]: # type: ignore
+
+    def craft_instruction_from_arguments(self, instruction : str, arguments : typing.Dict[str, typing.Any] = EMPTY_DICT,
+                timeout : typing.Optional[float] = None, context : typing.Dict[str, typing.Any] = EMPTY_DICT) -> typing.List[bytes]:
        """
-        message from client to server :
+        message from client to server:

-        [address, bytes(), client type, message type, msg id, instruction, arguments]
-        [   0   ,   1    ,     2      ,      3      ,   4   ,      5     ,     6    ]
+        ::
+            [address, bytes(), client type, message type, message id, instruction, arguments]
+            [   0   ,   1    ,     2      ,      3      ,     4     ,      5     ,     6    ]

-        Handshake
-        [client address, bytes, client_type, HANDSHAKE]
        """
-        message_id = bytes(str(hash(instruction + current_datetime_ms_str())), encoding='utf-8')
+        message_id = bytes(str(uuid4()), encoding='utf-8')
        if self.client_type == HTTP_SERVER:
-            instruction : bytes = self.json_serializer.dumps(instruction)
-            context : bytes = self.json_serializer.dumps(context)
+            timeout = self.json_serializer.dumps(timeout) # type: bytes
+            instruction = self.json_serializer.dumps(instruction) # type: bytes
+            # TODO - following can be improved
            if arguments == b'':
-                arguments : bytes = self.json_serializer.dumps({})
-            elif not isinstance(arguments, bytes):
-                arguments : bytes = self.json_serializer.dumps(arguments)
-        else:
-            instruction : bytes = self.proxy_serializer.dumps(instruction)
-            context : bytes = self.proxy_serializer.dumps(context)
-            arguments : bytes = self.proxy_serializer.dumps(arguments)
+                arguments = self.json_serializer.dumps({}) # type: bytes
+            elif not isinstance(arguments, byte_types):
+                arguments = self.json_serializer.dumps(arguments) # type: bytes
+            context = self.json_serializer.dumps(context) # type: bytes
+        elif self.client_type == PROXY:
+            timeout = self.rpc_serializer.dumps(timeout) # type: bytes
+            instruction = self.rpc_serializer.dumps(instruction) # type: bytes
+            if not isinstance(arguments, byte_types):
+                arguments = self.rpc_serializer.dumps(arguments) # type: bytes
+            context = self.rpc_serializer.dumps(context)
+        return [ self.server_address, EMPTY_BYTE, self.client_type, INSTRUCTION, message_id,
+                timeout, instruction, arguments, context ]
    
-    def craft_message(self, message_type : bytes):
+
+    def craft_empty_message_with_type(self, message_type : bytes = HANDSHAKE):
+        """
+        craft an empty message of the given type, for example a handshake message
+        """
        return [
-            self.server_address,
+            self.server_address,
            EMPTY_BYTE,
            self.client_type,
            message_type,
            EMPTY_BYTE,
            EMPTY_BYTE,
            EMPTY_BYTE,
+            EMPTY_BYTE,
+            EMPTY_BYTE
        ]
-    
-    
+    
+    def exit(self) -> None:
+        BaseZMQ.exit(self)
+        try:
+            self.poller.unregister(self.socket)
+            if self._monitor_socket is not None:
+                self.poller.unregister(self._monitor_socket)
+        except Exception as ex:
+            self.logger.warn(f"unable to deregister from poller - {str(ex)}")
+
+        try:
+            if self._monitor_socket is not None:
+                self._monitor_socket.close(0)
+            self.socket.close(0)
+            self.logger.info("terminated socket of server '{}' of type '{}'".format(self.identity, self.__class__))
+        except Exception as ex:
+            self.logger.warn("could not properly terminate socket or attempted to terminate an already terminated " +
+                            f"socket '{self.identity}' of type '{self.__class__}'. Exception message : {str(ex)}")
+        try:
+            if self._terminate_context:
+                self.context.term()
+                self.logger.info("terminated context of socket '{}' of type '{}'".format(self.identity, self.__class__))
+        except Exception as ex:
+            self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context" +
+                            "'{}'. Exception message : {}".format(self.identity, str(ex)))
+
+

class SyncZMQClient(BaseZMQClient, BaseSyncZMQ):
+    """
+    Synchronous ZMQ client that connects to a sync or async server, based on the ZMQ protocol. Works like a REQ-REP socket.
+    Each request blocks until a response is received. Suitable for most purposes. 
+
+    Parameters
+    ----------
+    server_instance_name: str
+        The instance name of the server (or ``Thing``)
+    identity: str
+        Unique identity of the client to receive messages from the server. Each client connecting to the same server must 
+        still have a unique identity. 
+    client_type: str
+        RPC or HTTP Server
+    handshake: bool
+        when True, handshake with the server before allowing the first message and block until that handshake is
+        accomplished.
+    protocol: str | Enum, TCP, IPC or INPROC, default IPC
+        protocol implemented by the server
+    **kwargs:
+        socket_address: str
+            socket address for connecting to TCP server
+        rpc_serializer:
+            custom implementation of RPC serializer if necessary
+        json_serializer:
+            custom implementation of JSON serializer if necessary
+    """

    def __init__(self, server_instance_name : str, identity : str, client_type = HTTP_SERVER, 
-                handshake : bool = True, protocol : str = "IPC", context : Union[zmq.asyncio.Context, None] = None,
-                **serializer) -> None:
-        BaseZMQClient.__init__(self, server_address = bytes(server_instance_name, encoding='utf-8'),
-                            server_instance_name = server_instance_name, client_type = client_type, **serializer)
+                handshake : bool = True, protocol : str = ZMQ_PROTOCOLS.IPC, 
+                context : typing.Union[zmq.asyncio.Context, None] = None,
+                **kwargs) -> None:
+        BaseZMQClient.__init__(self, server_instance_name=server_instance_name,
+                            client_type=client_type, **kwargs)
        BaseSyncZMQ.__init__(self)
-        self.create_socket(context or zmq.Context(), server_instance_name, identity)
+        self.create_socket(identity=identity, context=context, protocol=protocol, **kwargs)
+        self.poller = zmq.Poller()
+        self.poller.register(self.socket, zmq.POLLIN)
        self._terminate_context = context == None
        if handshake:
-            self.handshake()
+            self.handshake(kwargs.pop("handshake_timeout", 60000))
    
-    def send_instruction(self, instruction : str, arguments : Dict[str, Any] = EMPTY_DICT, 
-                        context : Dict[str, Any] = EMPTY_DICT) -> bytes:
-        message = self.craft_instruction_from_arguments(instruction, arguments, context)
+    def send_instruction(self, instruction : str, arguments : typing.Dict[str, typing.Any] = EMPTY_DICT, 
+                        invokation_timeout : typing.Optional[float] = None, execution_timeout : typing.Optional[float] = None, 
+                        context : typing.Dict[str, typing.Any] = EMPTY_DICT, 
+                        argument_schema : typing.Optional[JSON] = None) -> bytes:
+        """
+        send message to server.
+
+        client's message to server:
+        ::
+
+            [address, bytes(), client type, message type, message id, 
+            [   0   ,   1    ,     2      ,      3      ,     4     , 
+
+            timeout, instruction, arguments, execution context] 
+               5   ,      6     ,     7    ,         8        ]
+
+        Execution Context Definitions (typing.Dict[str, typing.Any] or JSON):
+            - "plain_reply" - does not return state
+            - "fetch_execution_logs" - fetches logs accumulated during execution
+
+        Parameters
+        ----------
+        instruction: str
+            unique str identifying a server side or ``Thing`` resource. These values correspond 
+            to an automatically extracted name from the object name or the URL_path prepended with the instance name. 
+        arguments: Dict[str, Any]
+            if the instruction invokes a method, arguments of that method.
+ context: Dict[str, Any] + see execution context definitions + + Returns + ------- + message id : bytes + a byte representation of message id + """ + message = self.craft_instruction_from_arguments(instruction, arguments, invokation_timeout, context) + # if argument_schema: + # jsonschema.validate(arguments, argument_schema) self.socket.send_multipart(message) - self.logger.debug("sent instruction '{}' to server '{}' with msg-id {}".format(instruction, self.server_instance_name, - message[4])) - return message[4] + self.logger.debug(f"sent instruction '{instruction}' to server '{self.instance_name}' with msg-id '{message[SM_INDEX_MESSAGE_ID]}'") + return message[SM_INDEX_MESSAGE_ID] - def recv_reply(self, raise_client_side_exception : bool = False) -> Sequence[Union[bytes, Dict[str, Any]]]: + def recv_reply(self, message_id : bytes, timeout : typing.Optional[int] = None, raise_client_side_exception : bool = False, + deserialize : bool = True) -> typing.List[typing.Union[bytes, typing.Dict[str, typing.Any]]]: + """ + Receives reply from server. Messages are identified by message id, so call this method immediately after + calling ``send_instruction()`` to avoid receiving messages out of order. Or, use other methods like + ``execute()``, ``read_attribute()`` or ``write_attribute()``. + + Parameters + ---------- + raise_client_side_exception: bool, default False + if True, any exceptions raised during execution inside ``Thing`` instance will be raised on the client. 
+ See docs of ``raise_local_exception()`` for info on exception + """ while True: - reply = self.parse_server_message(self.socket.recv_multipart()) # type: ignore - if reply: - self.logger.debug("received reply with msg-id {}".format(reply[3])) - if reply[5].get('exception', None) is not None and raise_client_side_exception: - exc_info = reply[5]['exception'] - raise Exception("traceback : {},\nmessage : {},\ntype : {}".format('\n'.join(exc_info["traceback"]), - exc_info['message'], exc_info["type"])) + sockets = self.poller.poll(timeout) + reply = None + for socket, _ in sockets: + try: + message = socket.recv_multipart(zmq.NOBLOCK) + reply = self.parse_server_message(message, raise_client_side_exception, deserialize) # type: ignore + except zmq.Again: + pass + if reply: + if message_id != reply[SM_INDEX_MESSAGE_ID]: + self._reply_cache[message_id] = reply + continue + self.logger.debug("received reply with msg-id {}".format(reply[SM_INDEX_MESSAGE_ID])) return reply + if timeout is not None: + break # this should not break, technically an error, should be fixed when inventing better RPC contract - def execute(self, instruction : str, arguments : Dict[str, Any] = EMPTY_DICT, context : Dict[str, Any] = EMPTY_DICT, - raise_client_side_exception : bool = False) -> Any: - self.send_instruction(instruction, arguments, context) - return self.recv_reply(raise_client_side_exception) - - def read_attribute(self, attribute_url : str, context : Dict[str, Any] = EMPTY_DICT, - raise_client_side_exception : bool = False) -> Any: - return self.execute(attribute_url+'/read', EMPTY_DICT, context, raise_client_side_exception) + def execute(self, instruction : str, arguments : typing.Dict[str, typing.Any] = EMPTY_DICT, + invokation_timeout : typing.Optional[float] = None, execution_timeout : typing.Optional[float] = None, + context : typing.Dict[str, typing.Any] = EMPTY_DICT, raise_client_side_exception : bool = False, + argument_schema : typing.Optional[JSON] = None, + 
 deserialize_reply : bool = True) -> typing.List[typing.Union[bytes, typing.Dict[str, typing.Any]]]:
+ """
+ send an instruction and receive the reply for it.
+
+ Parameters
+ ----------
+ instruction: str
+ unique str identifying a server side or ``Thing`` resource. These values correspond
+ to the automatically extracted name from the object name or the URL_path prepended with the instance name.
+ arguments: Dict[str, Any]
+ if the instruction invokes a method, arguments of that method.
+ context: Dict[str, Any]
+ see execution context definitions
+
+ Returns
+ -------
+ message id : bytes
+ a byte representation of message id
+ """
+ msg_id = self.send_instruction(instruction, arguments, invokation_timeout, execution_timeout, context, argument_schema)
+ return self.recv_reply(msg_id, raise_client_side_exception=raise_client_side_exception, deserialize=deserialize_reply)
- def write_attribute(self, attribute_url : str, value : Any, context : Dict[str, Any] = EMPTY_DICT,
- raise_client_side_exception : bool = False) -> Any:
- return self.execute(attribute_url+'/read', {"value" : value}, context, raise_client_side_exception)
- def handshake(self) -> None:
- poller = zmq.Poller()
- poller.register(self.socket, zmq.POLLIN)
+ def handshake(self, timeout : typing.Union[float, int] = 60000) -> None:
+ """
+ handshake with server before sending first message
+ """
+ start_time = time.time_ns()
 while True:
- self.socket.send_multipart(self.craft_message(HANDSHAKE))
- self.logger.debug("sent Handshake to server '{}'".format(self.server_instance_name))
- if poller.poll(500):
+ if timeout is not None and (time.time_ns() - start_time)/1e6 > timeout:
+ raise ConnectionError(f"Unable to contact server '{self.instance_name}' from client '{self.identity}'")
+ self.socket.send_multipart(self.craft_empty_message_with_type(HANDSHAKE))
+ self.logger.info(f"sent Handshake to server '{self.instance_name}'")
+ if self.poller.poll(500):
 try:
 message = self.socket.recv_multipart(zmq.NOBLOCK)
except zmq.Again: pass else: if message[3] == HANDSHAKE: # type: ignore - self.logger.info("client '{}' handshook with server '{}'".format(self.identity, - self.server_instance_name)) - self.server_type = ServerTypes._value2member_map_[message[2]] # type: ignore + self.logger.info(f"client '{self.identity}' handshook with server '{self.instance_name}'") + self.server_type = message[SM_INDEX_SERVER_TYPE] break else: - raise ValueError('Handshake cannot be done. Another message arrived before handshake complete.') - poller.unregister(self.socket) - del poller - - def exit(self) -> None: - try: - self.socket.close(0) - self.logger.info("terminated socket of server '{}' of type '{}'".format(self.identity, self.__class__)) - except Exception as E: - self.logger.warn("could not properly terminate socket or attempted to terminate an already terminated socket '{}'of type '{}'. Exception message : {}".format( - self.identity, self.__class__, str(E))) - try: - if self._terminate_context: - self.context.term() - self.logger.info("terminated context of socket '{}' of type '{}'".format(self.identity, self.__class__)) - except Exception as E: - self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context '{}'. Exception message : {}".format( - self.identity, str(E))) - - + raise ConnectionAbortedError(f"Handshake cannot be done with '{self.instance_name}'. Another message arrived before handshake complete.") + else: + self.logger.info('got no reply') + self._monitor_socket = self.socket.get_monitor_socket() + self.poller.register(self._monitor_socket, zmq.POLLIN) + # sufficient to know when server dies only while receiving messages, not continuous polling + + class AsyncZMQClient(BaseZMQClient, BaseAsyncZMQ): - """ Asynchronous client to talk to a ZMQ server where the server is identified by the instance name. The identity of the client needs to be different from the server, unlike the ZMQ Server. 
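The handshake above keeps re-sending a HANDSHAKE frame and polling briefly until the server acknowledges or a deadline passes. The pattern can be sketched independently of ZMQ; here `poll`, `recv` and `send` are hypothetical stand-ins for the socket operations, not the library's API:

```python
import time

def handshake(poll, recv, send, timeout_ms=60000, poll_interval_ms=500):
    """Re-send a handshake frame until acknowledged or the deadline passes."""
    start = time.monotonic()
    while True:
        # deadline check happens before each retry, mirroring the loop above
        if timeout_ms is not None and (time.monotonic() - start) * 1000 > timeout_ms:
            raise ConnectionError("unable to contact server before timeout")
        send(b"HANDSHAKE")
        if poll(poll_interval_ms):      # wait briefly for any reply
            msg = recv()
            if msg == b"HANDSHAKE":
                return msg              # server acknowledged
            raise ConnectionAbortedError("another message arrived before handshake completed")
```

Re-sending on every round is what makes the loop robust to a server that starts late: the first frames are simply lost and a later one is answered.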
The client will also perform handshakes @@ -635,164 +1529,228 @@ class AsyncZMQClient(BaseZMQClient, BaseAsyncZMQ): """ def __init__(self, server_instance_name : str, identity : str, client_type = HTTP_SERVER, - handshake : bool = True, protocol : str = "IPC", context : Union[zmq.asyncio.Context, None] = None, - **serializer) -> None: - BaseZMQClient.__init__(self, server_address = bytes(server_instance_name, encoding='utf-8'), - server_instance_name = server_instance_name, client_type = client_type, **serializer) + handshake : bool = True, protocol : str = "IPC", context : typing.Union[zmq.asyncio.Context, None] = None, + **kwargs) -> None: + BaseZMQClient.__init__(self, server_instance_name=server_instance_name, client_type=client_type, **kwargs) BaseAsyncZMQ.__init__(self) - self.create_socket(context, server_instance_name, identity) + self.create_socket(context=context, identity=identity, protocol=protocol, **kwargs) + self.poller = zmq.asyncio.Poller() + self.poller.register(self.socket, zmq.POLLIN) self._terminate_context = context == None - self.handshake_event = asyncio.Event() + self._handshake_event = asyncio.Event() + self._handshake_event.clear() if handshake: - self.handshake() + self.handshake(kwargs.pop("handshake_timeout", 60000)) - def handshake(self) -> None: - run_coro_somehow(self._handshake()) + def handshake(self, timeout : typing.Optional[int] = 60000) -> None: + """ + automatically called when handshake argument at init is True. When not automatically called, it is necessary + to call this method before awaiting ``handshake_complete()``. 
+ """ + run_callable_somehow(self._handshake(timeout)) - async def _handshake(self) -> None: - self.handshake_event.clear() - poller = zmq.asyncio.Poller() - poller.register(self.socket, zmq.POLLIN) + async def _handshake(self, timeout : typing.Union[float, int] = 60000) -> None: + """ + hanshake with server before sending first message + """ + if self._monitor_socket is not None and self._monitor_socket in self.poller: + self.poller.unregister(self._monitor_socket) + self._handshake_event.clear() + start_time = time.time_ns() while True: - await self.socket.send_multipart(self.craft_message(HANDSHAKE)) - self.logger.debug("sent Handshake to server '{}'".format(self.server_instance_name)) - if await poller.poll(500): + if timeout is not None and (time.time_ns() - start_time)/1e6 > timeout: + raise ConnectionError(f"Unable to contact server '{self.instance_name}' from client '{self.identity}'") + await self.socket.send_multipart(self.craft_empty_message_with_type(HANDSHAKE)) + self.logger.info(f"sent Handshake to server '{self.instance_name}'") + if await self.poller.poll(500): try: - msg = await self.socket.recv_multipart(zmq.NOBLOCK) + message = await self.socket.recv_multipart(zmq.NOBLOCK) except zmq.Again: pass else: - if msg[3] == HANDSHAKE: - self.logger.info("client '{}' handshook with server '{}'".format(self.identity,self.server_instance_name)) - self.server_type = ServerTypes._value2member_map_[msg[2]] + if message[3] == HANDSHAKE: # type: ignore + self.logger.info(f"client '{self.identity}' handshook with server '{self.instance_name}'") + self.server_type = message[SM_INDEX_SERVER_TYPE] break else: - self.logger.info("handshake cannot be done with server '{}'. another message arrived before handshake complete.".format(self.server_instance_name)) - self.handshake_event.set() - poller.unregister(self.socket) - del poller - + raise ConnectionAbortedError(f"Handshake cannot be done with '{self.instance_name}'. 
 Another message arrived before handshake complete.")
+ else:
+ self.logger.info('got no reply')
+ self._handshake_event.set()
+ self._monitor_socket = self.socket.get_monitor_socket()
+ self.poller.register(self._monitor_socket, zmq.POLLIN)
+ async def handshake_complete(self):
- await self.handshake_event.wait()
+ """
+ wait for handshake to complete
+ """
+ await self._handshake_event.wait()
- async def async_send_instruction(self, instruction : str, arguments : Dict[str, Any] = EMPTY_DICT,
- context : Dict[str, Any] = EMPTY_DICT) -> bytes:
- message = self.craft_instruction_from_arguments(instruction, arguments, context)
+ async def async_send_instruction(self, instruction : str, arguments : typing.Dict[str, typing.Any] = EMPTY_DICT,
+ invokation_timeout : typing.Optional[float] = None, execution_timeout : typing.Optional[float] = None,
+ context : typing.Dict[str, typing.Any] = EMPTY_DICT,
+ argument_schema : typing.Optional[JSON] = None) -> bytes:
+ """
+ send message to server.
+
+ client's message to server:
+ ::
+ [address, bytes(), client type, message type, message id,
+ [ 0 , 1 , 2 , 3 , 4 ,
+
+ timeout, instruction, arguments, execution context]
+ 5 , 6 , 7 , 8 ]
+
+ Execution Context Definitions (typing.Dict[str, typing.Any] or JSON):
+ - "plain_reply" - does not return state
+ - "fetch_execution_logs" - fetches logs that were accumulated during execution
+
+ Parameters
+ ----------
+ instruction: str
+ unique str identifying a server side or ``Thing`` resource. These values correspond
+ to the automatically extracted name from the object name or the URL_path prepended with the instance name.
+ arguments: Dict[str, Any]
+ if the instruction invokes a method, arguments of that method.
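The nine-frame layout documented in the docstring above can be made concrete with a small sketch. The `SM_INDEX_*` names and the `craft_instruction_message` helper here are illustrative of the index scheme only, not the library's actual constants or serializers; plain JSON stands in for the configurable serializer:

```python
import json
import uuid

# index constants mirroring the 9-part client message layout (illustrative names)
SM_INDEX_ADDRESS = 0
SM_INDEX_CLIENT_TYPE = 2
SM_INDEX_MESSAGE_TYPE = 3
SM_INDEX_MESSAGE_ID = 4
SM_INDEX_TIMEOUT = 5
SM_INDEX_INSTRUCTION = 6
SM_INDEX_ARGUMENTS = 7
SM_INDEX_EXECUTION_CONTEXT = 8

def craft_instruction_message(server, client_type, instruction, arguments, timeout_ms):
    """Assemble a multipart frame list in the layout above; every frame is bytes."""
    return [
        server,                                   # 0 - server address
        b"",                                      # 1 - empty delimiter frame
        client_type,                              # 2 - e.g. b'HTTP_SERVER'
        b"INSTRUCTION",                           # 3 - message type
        uuid.uuid4().hex.encode(),                # 4 - unique message id
        json.dumps(timeout_ms).encode(),          # 5 - invokation timeout
        instruction.encode(),                     # 6 - instruction
        json.dumps(arguments).encode(),           # 7 - arguments
        json.dumps({"fetch_execution_logs": False}).encode(),  # 8 - execution context
    ]
```

Indexing frames through named constants instead of bare integers is what lets the rest of the diff replace literals like `message[4]` with `message[SM_INDEX_MESSAGE_ID]`.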
+ context: Dict[str, Any] + see execution context definitions + + Returns + ------- + message id : bytes + a byte representation of message id + """ + message = self.craft_instruction_from_arguments(instruction, arguments, invokation_timeout, context) + # if argument_schema: + # jsonschema.validate(arguments, argument_schema) await self.socket.send_multipart(message) - self.logger.debug("sent instruction '{}' to server '{}' with msg-id {}".format(instruction, self.server_instance_name, - message[3])) - return message[4] + self.logger.debug(f"sent instruction '{instruction}' to server '{self.instance_name}' with msg-id {message[SM_INDEX_MESSAGE_ID]}") + return message[SM_INDEX_MESSAGE_ID] - async def async_recv_reply(self, raise_client_side_exception) -> Sequence[Union[bytes, Dict[str, Any]]]: + async def async_recv_reply(self, message_id : bytes, timeout : typing.Optional[int] = None, + raise_client_side_exception : bool = False, deserialize : bool = True) -> typing.List[ + typing.Union[bytes, typing.Dict[str, typing.Any]]]: + """ + Receives reply from server. Messages are identified by message id, so call this method immediately after + calling ``send_instruction()`` to avoid receiving messages out of order. Or, use other methods like + ``execute()``, ``read_attribute()`` or ``write_attribute()``. + + Parameters + ---------- + raise_client_side_exception: bool, default False + if True, any exceptions raised during execution inside ``Thing`` instance will be raised on the client. 
+ See docs of ``raise_local_exception()`` for info on exception + """ while True: - reply = self.parse_server_message(await self.socket.recv_multipart())# [2] # type: ignore + sockets = await self.poller.poll(timeout) + reply = None + for socket, _ in sockets: + try: + message = await socket.recv_multipart(zmq.NOBLOCK) + reply = self.parse_server_message(message, raise_client_side_exception, deserialize) # type: ignore + except zmq.Again: + pass if reply: - self.logger.debug("received reply with message-id {}".format(reply[3])) - if reply[5].get('exception', None) is not None and raise_client_side_exception: - exc_info = reply[5]['exception'] - raise Exception("traceback : {},\nmessage : {},\ntype : {}".format('\n'.join(exc_info["traceback"]), - exc_info['message'], exc_info["type"])) + if message_id != reply[SM_INDEX_MESSAGE_ID]: + self._reply_cache[message_id] = reply + continue + self.logger.debug(f"received reply with message-id '{reply[SM_INDEX_MESSAGE_ID]}'") return reply + if timeout is not None: + break - async def async_execute(self, instruction : str, arguments : Dict[str, Any] = EMPTY_DICT, context : Dict[str, Any] = EMPTY_DICT, - raise_client_side_exception = False): - await self.async_send_instruction(instruction, arguments, context) - return await self.async_recv_reply(raise_client_side_exception) - - async def read_attribute(self, attribute_url : str, context : Dict[str, Any] = EMPTY_DICT, - raise_client_side_exception : bool = False) -> Any: - return await self.async_execute(attribute_url+'/read', EMPTY_DICT, context, raise_client_side_exception) - - async def write_attribute(self, attribute_url : str, value : Any, context : Dict[str, Any] = EMPTY_DICT, - raise_client_side_exception : bool = False) -> Any: - return self.async_execute(attribute_url+'/write', {"value" : value}, context, raise_client_side_exception) - - def exit(self) -> None: - try: - self.socket.close(0) - self.logger.info("terminated socket of server '{}' of type 
'{}'".format(self.identity, self.__class__)) - except Exception as E: - self.logger.warn("could not properly terminate socket or attempted to terminate an already terminated socket '{}'of type '{}'. Exception message : {}".format( - self.identity, self.__class__, str(E))) - try: - if self._terminate_context: - self.context.term() - self.logger.info("terminated context of socket '{}' of type '{}'".format(self.identity, self.__class__)) - except Exception as E: - self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context '{}'. Exception message : {}".format( - self.identity, str(E))) - - - -class AsyncZMQClientPool(BaseZMQClient): - - def __init__(self, server_instance_names : List[str], identity : str, client_type = HTTP_SERVER, - protocol : str = 'IPC', **serializer) -> None: - self.identity = identity - self.context = zmq.asyncio.Context() - self.pool : Dict[str, AsyncZMQClient] = dict() - for instance_name in server_instance_names: - self.pool[instance_name] = AsyncZMQClient(server_instance_name = instance_name, - identity = identity, client_type = client_type, handshake = True, protocol = protocol, - context = self.context, **serializer) - self.poller = zmq.asyncio.Poller() - for client in self.pool.values(): - self.poller.register(client.socket, zmq.POLLIN) - self.logger = self.get_logger(identity, 'pooled', logging.DEBUG) - # Both the client pool as well as the individual client get their serializers and client_types - # This is required to implement pool level sending and receiving messages like polling of pool of sockets - super().__init__(server_address = None, server_instance_name = None, client_type = client_type, **serializer) - - async def poll(self) -> None : - raise NotImplementedError("implement poll function for AsyncZMQClientPool subclass {}".format(self.__class__)) - - def register_client(self, instance_name : str, protocol : str = 'IPC'): - if instance_name not in self.pool.keys(): - 
self.pool[instance_name] = AsyncZMQClient(server_instance_name = instance_name, - identity = self.identity, client_type = self.client_type, handshake = True, protocol = protocol, - context = self.context, proxy_serializer = self.proxy_serializer, json_serializer = self.json_serializer) - else: - raise ValueError("client already present in pool") - - def __contains__(self, name : str) -> bool: - return name in self.pool - - def __getitem__(self, key) ->AsyncZMQClient: - return self.pool[key] - - def __iter__(self) -> Iterator[AsyncZMQClient]: - return iter(self.pool.values()) - - def exit(self) -> None: - for client in self.pool.values(): - self.poller.unregister(client.socket) - client.exit() - self.logger.info("all client socket unregistered from pool for '{}'".format(self.__class__)) - try: - self.context.term() - self.logger.info("context terminated for '{}'".format(self.__class__)) - except: - pass + async def async_execute(self, instruction : str, arguments : typing.Dict[str, typing.Any] = EMPTY_DICT, + invokation_timeout : typing.Optional[float] = None, execution_timeout : typing.Optional[float] = None, + context : typing.Dict[str, typing.Any] = EMPTY_DICT, + raise_client_side_exception = False, argument_schema : typing.Optional[JSON] = None, + deserialize_reply : bool = True) -> typing.List[typing.Union[bytes, typing.Dict[str, typing.Any]]]: + """ + send an instruction and receive the reply for it. + + Parameters + ---------- + instruction: str + unique str identifying a server side or ``Thing`` resource. These values corresponding + to automatically extracted name from the object name or the URL_path prepended with the instance name. + arguments: Dict[str, Any] + if the instruction invokes a method, arguments of that method. 
+ context: Dict[str, Any] + see execution context definitions + + Returns + ------- + message id : bytes + a byte representation of message id + """ + msg_id = await self.async_send_instruction(instruction, arguments, invokation_timeout, execution_timeout, + context, argument_schema) + return await self.async_recv_reply(msg_id, raise_client_side_exception=raise_client_side_exception, + deserialize=deserialize_reply) -class MessageMappedZMQClientPool(AsyncZMQClientPool): +class MessageMappedZMQClientPool(BaseZMQClient): """ Pool of clients where message ID can track the replies irrespective of order of arrival. """ - def __init__(self, instance_names: List[str], identity: str, poll_timeout = 25, protocol : str = 'IPC', - client_type = HTTP_SERVER, **serializer) -> None: - super().__init__(instance_names, identity, client_type = client_type, protocol = protocol, **serializer) - self.event_pool = EventPool(len(instance_names)) - self.message_to_event_map : Dict[bytes, asyncio.Event] = dict() - self.shared_message_map = dict() + def __init__(self, server_instance_names: typing.List[str], identity: str, client_type = HTTP_SERVER, + handshake : bool = True, poll_timeout = 25, protocol : str = 'IPC', + context : zmq.asyncio.Context = None, deserialize_server_messages : bool= True, + **kwargs) -> None: + super().__init__(server_instance_name='pool', client_type=client_type, **kwargs) + self.identity = identity + self.logger = kwargs.get('logger', get_default_logger('{}|{}'.format(identity, 'pooled'), logging.INFO)) + # this class does not call create_socket method + self.context = context or zmq.asyncio.Context() + self.pool = dict() # type: typing.Dict[str, AsyncZMQClient] + self.poller = zmq.asyncio.Poller() + for instance_name in server_instance_names: + client = AsyncZMQClient(server_instance_name=instance_name, + identity=identity, client_type=client_type, handshake=handshake, protocol=protocol, + context=self.context, rpc_serializer=self.rpc_serializer, 
json_serializer=self.json_serializer, + logger=self.logger) + client._monitor_socket = client.socket.get_monitor_socket() + self.poller.register(client._monitor_socket, zmq.POLLIN) + self.pool[instance_name] = client + # Both the client pool as well as the individual client get their serializers and client_types + # This is required to implement pool level sending and receiving messages like polling of pool of sockets + self.event_pool = AsyncioEventPool(len(server_instance_names)) + self.events_map = dict() # type: typing.Dict[bytes, asyncio.Event] + self.message_map = dict() + self.cancelled_messages = [] self.poll_timeout = poll_timeout self.stop_poll = False - self.cancelled_messages = [] + self._deserialize_server_messages = deserialize_server_messages + + def create_new(self, server_instance_name : str, protocol : str = 'IPC') -> None: + """ + Create new server with specified protocol. other arguments are taken from pool specifications. + + Parameters + ---------- + instance_name: str + instance name of server + protocol: str + protocol implemented by ZMQ server + """ + if server_instance_name not in self.pool.keys(): + client = AsyncZMQClient(server_instance_name=server_instance_name, + identity=self.identity, client_type=self.client_type, handshake=True, protocol=protocol, + context=self.context, rpc_serializer=self.rpc_serializer, json_serializer=self.json_serializer, + logger=self.logger) + client._monitor_socket = client.socket.get_monitor_socket() + self.poller.register(client._monitor_socket, zmq.POLLIN) + self.pool[server_instance_name] = client + else: + raise ValueError(f"client for instance name '{server_instance_name}' already present in pool") + @property def poll_timeout(self) -> int: + """ + socket polling timeout in milliseconds greater than 0. + """ return self._poll_timeout @poll_timeout.setter @@ -801,7 +1759,12 @@ def poll_timeout(self, value) -> None: raise ValueError("polling period must be an integer greater than 0, not {}. 
Value is considered in milliseconds".format(value)) self._poll_timeout = value + async def poll(self) -> None: + """ + Poll for replies from server. Since the client is message mapped, this method should be independently started + in the event loop. Sending message and retrieving a message mapped is still carried out by other methods. + """ self.logger.info("client polling started for sockets for {}".format(list(self.pool.keys()))) self.stop_poll = False event_loop = asyncio.get_event_loop() @@ -810,33 +1773,60 @@ async def poll(self) -> None: for socket, _ in sockets: while True: try: - reply = self.parse_server_message(await socket.recv_multipart(zmq.NOBLOCK)) + reply = self.parse_server_message(await socket.recv_multipart(zmq.NOBLOCK), + deserialize=self._deserialize_server_messages) + if not reply: + continue except zmq.Again: + # errors in handle_message should reach the client. break - """ - errors in handle_message should reach the client. - """ - if reply: - address, _, server_type, message_type, message_id, response = reply - self.logger.debug("received reply from server '{}' with message ID {}".format(address, message_id)) + except ConnectionAbortedError: + for client in self.pool.values(): + if client.socket.get_monitor_socket() == socket: + self.poller.unregister(client.socket) # leave the monitor in the pool + client.handshake(timeout=None) + self.logger.error(f"{client.instance_name} disconnected." 
+ + " Unregistering from poller temporarily until server comes back.") + break + else: + address, _, server_type, message_type, message_id, data, encoded_data = reply + self.logger.debug(f"received reply from server '{address}' with message ID '{message_id}'") if message_id in self.cancelled_messages: self.cancelled_messages.remove(message_id) - self.logger.debug(f'message_id {message_id} cancelled') + self.logger.debug(f"message_id '{message_id}' cancelled") continue - try: - event = self.message_to_event_map[message_id] # type: ignore - except KeyError: - event_loop.call_soon(lambda: asyncio.create_task(self.resolve_reply(message_id, response))) - else: - self.shared_message_map[message_id] = response + event = self.events_map.get(message_id, None) + if event: + if len(encoded_data) > 0: + self.message_map[message_id] = encoded_data + else: + self.message_map[message_id] = data event.set() + else: + if len(encoded_data) > 0: + invalid_event_task = asyncio.create_task(self._resolve_reply(message_id, encoded_data)) + else: + invalid_event_task = asyncio.create_task(self._resolve_reply(message_id, data)) + event_loop.call_soon(lambda: invalid_event_task) + - async def resolve_reply(self, message_id, return_value): + async def _resolve_reply(self, message_id : bytes, data : typing.Any) -> None: + """ + This method is called when there is an asyncio Event not available for a message ID. This can happen only + when the server replied before the client created a asyncio.Event object. check ``async_execute()`` for details. 
+ + Parameters + ---------- + message_id: bytes + the message for which the event was not created + data: bytes + the data given by the server which needs to mapped to the message + """ max_number_of_retries = 100 for i in range(max_number_of_retries): - await asyncio.sleep(0.1) + await asyncio.sleep(0.025) try: - event = self.message_to_event_map[message_id] # type: ignore + event = self.events_map[message_id] except KeyError: if message_id in self.cancelled_messages: # Only for safety, likely should never reach here @@ -845,105 +1835,237 @@ async def resolve_reply(self, message_id, return_value): return if i >= max_number_of_retries - 1: self.logger.error("unknown message id {} without corresponding event object".format(message_id)) - print(return_value) return else: - self.shared_message_map[message_id] = return_value + self.message_map[message_id] = data event.set() break - async def async_send_instruction(self, instance_name : str, instruction : str, arguments : Dict[str, Any] = EMPTY_DICT, - context : Dict[str, Any] = EMPTY_DICT) -> bytes: - message_id = await self.pool[instance_name].async_send_instruction(instruction, arguments, context) + def assert_client_ready(self, client : AsyncZMQClient): + if not client._handshake_event.is_set(): + raise ConnectionAbortedError(f"{client.instance_name} is currently not alive") + if not client.socket in self.poller._map: + raise ConnectionError("handshake complete, server is alive but client socket not yet ready to be polled." + + "Application using MessageMappedClientPool should register the socket manually for polling." 
+ + "If using hololinked.server.HTTPServer, socket is waiting until HTTP Server updates its " + "routing logic as the server has just now come alive, please try again soon.") + + async def async_send_instruction(self, instance_name : str, instruction : str, + arguments : typing.Dict[str, typing.Any] = EMPTY_DICT, invokation_timeout : typing.Optional[float] = 3, + execution_timeout : typing.Optional[float] = None, + context : typing.Dict[str, typing.Any] = EMPTY_DICT, argument_schema : typing.Optional[JSON] = None) -> bytes: + """ + Send instruction to server with instance name. Replies are automatically polled & to be retrieved using + ``async_recv_reply()`` + + Parameters + ---------- + instance_name: str + instance name of the server + instruction: str + unique str identifying a server side or ``Thing`` resource. These values corresponding + to automatically extracted name from the object name or the URL_path prepended with the instance name. + arguments: Dict[str, Any] + if the instruction invokes a method, arguments of that method. 
+ context: Dict[str, Any]
+ see execution context definitions
+
+ Returns
+ -------
+ message_id: bytes
+ created message ID
+ """
+ self.assert_client_ready(self.pool[instance_name])
+ message_id = await self.pool[instance_name].async_send_instruction(instruction=instruction, arguments=arguments,
+ invokation_timeout=invokation_timeout, execution_timeout=execution_timeout,
+ context=context, argument_schema=argument_schema)
 event = self.event_pool.pop()
- self.message_to_event_map[message_id] = event
+ self.events_map[message_id] = event
 return message_id
- async def async_recv_reply(self, message_id : bytes, plain_reply : bool = False, raise_client_side_exception = False,
- timeout : typing.Optional[int] = 3) -> Dict[str, Any]:
- event = self.message_to_event_map[message_id]
+ async def async_recv_reply(self, instance_name : str, message_id : bytes, raise_client_side_exception = False,
+ timeout : typing.Optional[float] = None) -> typing.Dict[str, typing.Any]:
+ """
+ Receive reply for specified message ID.
+
+ Parameters
+ ----------
+ message_id: bytes
+ the message id for which reply needs to be fetched
+ raise_client_side_exception: bool, default False
+ raise exceptions from server on client side
+ timeout: float,
+ client side timeout, not the same as timeout passed to server, recommended to be None in general cases.
+ Server side timeouts ensure start of execution of instructions within specified timeouts and
+ drop execution altogether if timeout occurred. Client side timeouts only wait for the message to come within
+ the timeout, but do not guarantee non-execution.
+
+ Returns
+ -------
+ reply: dict, Any
+ dictionary when plain reply is False, any value returned from execution on the server side if plain reply is
+ True.
+ + Raises + ------ + ValueError: + if supplied message id is not valid + TimeoutError: + if timeout is not None and reply did not arrive + """ try: - await asyncio.wait_for(event.wait(), timeout if (timeout and timeout > 0) else None) - except TimeoutError: - self.cancelled_messages.append(message_id) - self.logger.debug(f'message_id {message_id} added to list of cancelled messages') - else: - self.message_to_event_map.pop(message_id) - reply = self.shared_message_map.pop(message_id) - self.event_pool.completed(event) - if not plain_reply and reply.get('exception', None) is not None and raise_client_side_exception: - exc_info = reply['exception'] - raise Exception("traceback : {},\nmessage : {},\ntype : {}".format ( - '\n'.join(exc_info["traceback"]), exc_info['message'], exc_info["type"])) - return reply - raise TimeoutError(f"Execution not completed within {timeout} seconds") - - async def async_execute(self, instance_name : str, instruction : str, arguments : Dict[str, Any] = EMPTY_DICT, - context : Dict[str, Any] = EMPTY_DICT, raise_client_side_exception = False, - timeout : typing.Optional[int] = 3) -> Dict[str, Any]: - message_id = await self.async_send_instruction(instance_name, instruction, arguments, context) - return await self.async_recv_reply(message_id, context.get('plain_reply', False), raise_client_side_exception, - timeout) + event = self.events_map[message_id] + except KeyError: + raise ValueError(f"message id {message_id} unknown.") from None + while True: + try: + await asyncio.wait_for(event.wait(), timeout if (timeout and timeout > 0) else 5) + # default 5 seconds because we want to check if server is also dead + if event.is_set(): + break + self.assert_client_ready(self.pool[instance_name]) + except TimeoutError: + if timeout is None: + continue + self.cancelled_messages.append(message_id) + self.logger.debug(f'message_id {message_id} added to list of cancelled messages') + raise TimeoutError(f"Execution not completed within {timeout} 
seconds") from None + self.events_map.pop(message_id) + self.event_pool.completed(event) + reply = self.message_map.pop(message_id) + if raise_client_side_exception and reply.get('exception', None) is not None: + self.raise_local_exception(reply['exception']) + return reply + + async def async_execute(self, instance_name : str, instruction : str, arguments : typing.Dict[str, typing.Any] = EMPTY_DICT, + *, context : typing.Dict[str, typing.Any] = EMPTY_DICT, raise_client_side_exception = False, + invokation_timeout : typing.Optional[float] = 5, execution_timeout : typing.Optional[float] = None, + argument_schema : typing.Optional[JSON] = None) -> typing.Dict[str, typing.Any]: + """ + sends message and receives reply. + + Parameters + ---------- + instance_name: str + instance name of the server + instruction: str + unique str identifying a server side or ``Thing`` resource. These values corresponding + to automatically extracted name from the object name or the URL_path prepended with the instance name. + arguments: Dict[str, Any] + if the instruction invokes a method, arguments of that method. + context: Dict[str, Any] + see execution context definitions + raise_client_side_exceptions: bool, default False + raise exceptions from server on client side + invokation_timeout: float, default 5 + server side timeout + execution_timeout: float, default None + client side timeout, not the same as timeout passed to server, recommended to be None in general cases. + Server side timeouts ensure start of execution of instructions within specified timeouts and + drops execution altogether if timeout occured. Client side timeouts only wait for message to come within + the timeout, but do not gaurantee non-execution. 
+ """ + message_id = await self.async_send_instruction(instance_name=instance_name, instruction=instruction, + arguments=arguments, invokation_timeout=invokation_timeout, + execution_timeout=execution_timeout, context=context, + argument_schema=argument_schema) + return await self.async_recv_reply(instance_name=instance_name, message_id=message_id, + raise_client_side_exception=raise_client_side_exception, timeout=None) def start_polling(self) -> None: + """ + register the server message polling loop in the asyncio event loop. + """ event_loop = asyncio.get_event_loop() event_loop.call_soon(lambda: asyncio.create_task(self.poll())) async def stop_polling(self): + """ + stop polling for replies from server + """ self.stop_poll = True - async def ping_all_servers(self): - replies : List[Dict[str, Any]] = await asyncio.gather(*[ - self.async_execute(instance_name, '/ping') for instance_name in self.pool.keys()]) - sorted_reply = dict() - for reply, instance_name in zip(replies, self.pool.keys()): - sorted_reply[instance_name] = reply.get("returnValue", False if reply.get("exception", None) is None else True) # type: ignore - return sorted_reply - - def organised_gathered_replies(self, instance_names : List[str], gathered_replies : List[Any], context : Dict[str, Any] = EMPTY_DICT): - """ - First thing tomorrow - """ - plain_reply = context.pop('plain_reply', False) - if not plain_reply: - replies = dict( - returnValue = dict(), - state = dict() - ) - for instance_name, reply in zip(instance_names, gathered_replies): - replies["state"].update(reply["state"]) - replies["returnValue"][instance_name] = reply.get("returnValue", reply.get("exception", None)) - else: - replies = {} - for instance_name, reply in zip(instance_names, gathered_replies): - replies[instance_name] = reply + async def async_execute_in_all(self, instruction : str, instance_names : typing.Optional[typing.List[str]] = None, + arguments : typing.Dict[str, typing.Any] = EMPTY_DICT, context : 
typing.Dict[str, typing.Any] = EMPTY_DICT, + invokation_timeout : typing.Optional[float] = 5, execution_timeout : typing.Optional[float] = None, + ) -> typing.Dict[str, typing.Any]: + """ + execute a specified instruction in all Thing including eventloops + """ + if not instance_names: + instance_names = self.pool.keys() + gathered_replies = await asyncio.gather(*[ + self.async_execute(instance_name, instruction, arguments, + context=context, raise_client_side_exception=False, + invokation_timeout=invokation_timeout, execution_timeout=execution_timeout) + for instance_name in instance_names]) + replies = dict() + for instance_name, reply in zip(instance_names, gathered_replies): + replies[instance_name] = reply return replies + + async def async_execute_in_all_things(self, instruction : str, arguments : typing.Dict[str, typing.Any] = EMPTY_DICT, + context : typing.Dict[str, typing.Any] = EMPTY_DICT, + invokation_timeout : typing.Optional[float] = 5, execution_timeout : typing.Optional[float] = None, + ) -> typing.Dict[str, typing.Any]: # raise_client_side_exception = False + """ + execute the same instruction in all Things, eventloops are excluded. + """ + return await self.async_execute_in_all(instruction=instruction, + instance_names=[instance_name for instance_name, client in self.pool.items() if client.server_type == ServerTypes.THING], + arguments=arguments, context=context, invokation_timeout=invokation_timeout, + execution_timeout=execution_timeout) + + async def async_execute_in_all_eventloops(self, instruction : str, arguments : typing.Dict[str, typing.Any] = EMPTY_DICT, + context : typing.Dict[str, typing.Any] = EMPTY_DICT, + invokation_timeout : typing.Optional[float] = 5, execution_timeout : typing.Optional[float] = None, + ) -> typing.Dict[str, typing.Any]: # raise_client_side_exception = False + """ + execute the same instruction in all eventloops. 
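`async_execute_in_all()` above gathers one coroutine per instance and then keys each reply by the instance it came from. A minimal sketch of that fan-out pattern, with `fake_execute` standing in for `async_execute()`:

```python
import asyncio

async def fake_execute(instance_name: str, instruction: str) -> dict:
    # stand-in for async_execute(): pretend each server answers the instruction
    await asyncio.sleep(0)
    return {"returnValue": f"{instance_name}:{instruction}"}

async def execute_in_all(instruction: str, instance_names: list) -> dict:
    # same shape as async_execute_in_all(): run all requests concurrently,
    # then key each reply by the instance name it came from
    gathered = await asyncio.gather(*[
        fake_execute(name, instruction) for name in instance_names])
    return {name: reply for name, reply in zip(instance_names, gathered)}

replies = asyncio.run(execute_in_all("/ping", ["dev1", "dev2"]))
print(replies)  # {'dev1': {'returnValue': 'dev1:/ping'}, 'dev2': {'returnValue': 'dev2:/ping'}}
```

`asyncio.gather` preserves argument order, so zipping the results back against `instance_names` is safe; that is the property the pool relies on.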
+ """ + return await self.async_execute_in_all(instruction=instruction, + instance_names=[instance_name for instance_name, client in self.pool.items() if client.server_type == ServerTypes.EVENTLOOP], + arguments=arguments, context=context, invokation_timeout=invokation_timeout, + execution_timeout=execution_timeout) - async def async_execute_in_all(self, instruction : str, arguments : Dict[str, Any] = EMPTY_DICT, - context : Dict[str, Any] = EMPTY_DICT, raise_client_side_exception = False) -> Dict[str, Any]: - instance_names = self.pool.keys() - gathered_replies = await asyncio.gather(*[ - self.async_execute(instance_name, instruction, arguments, context, raise_client_side_exception) for instance_name in instance_names]) - return self.organised_gathered_replies(instance_names, gathered_replies, context) + async def ping_all_servers(self): + """ + ping all servers connected to the client pool, calls ping() on Thing + """ + return await self.async_execute_in_all(instruction=CommonRPC.PING) + + def __contains__(self, name : str) -> bool: + return name in self.pool + + def __getitem__(self, key) ->AsyncZMQClient: + return self.pool[key] - async def async_execute_in_all_remote_objects(self, instruction : str, arguments : Dict[str, Any] = EMPTY_DICT, - context : Dict[str, Any] = EMPTY_DICT, raise_client_side_exception = False) -> Dict[str, Any]: - instance_names = [instance_name for instance_name, client in self.pool.items() if client.server_type == ServerTypes.USER_REMOTE_OBJECT] - gathered_replies = await asyncio.gather(*[ - self.async_execute(instance_name, instruction, arguments, context, raise_client_side_exception) for instance_name in instance_names]) - return self.organised_gathered_replies(instance_names, gathered_replies, context) + def __iter__(self) -> typing.Iterator[AsyncZMQClient]: + return iter(self.pool.values()) - async def async_execute_in_all_eventloops(self, instruction : str, arguments : Dict[str, Any] = EMPTY_DICT, - context : Dict[str, Any] = 
EMPTY_DICT, raise_client_side_exception = False) -> Dict[str, Any]: - instance_names = [instance_name for instance_name, client in self.pool.items() if client.server_type == ServerTypes.EVENTLOOP] - gathered_replies = await asyncio.gather(*[ - self.async_execute(instance_name, instruction, arguments, context, raise_client_side_exception) for instance_name in instance_names]) - return self.organised_gathered_replies(instance_names, gathered_replies, context) + def exit(self) -> None: + BaseZMQ.exit(self) + for client in self.pool.values(): + self.poller.unregister(client.socket) + self.poller.unregister(client.socket.get_monitor_socket()) + client.exit() + self.logger.info("all client socket unregistered from pool for '{}'".format(self.__class__)) + try: + self.context.term() + self.logger.info("context terminated for '{}'".format(self.__class__)) + except Exception as ex: + self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context" + + "'{}'. 
Exception message : {}".format(self.identity, str(ex))) -class EventPool: +class AsyncioEventPool: """ creates a pool of asyncio Events to be used as a synchronisation object for MessageMappedClientPool + + Parameters + ---------- + initial_number_of_events: int + initial pool size of events """ def __init__(self, initial_number_of_events : int) -> None: @@ -951,6 +2073,9 @@ def __init__(self, initial_number_of_events : int) -> None: self.size = initial_number_of_events def pop(self) -> asyncio.Event: + """ + pop an event, new one is created if nothing left in pool + """ try: event = self.pool.pop(0) except IndexError: @@ -960,107 +2085,78 @@ def pop(self) -> asyncio.Event: return event def completed(self, event : asyncio.Event) -> None: + """ + put an event back into the pool + """ self.pool.append(event) -class Event: - - def __init__(self, name : str, URL_path : typing.Optional[str] = None) -> None: - self.name = name - # self.name_bytes = bytes(name, encoding = 'utf-8') - self.URL_path = URL_path or '/' + name - self._full_URL_path_prefix = None - self._owner : typing.Optional[Parameterized] = None +class EventPublisher(BaseZMQServer, BaseSyncZMQ): - @property - def owner(self): - return self._owner - - @property - def full_URL_path_prefix(self): - return self._full_URL_path_prefix + def __init__(self, instance_name : str, protocol : str, + context : typing.Union[zmq.Context, None] = None, **kwargs) -> None: + super().__init__(instance_name=instance_name, server_type=ServerTypes.THING.value, + **kwargs) + self.create_socket(identity=f'{instance_name}/event-publisher', bind=True, context=context, + protocol=protocol, socket_type=zmq.PUB, **kwargs) + self.logger.info(f"created event publishing socket at {self.socket_address}") + self.events = set() # type: typing.Set[Event] + self.event_ids = set() # type: typing.Set[bytes] - @full_URL_path_prefix.setter - def full_URL_path_prefix(self, value : str): - self._full_URL_path_prefix = value - - @property - def 
publisher(self) -> "EventPublisher": - return self._publisher - - @publisher.setter - def publisher(self, value : "EventPublisher") -> None: - if not hasattr(self, '_publisher'): - self._publisher = value - self._event_unique_str = bytes(f"{self.full_URL_path_prefix}/event{self.URL_path}", encoding='utf-8') - self._publisher.register_event(self) - else: - raise AttributeError("cannot reassign publisher attribute of event {}".format(self.name)) - - def push(self, data : typing.Any = None, serialize : bool = True): - self.publisher.publish_event(self._event_unique_str, data, serialize) - - - -class CriticalEvent(Event): - - def __init__(self, name : str, message_broker : BaseZMQServer, acknowledgement_timeout : float = 3, - URL_path : typing.Optional[str] = None) -> None: - super().__init__(name, URL_path) - self.message_broker = message_broker - self.aknowledgement_timeout = acknowledgement_timeout - - def push(self, data : typing.Any = None): - super().push(data) - self.message_broker.receive_acknowledgement() - - - -class EventPublisher(BaseZMQServer): + def register(self, event : "Event") -> None: + """ + register event with a specific (unique) name - def __init__(self, identity : str, context : Union[zmq.Context, None] = None, **serializer) -> None: - super().__init__(server_type = ServerTypes.UNKNOWN_TYPE, **serializer) - self.context = context or zmq.Context() - self.identity = identity - self.socket = self.context.socket(zmq.PUB) - for i in range(1000, 65535): - try: - self.socket_address = "tcp://127.0.0.1:{}".format(i) - self.socket.bind(self.socket_address) - except zmq.error.ZMQError as E: - if E.strerror.startswith('Address in use'): - pass - else: - print("Following error while atttempting to bind to socket address : {}".format(self.socket_address)) - raise - else: - self.logger = self.get_logger(identity, self.socket_address, logging.DEBUG, self.__class__.__name__) - self.logger.info("created event publishing socket at {}".format(self.socket_address)) 
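The `AsyncioEventPool` introduced earlier reduces to a few lines. This sketch additionally clears a recycled event before handing it out — an assumption about intended use (a returned event may still be set), not something the original does:

```python
import asyncio

class AsyncioEventPool:
    # minimal re-implementation of the pool above: recycle asyncio.Event
    # objects instead of allocating one per in-flight message
    def __init__(self, initial_number_of_events: int) -> None:
        self.pool = [asyncio.Event() for _ in range(initial_number_of_events)]
        self.size = initial_number_of_events

    def pop(self) -> asyncio.Event:
        # pop an event, growing the pool when it runs dry
        try:
            event = self.pool.pop(0)
        except IndexError:
            self.size += 1
            event = asyncio.Event()
        event.clear()   # assumption: a recycled event should start unset
        return event

    def completed(self, event: asyncio.Event) -> None:
        # put an event back for reuse
        self.pool.append(event)

pool = AsyncioEventPool(1)
first, second = pool.pop(), pool.pop()   # second pop grows the pool
pool.completed(first)
print(pool.size, len(pool.pool))  # 2 1
```

The pool never shrinks: `size` only grows, which matches the original implementation and keeps allocation bounded by the peak number of concurrent requests.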
- break - self.events : Set[Event] = set() - self.event_ids : Set[bytes] = set() - - def register_event(self, event : Event) -> None: - # unique_str_bytes = bytes(unique_str, encoding = 'utf-8') - if event._event_unique_str in self.events: - raise AttributeError(wrap_text( - """event {} already found in list of events, please use another name. - Also, Remotesubobject and RemoteObject cannot share event names.""".format(event.name)) - ) - self.event_ids.add(event._event_unique_str) + Parameters + ---------- + event: ``Event`` + ``Event`` object that needs to be registered. Events created at ``__init__()`` of Thing are + automatically registered. + """ + if event._unique_identifier in self.events and event not in self.events: + raise AttributeError(f"event {event.name} already found in list of events, please use another name.") + self.event_ids.add(event._unique_identifier) self.events.add(event) self.logger.info("registered event '{}' serving at PUB socket with address : {}".format(event.name, self.socket_address)) - def publish_event(self, unique_str : bytes, data : Any, serialize : bool = True) -> None: - if unique_str in self.event_ids: - self.socket.send_multipart([unique_str, self.json_serializer.dumps(data) if serialize else data]) + def publish(self, unique_identifier : bytes, data : typing.Any, *, rpc_clients : bool = True, + http_clients : bool = True, serialize : bool = True) -> None: + """ + publish an event with given unique name. 
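`register()` above is meant to refuse two different events sharing one unique identifier, while re-registering the same object stays harmless. Note the original tests the identifier against the set of *events*; this sketch tests it against the set of *identifiers*, which appears to be the intent (`SimpleEvent` and `EventRegistry` are hypothetical stand-ins, no ZMQ involved):

```python
class SimpleEvent:
    # hypothetical stand-in for the Event objects being registered
    def __init__(self, name: str, unique_identifier: bytes) -> None:
        self.name = name
        self._unique_identifier = unique_identifier

class EventRegistry:
    # sketch of EventPublisher.register(): an identifier may be claimed
    # only once, and only by the same event object
    def __init__(self) -> None:
        self.events = set()
        self.event_ids = set()

    def register(self, event: SimpleEvent) -> None:
        if event._unique_identifier in self.event_ids and event not in self.events:
            raise ValueError(f"event {event.name} already found in list of events, please use another name.")
        self.event_ids.add(event._unique_identifier)
        self.events.add(event)

registry = EventRegistry()
first = SimpleEvent("data-ready", b'/dev/event/data-ready')
registry.register(first)
registry.register(first)      # re-registering the same object is harmless
try:
    registry.register(SimpleEvent("clash", b'/dev/event/data-ready'))
except ValueError as ex:
    print(ex)   # event clash already found in list of events, please use another name.
```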
+ + Parameters + ---------- + unique_identifier: bytes + unique identifier of the event + data: Any + payload of the event + serialize: bool, default True + serialize the payload before pushing, set to False when supplying raw bytes + rpc_clients: bool, default True + pushes event to RPC clients + http_clients: bool, default True + pushed event to HTTP clients + """ + if unique_identifier in self.event_ids: + if serialize: + if isinstance(self.rpc_serializer , JSONSerializer): + self.socket.send_multipart([unique_identifier, self.json_serializer.dumps(data)]) + return + if rpc_clients: + self.socket.send_multipart([unique_identifier, self.rpc_serializer.dumps(data)]) + if http_clients: + self.socket.send_multipart([unique_identifier, self.json_serializer.dumps(data)]) + else: + self.socket.send_multipart([unique_identifier, data]) else: - raise AttributeError("event name {} not yet registered with socket {}".format(unique_str, self.socket_address)) + raise AttributeError("event name {} not yet registered with socket {}".format(unique_identifier, self.socket_address)) def exit(self): + if not hasattr(self, 'logger'): + self.logger = get_default_logger('{}|{}'.format(self.__class__.__name__, uuid4())) try: - self.socket.close() + self.socket.close(0) self.logger.info("terminated event publishing socket with address '{}'".format(self.socket_address)) except Exception as E: self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context at address '{}'. 
Exception message : {}".format( @@ -1073,40 +2169,252 @@ def exit(self): self.socket_address, str(E))) -class EventConsumer(BaseZMQClient): - def __init__(self, event_URL : str, socket_address : str, identity : str, client_type = HTTP_SERVER, **kwargs) -> None: - super().__init__(server_address=b'unknown', server_instance_name='unkown', client_type=client_type, **kwargs) - # Prepare our context and publisher - self.event_URL = bytes(event_URL, encoding = 'utf-8') - self.socket_address = socket_address - self.identity = identity - self.context = zmq.asyncio.Context() - self.socket = self.context.socket(zmq.SUB) - self.socket.connect(socket_address) - self.socket.setsockopt(zmq.SUBSCRIBE, self.event_URL) - self.logger = self.get_logger(identity, self.socket_address, logging.DEBUG, self.__class__.__name__) - self.logger.info("connected event consuming socket to address {}".format(self.socket_address)) - - async def receive_event(self, deserialize = False): - _, contents = await self.socket.recv_multipart() - if not deserialize: - return contents - return self.json_serializer.loads(contents) - +class BaseEventConsumer(BaseZMQClient): + """ + Consumes events published at PUB sockets using SUB socket. 
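The `publish()` method shown earlier branches on serializer type: when the RPC serializer is itself JSON, one frame serves both client groups, otherwise each group gets its own serialization. A framework-free sketch of that branching, returning the would-be multipart frames instead of sending them (a string flag stands in for the serializer object, `pickle` for the RPC serializer — both assumptions for illustration):

```python
import json
import pickle

def publish(unique_identifier: bytes, data, *, rpc_serializer="json",
            rpc_clients=True, http_clients=True, serialize=True) -> list:
    # sketch of EventPublisher.publish(): collect the multipart frames that
    # would go out on the PUB socket
    frames = []
    if not serialize:
        frames.append([unique_identifier, data])   # raw bytes pass-through
        return frames
    if rpc_serializer == "json":
        # RPC and HTTP clients share the JSON wire format: serialize once
        frames.append([unique_identifier, json.dumps(data).encode()])
        return frames
    if rpc_clients:
        frames.append([unique_identifier, pickle.dumps(data)])
    if http_clients:
        frames.append([unique_identifier, json.dumps(data).encode()])
    return frames

assert len(publish(b'/dev/event/x', {"value": 1})) == 1                           # shared JSON path
assert len(publish(b'/dev/event/x', {"value": 1}, rpc_serializer="pickle")) == 2  # one frame per client group
```

Skipping the duplicate send when both serializers produce JSON is the same optimisation the diff implements with the `isinstance(self.rpc_serializer, JSONSerializer)` check.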
+ + Parameters + ---------- + unique_identifier: str + identifier of the event registered at the PUB socket + socket_address: str + socket address of the event publisher (``EventPublisher``) + identity: str + unique identity for the consumer + client_type: bytes + b'HTTP_SERVER' or b'PROXY' + **kwargs: + protocol: str + TCP, IPC or INPROC + json_serializer: JSONSerializer + json serializer instance for HTTP_SERVER client type + rpc_serializer: BaseSerializer + serializer for RPC clients + server_instance_name: str + instance name of the Thing publishing the event + """ + + def __init__(self, unique_identifier : str, socket_address : str, + identity : str, client_type = b'HTTP_SERVER', + **kwargs) -> None: + self._terminate_context : bool + protocol = socket_address.split('://', 1)[0].upper() + super().__init__(server_instance_name=kwargs.get('server_instance_name', None), + client_type=client_type, **kwargs) + self.create_socket(identity=identity, bind=False, context=self.context, + socket_address=socket_address, socket_type=zmq.SUB, protocol=protocol, **kwargs) + self.unique_identifier = bytes(unique_identifier, encoding='utf-8') + self.socket.setsockopt(zmq.SUBSCRIBE, self.unique_identifier) + # pair sockets cannot be polled unforunately, so we use router + self.interruptor = self.context.socket(zmq.PAIR) + self.interruptor.setsockopt_string(zmq.IDENTITY, f'interrupting-server') + self.interruptor.bind(f'inproc://{self.identity}/interruption') + self.interrupting_peer = self.context.socket(zmq.PAIR) + self.interrupting_peer.setsockopt_string(zmq.IDENTITY, f'interrupting-client') + self.interrupting_peer.connect(f'inproc://{self.identity}/interruption') + self.poller.register(self.socket, zmq.POLLIN) + self.poller.register(self.interruptor, zmq.POLLIN) + + def exit(self): + if not hasattr(self, 'logger'): + self.logger = get_default_logger('{}|{}'.format(self.__class__.__name__, uuid4())) try: - self.socket.close() - self.logger.info("terminated event consuming 
socket with address '{}'".format(self.socket_address)) + self.poller.unregister(self.socket) + self.poller.unregister(self.interruptor) except Exception as E: - self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context at address '{}'. Exception message : {}".format( + self.logger.warn("could not properly terminate socket or attempted to terminate an already terminated socket of event consuming socket at address '{}'. Exception message : {}".format( self.socket_address, str(E))) + try: - self.context.term() - self.logger.info("terminated context of event consuming socket with address '{}'".format(self.socket_address)) + self.socket.close(0) + self.interruptor.close(0) + self.interrupting_peer.close(0) + self.logger.info("terminated event consuming socket with address '{}'".format(self.socket_address)) + except: + self.logger.warn("could not terminate sockets") + + try: + if self._terminate_context: + self.context.term() + self.logger.info("terminated context of event consuming socket with address '{}'".format(self.socket_address)) except Exception as E: - self.logger.warn("could not properly terminate socket or attempted to terminate an already terminated socket of event consuming socket at address '{}'. Exception message : {}".format( + self.logger.warn("could not properly terminate context or attempted to terminate an already terminated context at address '{}'. Exception message : {}".format( self.socket_address, str(E))) -__all__ = ['ServerTypes', 'AsyncZMQServer', 'AsyncPollingZMQServer', 'ZMQServerPool', 'SyncZMQClient', - 'AsyncZMQClient', 'AsyncZMQClientPool', 'MessageMappedZMQClientPool', 'Event', 'CriticalEvent'] \ No newline at end of file + + +class AsyncEventConsumer(BaseEventConsumer): + """ + Listens to events published at PUB sockets using SUB socket, use in async loops. 
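The `interruptor`/`interrupting_peer` PAIR sockets above exist so that `receive()` can be woken even when no event arrives on the SUB socket. The same pattern can be sketched with the standard library alone — a `socketpair` standing in for the inproc PAIR sockets and `selectors` for the ZMQ poller (all names hypothetical):

```python
import selectors
import socket
import threading

# data channel and interrupt channel, each a connected socket pair
data_rx, data_tx = socket.socketpair()
intr_rx, intr_tx = socket.socketpair()

selector = selectors.DefaultSelector()
selector.register(data_rx, selectors.EVENT_READ, "data")
selector.register(intr_rx, selectors.EVENT_READ, "interrupt")

def receive() -> str:
    # poll both channels; an interrupt takes priority over pending data,
    # mirroring how the consumer checks interrupting_peer first
    events = sorted(selector.select(timeout=1.0),
                    key=lambda kv: kv[0].data != "interrupt")
    for key, _ in events:
        return key.data
    return "timeout"

# another thread (here a timer) can unblock the receiver at any moment
threading.Timer(0.05, lambda: intr_tx.send(b"INTERRUPT")).start()
print(receive())  # interrupt
```

The comment in the diff notes why a second channel is needed at all: PAIR sockets cannot be woken from outside a poll, so the consumer gives the poller a dedicated readable socket whose only job is carrying the interrupt.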
+ + Parameters + ---------- + unique_identifier: str + identifier of the event registered at the PUB socket + socket_address: str + socket address of the event publisher (``EventPublisher``) + identity: str + unique identity for the consumer + client_type: bytes + b'HTTP_SERVER' or b'PROXY' + **kwargs: + protocol: str + TCP, IPC or INPROC + json_serializer: JSONSerializer + json serializer instance for HTTP_SERVER client type + rpc_serializer: BaseSerializer + serializer for RPC clients + server_instance_name: str + instance name of the Thing publishing the event + """ + + def __init__(self, unique_identifier : str, socket_address : str, identity : str, client_type = b'HTTP_SERVER', + context : typing.Optional[zmq.asyncio.Context] = None, **kwargs) -> None: + self._terminate_context = context is None + self.context = context or zmq.asyncio.Context() + self.poller = zmq.asyncio.Poller() + super().__init__(unique_identifier=unique_identifier, socket_address=socket_address, + identity=identity, client_type=client_type, **kwargs) + + async def receive(self, timeout : typing.Optional[float] = None, deserialize = True) -> typing.Union[bytes, typing.Any]: + """ + receive event with given timeout + + Parameters + ---------- + timeout: float, int, None + timeout in milliseconds, None for blocking + deserialize: bool, default True + deserialize the data, use False for HTTP server sent events to simply bypass + """ + contents = None + sockets = await self.poller.poll(timeout) + if len(sockets) > 1: + if sockets[0][0] == self.interrupting_peer: + sockets = [sockets[0]] + elif sockets[1][0] == self.interrupting_peer: + sockets = [sockets[1]] + for socket, _ in sockets: + try: + _, contents = await socket.recv_multipart(zmq.NOBLOCK) + except zmq.Again: + pass + if not deserialize or not contents: + return contents + if self.client_type == HTTP_SERVER: + return self.json_serializer.loads(contents) + elif self.client_type == PROXY: + return self.rpc_serializer.loads(contents) + else: + 

raise ValueError("invalid client type") + + async def interrupt(self): + """ + interrupts the event consumer and returns a 'INTERRUPT' string from the receive() method, + generally should be used for exiting this object + """ + if self.client_type == HTTP_SERVER: + message = [self.json_serializer.dumps(f'{self.identity}/interrupting-server'), + self.json_serializer.dumps("INTERRUPT")] + elif self.client_type == PROXY: + message = [self.rpc_serializer.dumps(f'{self.identity}/interrupting-server'), + self.rpc_serializer.dumps("INTERRUPT")] + await self.interrupting_peer.send_multipart(message) + + +class EventConsumer(BaseEventConsumer): + """ + Listens to events published at PUB sockets using SUB socket, listen in blocking fashion or use in threads. + + Parameters + ---------- + unique_identifier: str + identifier of the event registered at the PUB socket + socket_address: str + socket address of the event publisher (``EventPublisher``) + identity: str + unique identity for the consumer + client_type: bytes + b'HTTP_SERVER' or b'PROXY' + **kwargs: + protocol: str + TCP, IPC or INPROC + json_serializer: JSONSerializer + json serializer instance for HTTP_SERVER client type + rpc_serializer: BaseSerializer + serializer for RPC clients + server_instance_name: str + instance name of the Thing publishing the event + """ + + def __init__(self, unique_identifier : str, socket_address : str, identity : str, client_type = b'HTTP_SERVER', + context : typing.Optional[zmq.Context] = None, **kwargs) -> None: + self._terminate_context = context == None + self.context = context or zmq.Context() + self.poller = zmq.Poller() + super().__init__(unique_identifier=unique_identifier, socket_address=socket_address, + identity=identity, client_type=client_type, **kwargs) + + def receive(self, timeout : typing.Optional[float] = None, deserialize = True) -> typing.Union[bytes, typing.Any]: + """ + receive event with given timeout + + Parameters + ---------- + timeout: float, int, None + 
timeout in milliseconds, None for blocking + deserialize: bool, default True + deserialize the data, use False for HTTP server sent events to simply bypass + """ + contents = None + sockets = self.poller.poll(timeout) # typing.List[typing.Tuple[zmq.Socket, int]] + if len(sockets) > 1: + if sockets[0][0] == self.interrupting_peer: + sockets = [sockets[0]] + elif sockets[1][0] == self.interrupting_peer: + sockets = [sockets[1]] + for socket, _ in sockets: + try: + _, contents = socket.recv_multipart(zmq.NOBLOCK) + except zmq.Again: + pass + if not deserialize or not contents: + return contents + if self.client_type == HTTP_SERVER: + return self.json_serializer.loads(contents) + elif self.client_type == PROXY: + return self.rpc_serializer.loads(contents) + else: + raise ValueError("invalid client type for event") + + def interrupt(self): + """ + interrupts the event consumer and returns an 'INTERRUPT' string from the receive() method, + generally should be used for exiting this object + """ + if self.client_type == HTTP_SERVER: + message = [self.json_serializer.dumps(f'{self.identity}/interrupting-server'), + self.json_serializer.dumps("INTERRUPT")] + elif self.client_type == PROXY: + message = [self.rpc_serializer.dumps(f'{self.identity}/interrupting-server'), + self.rpc_serializer.dumps("INTERRUPT")] + self.interrupting_peer.send_multipart(message) + + + +from .events import Event + + +__all__ = [ + AsyncZMQServer.__name__, + AsyncPollingZMQServer.__name__, + ZMQServerPool.__name__, + RPCServer.__name__, + SyncZMQClient.__name__, + AsyncZMQClient.__name__, + MessageMappedZMQClientPool.__name__, + AsyncEventConsumer.__name__, + EventConsumer.__name__ +] \ No newline at end of file diff --git a/hololinked/system_host/__init__.py b/hololinked/system_host/__init__.py new file mode 100644 index 0000000..134ebb0 --- /dev/null +++ b/hololinked/system_host/__init__.py @@ -0,0 +1 @@ +from .server import create_system_host \ No newline at end of file diff --git 
a/hololinked/system_host/assets/default_host_settings.json b/hololinked/system_host/assets/default_host_settings.json new file mode 100644 index 0000000..13a662f --- /dev/null +++ b/hololinked/system_host/assets/default_host_settings.json @@ -0,0 +1,32 @@ +{ + "dashboards" : { + "deleteWithoutAsking" : true, + "showRecentlyUsed" : true, + "use" : true + }, + "login" : { + "footer" : "", + "footerLink" : "", + "displayFooter" : true + }, + "servers" : { + "allowHTTP" : false + }, + "remoteObjectViewer" : { + "console" : { + "stringifyOutput" : false, + "defaultMaxEntries" : 15, + "defaultWindowSize" : 500, + "defaultFontSize" : 16 + }, + "logViewer" : { + "defaultMaxEntries" : 10, + "defaultWindowSize" : 500, + "defaultFontSize" : 16, + "defaultInterval" : 2 + } + }, + "others" : { + "WOTTerminology" : false + } +} \ No newline at end of file diff --git a/hololinked/system_host/assets/hololinked-server-swagger-api b/hololinked/system_host/assets/hololinked-server-swagger-api new file mode 160000 index 0000000..96e9aa8 --- /dev/null +++ b/hololinked/system_host/assets/hololinked-server-swagger-api @@ -0,0 +1 @@ +Subproject commit 96e9aa86a7f60df6ae5b8dbf7f9d049b64b6464d diff --git a/hololinked/system_host/assets/swagger_ui_template.html b/hololinked/system_host/assets/swagger_ui_template.html new file mode 100644 index 0000000..fb7dc11 --- /dev/null +++ b/hololinked/system_host/assets/swagger_ui_template.html @@ -0,0 +1,22 @@ + + + + Swagger UI + + + +
+ + + + diff --git a/hololinked/system_host/assets/system_host_api.yml b/hololinked/system_host/assets/system_host_api.yml new file mode 100644 index 0000000..b3e30a5 --- /dev/null +++ b/hololinked/system_host/assets/system_host_api.yml @@ -0,0 +1,12 @@ +openapi: '3.0.2' +info: + title: API Title + version: '1.0' +servers: + - url: https://api.server.test/v1 +paths: + /test: + get: + responses: + '200': + description: OK diff --git a/hololinked/system_host/handlers.py b/hololinked/system_host/handlers.py new file mode 100644 index 0000000..cc25293 --- /dev/null +++ b/hololinked/system_host/handlers.py @@ -0,0 +1,569 @@ +import os +import socket +import uuid +import typing +import copy +from typing import List +from argon2 import PasswordHasher + +from sqlalchemy import select, delete, update +from sqlalchemy.orm import Session +from sqlalchemy.ext import asyncio as asyncio_ext +from sqlalchemy.exc import SQLAlchemyError +from tornado.web import RequestHandler, HTTPError, authenticated + + +from .models import * +from ..server.serializers import JSONSerializer +from ..server.config import global_config +from ..server.utils import get_IP_from_interface + + +def for_authenticated_user(method): + async def authenticated_method(self : "SystemHostHandler", *args, **kwargs) -> None: + if self.current_user_valid: + return await method(self, *args, **kwargs) + self.set_status(403) + self.set_custom_default_headers() + self.finish() + return + return authenticated_method + + +class SystemHostHandler(RequestHandler): + """ + Base Request Handler for all requests directed to system host server. Implements CORS & credential checks. + Use built in swagger-ui for request handler documentation for other paths. 
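The `for_authenticated_user` decorator defined above short-circuits a handler with a 403 when the session is invalid, so the wrapped method body never runs for anonymous requests. A synchronous, framework-free sketch of the same guard (`Handler` here is a hypothetical stand-in, not the tornado `RequestHandler`):

```python
import functools

def for_authenticated_user(method):
    # reject the request before the wrapped handler runs, as above
    @functools.wraps(method)
    def guarded(self, *args, **kwargs):
        if self.current_user_valid:
            return method(self, *args, **kwargs)
        self.status = 403
        return None
    return guarded

class Handler:
    # hypothetical request handler with just enough state for the guard
    def __init__(self, logged_in: bool) -> None:
        self.current_user_valid = logged_in
        self.status = None

    @for_authenticated_user
    def get(self) -> str:
        self.status = 200
        return "secret"

assert Handler(logged_in=True).get() == "secret"
rejected = Handler(logged_in=False)
assert rejected.get() is None and rejected.status == 403
```

In the diff the real decorator is `async` and also re-sends the CORS headers on the 403 path, since credentialed browsers refuse responses without them.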
+ """ + + def initialize(self, CORS : typing.List[str], disk_session : Session, mem_session : asyncio_ext.AsyncSession) -> None: + self.CORS = CORS + self.disk_session = disk_session + self.mem_session = mem_session + + @property + def headers_ok(self): + """ + check suitable values for headers before processing the request + """ + content_type = self.request.headers.get("Content-Type", None) + if content_type and content_type != "application/json": + self.set_status(400, "request body is not JSON.") + self.finish() + return False + return True + + @property + def current_user_valid(self) -> bool: + """ + check if current user is a valid user for accessing authenticated resources + """ + user = self.get_signed_cookie('user', None) + if user is None: + return False + with self.mem_session() as session: + session : Session + stmt = select(UserSession).filter_by(session_key=user) + data = session.execute(stmt) + data = data.scalars().all() + if len(data) == 0: + return False + if len(data) > 1: + raise HTTPError("session ID not unique, internal logic error - contact developers (https://github.com/VigneshVSV/hololinked/issues)") + data = data[0] + if (data.session_key == user and data.origin == self.request.headers.get("Origin") and + data.user_agent == self.request.headers.get("User-Agent") and data.remote_IP == self.request.remote_ip): + return True + + def get_current_user(self) -> typing.Any: + """ + gets the current logged in user - call after ``current_user_valid`` + """ + return self.get_signed_cookie('user', None) + + def set_access_control_allow_origin(self) -> None: + """ + For credential login, access control allow origin cannot be '*', + See: https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS#examples_of_access_control_scenarios + """ + origin = self.request.headers.get("Origin") + if origin is not None and (origin in self.CORS or origin + '/' in self.CORS): + self.set_header("Access-Control-Allow-Origin", origin) + + def 
set_access_control_allow_headers(self) -> None: + """ + For credential login, access control allow headers cannot be '*'. + See: https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS#examples_of_access_control_scenarios + """ + headers = ", ".join(self.request.headers.keys()) + if self.request.headers.get("Access-Control-Request-Headers", None): + headers += ", " + self.request.headers["Access-Control-Request-Headers"] + self.set_header("Access-Control-Allow-Headers", headers) + + def set_access_control_allow_methods(self) -> None: + """ + sets methods allowed so that options method can be reused in all children + """ + self.set_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS") + + def set_custom_default_headers(self) -> None: + """ + sets access control allow origin, allow headers and allow credentials + """ + self.set_access_control_allow_origin() + self.set_access_control_allow_headers() + self.set_header("Access-Control-Allow-Credentials", "true") + + async def options(self): + self.set_status(204) + self.set_access_control_allow_methods() + self.set_custom_default_headers() + self.finish() + + +class UsersHandler(SystemHostHandler): + + async def post(self): + self.set_status(200) + self.finish() + + async def get(self): + self.set_status(200) + self.finish() + + +class LoginHandler(SystemHostHandler): + """ + performs login and supplies a signed cookie for session + """ + async def post(self): + if not self.headers_ok: + return + try: + body = JSONSerializer.generic_loads(self.request.body) + email = body["email"] + password = body["password"] + rememberme = body["rememberme"] + async with self.disk_session() as session: + session : asyncio_ext.AsyncSession + stmt = select(LoginCredentials).filter_by(email=email) + data = await session.execute(stmt) + data = data.scalars().all() # type: typing.List[LoginCredentials] + if len(data) == 0: + self.set_status(404, "authentication failed - username not found") + else: + data = data[0] # type: 
LoginCredentials + ph = PasswordHasher(time_cost=global_config.PWD_HASHER_TIME_COST) + if ph.verify(data.password, password): + self.set_status(204, "logged in") + cookie_value = bytes(str(uuid.uuid4()), encoding = 'utf-8') + self.set_signed_cookie("user", cookie_value, httponly=True, + secure=True, samesite="strict", + expires_days=30 if rememberme else None) + with self.mem_session() as session: + session : Session + session.add(UserSession(email=email, session_key=cookie_value, + origin=self.request.headers.get("Origin"), + user_agent=self.request.headers.get("User-Agent"), + remote_IP=self.request.remote_ip + ) + ) + session.commit() + except Exception as ex: + ex_str = str(ex) + if ex_str.startswith("password does not match"): + ex_str = "username or password not correct" + self.set_status(500, f"authentication failed - {ex_str}") + self.set_custom_default_headers() + self.finish() + + def set_access_control_allow_methods(self) -> None: + self.set_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS") + + +class LogoutHandler(SystemHostHandler): + """ + Performs logout and clears the signed cookie of session + """ + + @for_authenticated_user + async def post(self): + if not self.headers_ok: + return + try: + if not self.current_user_valid: + self.set_status(409, "not a valid user to logout") + else: + user = self.get_current_user() + with self.mem_session() as session: + session : Session + stmt = delete(UserSession).filter_by(session_key=user) + result = session.execute(stmt) + if result.rowcount != 1: + self.set_status(500, "found user but could not logout") # never comes here + session.commit() + self.set_status(204, "logged out") + self.clear_cookie("user") + except Exception as ex: + self.set_status(500, f"logout failed - {str(ex)}") + self.set_custom_default_headers() + self.finish() + + def set_access_control_allow_methods(self) -> None: + self.set_header("Access-Control-Allow-Methods", "POST, OPTIONS") + + +class WhoAmIHandler(SystemHostHandler): 
+    """
+    return information about the sessions of the current logged-in user
+    """
+
+    @for_authenticated_user
+    async def get(self):
+        if not self.headers_ok:
+            return
+        try:
+            user = self.get_current_user()
+            with self.mem_session() as session:
+                session : Session
+                stmt = select(UserSession).filter_by(session_key=user)
+                data = session.execute(stmt)
+                sessions = data.scalars().all()
+            serialized_data = JSONSerializer.generic_dumps([dict(
+                    email=obj.email, origin=obj.origin, user_agent=obj.user_agent,
+                    remote_IP=obj.remote_IP) for obj in sessions])
+            self.set_status(200)
+            self.set_header("Content-Type", "application/json")
+            self.write(serialized_data)
+        except Exception as ex:
+            self.set_status(500, str(ex))
+        self.set_custom_default_headers()
+        self.finish()
+
+
+
+class AppSettingsHandler(SystemHostHandler):
+    """
+    get all app settings - endpoint /app-settings
+    """
+
+    @for_authenticated_user
+    async def get(self, field : typing.Optional[str] = None):
+        if not self.headers_ok:
+            return
+        try:
+            async with self.disk_session() as session:
+                session : asyncio_ext.AsyncSession
+                stmt = select(AppSettings)
+                data = await session.execute(stmt)
+                serialized_data = JSONSerializer.generic_dumps({
+                    result[AppSettings.__name__].field : result[AppSettings.__name__].value
+                        for result in data.mappings().all()})
+            self.set_status(200)
+            self.set_header("Content-Type", "application/json")
+            self.write(serialized_data)
+        except Exception as ex:
+            self.set_status(500, str(ex))
+        self.set_custom_default_headers()
+        self.finish()
+
+    async def options(self, name : typing.Optional[str] = None):
+        self.set_status(204)
+        self.set_header("Access-Control-Allow-Methods", "GET, OPTIONS")
+        self.set_custom_default_headers()
+        self.finish()
+
+
+class AppSettingHandler(SystemHostHandler):
+    """
+    add or edit a single app setting - endpoint /app-settings/{field}
+    """
+
+    @for_authenticated_user
+    async def post(self, field : str):
+        if not self.headers_ok:
+            return
+        try:
+            value = JSONSerializer.generic_loads(self.request.body)["value"]
+            async with self.disk_session() as session:
+                session : asyncio_ext.AsyncSession
+                session.add(AppSettings(
+                    field = field,
+                    value = {"value" : value}
+                ))
+                await session.commit()
+            self.set_status(200)
+        except SQLAlchemyError as ex:
+            self.set_status(500, "Database 
error - check message on server") + except Exception as ex: + self.set_status(500, str(ex)) + self.set_custom_default_headers() + self.finish() + + @for_authenticated_user + async def patch(self, field : str): + if not self.headers_ok: + return + try: + value = JSONSerializer.generic_loads(self.request.body) + if field == 'remote-object-viewer': + field = 'remoteObjectViewer' + async with self.disk_session() as session, session.begin(): + session : asyncio_ext.AsyncSession + stmt = select(AppSettings).filter_by(field=field) + data = await session.execute(stmt) + setting : AppSettings = data.scalar() + new_value = copy.deepcopy(setting.value) + self.deepupdate_dict(new_value, value) + setting.value = new_value + await session.commit() + self.set_status(200) + except Exception as ex: + self.set_status(500, str(ex)) + self.set_custom_default_headers() + self.finish() + + async def options(self, name : typing.Optional[str] = None): + self.set_status(204) + self.set_header("Access-Control-Allow-Methods", "POST, PATCH, OPTIONS") + self.set_custom_default_headers() + self.finish() + + def deepupdate_dict(self, d : dict, u : dict): + for k, v in u.items(): + if isinstance(v, dict): + d[k] = self.deepupdate_dict(d.get(k, {}), v) + else: + d[k] = v + return d + + +class PagesHandler(SystemHostHandler): + """ + get all pages - endpoint /pages + """ + + @for_authenticated_user + async def get(self): + if not self.headers_ok: + return + try: + async with self.disk_session() as session: + session : asyncio_ext.AsyncSession + stmt = select(Pages) + data = await session.execute(stmt) + serialized_data = JSONSerializer.generic_dumps([result[Pages.__name__].json() for result + in data.mappings().all()]) + self.set_status(200) + self.set_header("Content-Type", "application/json") + self.write(serialized_data) + except SQLAlchemyError as ex: + self.set_status(500, "database error - check message on server") + except Exception as ex: + self.set_status(500, str(ex)) + 
self.set_custom_default_headers() + self.finish() + + +class PageHandler(SystemHostHandler): + """ + add or edit a single page. endpoint - /pages/{name} + """ + + @for_authenticated_user + async def post(self, name): + if not self.headers_ok: + return + try: + data = JSONSerializer.generic_loads(self.request.body) + # name = self.request.arguments + async with self.disk_session() as session, session.begin(): + session : asyncio_ext.AsyncSession + session.add(Pages(name=name, **data)) + await session.commit() + self.set_status(201) + except SQLAlchemyError as ex: + self.set_status(500, "Database error - check message on server") + except Exception as ex: + self.set_status(500, str(ex)) + self.set_custom_default_headers() + self.finish() + + @for_authenticated_user + async def put(self, name): + if not self.headers_ok: + return + try: + updated_data = JSONSerializer.generic_loads(self.request.body) + async with self.disk_session() as session, session.begin(): + session : asyncio_ext.AsyncSession + stmt = select(Pages).filter_by(name=name) + page = (await session.execute(stmt)).mappings().all() + if(len(page) == 0): + self.set_status(404, f"no such page with given name {name}") + else: + existing_data = page[0][Pages.__name__] # type: Pages + if updated_data.get("description", None): + existing_data.description = updated_data["description"] + if updated_data.get("URL", None): + existing_data.URL = updated_data["URL"] + await session.commit() + self.set_status(204) + except SQLAlchemyError as ex: + self.set_status(500, "Database error - check message on server") + except Exception as ex: + self.set_status(500, str(ex)) + self.set_custom_default_headers() + self.finish() + + + @for_authenticated_user + async def delete(self, name): + if not self.headers_ok: + return + try: + async with self.disk_session() as session, session.begin(): + session : asyncio_ext.AsyncSession + stmt = delete(Pages).filter_by(name=name) + ret = await session.execute(stmt) + await 
session.commit()
+            self.set_status(204)
+        except SQLAlchemyError as ex:
+            self.set_status(500, "Database error - check message on server")
+        except Exception as ex:
+            self.set_status(500, str(ex))
+        self.set_custom_default_headers()
+        self.finish()
+
+    async def options(self, name):
+        self.set_status(204)
+        self.set_header("Access-Control-Allow-Methods", "POST, PUT, DELETE, OPTIONS")
+        self.set_custom_default_headers()
+        self.finish()
+
+
+class SubscribersHandler(SystemHostHandler):
+
+    async def post(self):
+        if not self.headers_ok:
+            return
+        try:
+            server = Server(**JSONSerializer.generic_loads(self.request.body))
+            async with self.disk_session() as session, session.begin():
+                session : asyncio_ext.AsyncSession
+                session.add(server)
+                await session.commit()
+            self.set_status(201)
+        except SQLAlchemyError as ex:
+            self.set_status(500, "Database error - check message on server")
+        except Exception as ex:
+            self.set_status(500, str(ex))
+        self.set_custom_default_headers()
+        self.finish()
+
+    async def get(self):
+        if not self.headers_ok:
+            return
+        try:
+            async with self.disk_session() as session, session.begin():
+                session : asyncio_ext.AsyncSession
+                stmt = select(Server)
+                result = await session.execute(stmt)
+                serialized_data = JSONSerializer.generic_dumps([val[Server.__name__].json() for val in result.mappings().all()])
+            self.set_status(200)
+            self.set_header("Content-Type", "application/json")
+            self.write(serialized_data)
+        except SQLAlchemyError as ex:
+            self.set_status(500, "Database error - check message on server")
+        except Exception as ex:
+            self.set_status(500, str(ex))
+        self.set_custom_default_headers()
+        self.finish()
+
+    async def put(self):
+        if not self.headers_ok:
+            return
+        try:
+            server = JSONSerializer.generic_loads(self.request.body)
+            async with self.disk_session() as session, session.begin():
+                session : asyncio_ext.AsyncSession
+                stmt = select(Server).filter_by(hostname=server.get("hostname", None))
+                result = (await 
session.execute(stmt)).mappings().all()
+                if len(result) == 0:
+                    self.set_status(404)
+                else:
+                    result = result[0][Server.__name__] # type: Server
+                    for field in ("type", "port", "IPAddress", "https"):
+                        if field in server:
+                            setattr(result, field, server[field])
+                    await session.commit()
+                    self.set_status(204)
+        except SQLAlchemyError as ex:
+            self.set_status(500, "Database error - check message on server")
+        except Exception as ex:
+            self.set_status(500, str(ex))
+        self.set_custom_default_headers()
+        self.finish()
+
+    async def delete(self):
+        if not self.headers_ok:
+            return
+        try:
+            server = JSONSerializer.generic_loads(self.request.body)
+            async with self.disk_session() as session, session.begin():
+                session : asyncio_ext.AsyncSession
+                stmt = delete(Server).filter_by(hostname=server.get("hostname", None))
+                ret = await session.execute(stmt)
+            self.set_status(204)
+        except SQLAlchemyError as ex:
+            self.set_status(500, "Database error - check message on server")
+        except Exception as ex:
+            self.set_status(500, str(ex))
+        self.set_custom_default_headers()
+        self.finish()
+
+    def set_access_control_allow_methods(self) -> None:
+        self.set_header("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS")
+
+
+class SubscriberHandler(SystemHostHandler):
+
+    async def get(self):
+        pass
+
+
+class SwaggerUIHandler(SystemHostHandler):
+
+    async def get(self):
+        await self.render(
+            f"{os.path.dirname(os.path.abspath(__file__))}{os.sep}assets{os.sep}swagger_ui_template.html",
+            swagger_spec_url="/index.yml"
+        )
+
+
+class MainHandler(SystemHostHandler):
+
+    def initialize(self, CORS: List[str], disk_session: Session,
+                mem_session: asyncio_ext.AsyncSession, swagger : bool = False,
+                IP : str = "") -> None:
+        self.swagger = swagger
+        self.IP = IP
+        return super().initialize(CORS, disk_session, mem_session)
+
+    async def get(self):
+        if not self.headers_ok:
+            return
+        self.set_status(200)
+        self.set_custom_default_headers()
+        self.write("
<p>I am alive!</p>
") + if self.swagger: + self.write(f"
<p>Visit <a href='{self.IP}/swagger-ui'>here</a> to login and use my swagger doc</p>
") + self.finish() + + + + + + + +__all__ = [ + SystemHostHandler.__name__, + UsersHandler.__name__, + AppSettingsHandler.__name__, + AppSettingHandler.__name__, + LoginHandler.__name__, + LogoutHandler.__name__, + WhoAmIHandler.__name__, + PagesHandler.__name__, + PageHandler.__name__, + SubscribersHandler.__name__, + SwaggerUIHandler.__name__, + MainHandler.__name__ +] \ No newline at end of file diff --git a/hololinked/system_host/models.py b/hololinked/system_host/models.py new file mode 100644 index 0000000..14d7948 --- /dev/null +++ b/hololinked/system_host/models.py @@ -0,0 +1,86 @@ +import typing +from dataclasses import asdict, field + +from sqlalchemy import Integer, String, JSON, ARRAY, Boolean, BLOB +from sqlalchemy.orm import Mapped, mapped_column, DeclarativeBase, MappedAsDataclass + +from ..server.constants import JSONSerializable + + +class HololinkedHostTableBase(DeclarativeBase): + pass + +class Pages(HololinkedHostTableBase, MappedAsDataclass): + __tablename__ = "pages" + + name : Mapped[str] = mapped_column(String(1024), primary_key=True, nullable=False) + URL : Mapped[str] = mapped_column(String(1024), unique=True, nullable=False) + description : Mapped[str] = mapped_column(String(16384)) + json_specfication : Mapped[typing.Dict[str, typing.Any]] = mapped_column(JSON, nullable=True) + + def json(self): + return { + "name" : self.name, + "URL" : self.URL, + "description" : self.description, + "json_specification" : self.json_specfication + } + +class AppSettings(HololinkedHostTableBase, MappedAsDataclass): + __tablename__ = "appsettings" + + field : Mapped[str] = mapped_column(String(8192), primary_key=True) + value : Mapped[typing.Dict[str, typing.Any]] = mapped_column(JSON) + + def json(self): + return asdict(self) + +class LoginCredentials(HololinkedHostTableBase, MappedAsDataclass): + __tablename__ = "login_credentials" + + email : Mapped[str] = mapped_column(String(1024), primary_key=True) + password : Mapped[str] = 
mapped_column(String(1024))
+
+class Server(HololinkedHostTableBase, MappedAsDataclass):
+    __tablename__ = "http_servers"
+
+    hostname : Mapped[str] = mapped_column(String, primary_key=True)
+    type : Mapped[str] = mapped_column(String)
+    port : Mapped[int] = mapped_column(Integer)
+    IPAddress : Mapped[str] = mapped_column(String)
+    https : Mapped[bool] = mapped_column(Boolean)
+
+    def json(self):
+        return {
+            "hostname" : self.hostname,
+            "type" : self.type,
+            "port" : self.port,
+            "IPAddress" : self.IPAddress,
+            "https" : self.https
+        }
+
+
+
+class HololinkedHostInMemoryTableBase(DeclarativeBase):
+    pass
+
+class UserSession(HololinkedHostInMemoryTableBase, MappedAsDataclass):
+    __tablename__ = "user_sessions"
+
+    email : Mapped[str] = mapped_column(String)
+    session_key : Mapped[bytes] = mapped_column(BLOB, primary_key=True)
+    origin : Mapped[str] = mapped_column(String)
+    user_agent : Mapped[str] = mapped_column(String)
+    remote_IP : Mapped[str] = mapped_column(String)
+
+
+
+__all__ = [
+    HololinkedHostTableBase.__name__,
+    HololinkedHostInMemoryTableBase.__name__,
+    Pages.__name__,
+    AppSettings.__name__,
+    LoginCredentials.__name__,
+    Server.__name__,
+    UserSession.__name__
+] \ No newline at end of file diff --git a/hololinked/system_host/server.py b/hololinked/system_host/server.py new file mode 100644 index 0000000..f229abb --- /dev/null +++ b/hololinked/system_host/server.py @@ -0,0 +1,146 @@
+import secrets
+import os
+import base64
+import socket
+import json
+import asyncio
+import ssl
+import typing
+import getpass
+from argon2 import PasswordHasher
+
+from sqlalchemy import create_engine
+from sqlalchemy.orm import Session, sessionmaker
+from sqlalchemy.ext import asyncio as asyncio_ext
+from sqlalchemy_utils import database_exists, create_database, drop_database
+from tornado.web import Application, StaticFileHandler, RequestHandler
+from tornado.httpserver import HTTPServer as TornadoHTTP1Server
+from tornado import ioloop
+
+from 
..server.serializers import JSONSerializer
+from ..server.database import BaseDB
+from ..server.config import global_config
+from .models import *
+from .handlers import *
+
+
+def create_system_host(db_config_file : typing.Optional[str] = None, ssl_context : typing.Optional[ssl.SSLContext] = None,
+            handlers : typing.List[typing.Tuple[str, RequestHandler, dict]] = [], **server_settings) -> TornadoHTTP1Server:
+    """
+    global function for creating the system hosting server using a database configuration file, SSL context & certain
+    server settings. Currently supports only one server per process due to usage of some global variables.
+    """
+    disk_DB_URL = BaseDB.create_postgres_URL(db_config_file, database='hololinked-host', use_dialect=False)
+    if not database_exists(disk_DB_URL):
+        sync_disk_db_engine = None
+        try:
+            create_database(disk_DB_URL)
+            sync_disk_db_engine = create_engine(disk_DB_URL)
+            HololinkedHostTableBase.metadata.create_all(sync_disk_db_engine)
+            create_tables(sync_disk_db_engine)
+            create_credentials(sync_disk_db_engine)
+        except Exception as ex:
+            if disk_DB_URL.startswith("sqlite"):
+                os.remove(disk_DB_URL.split('/')[-1])
+            else:
+                drop_database(disk_DB_URL)
+            raise ex from None
+        finally:
+            if sync_disk_db_engine is not None:
+                sync_disk_db_engine.dispose()
+
+    disk_DB_URL = BaseDB.create_postgres_URL(db_config_file, database='hololinked-host', use_dialect=True)
+    disk_engine = asyncio_ext.create_async_engine(disk_DB_URL, echo=True)
+    disk_session = sessionmaker(disk_engine, expire_on_commit=True,
+                        class_=asyncio_ext.AsyncSession) # type: asyncio_ext.AsyncSession
+
+    mem_DB_URL = BaseDB.create_sqlite_URL(in_memory=True)
+    mem_engine = create_engine(mem_DB_URL, echo=True)
+    mem_session = sessionmaker(mem_engine, expire_on_commit=True,
+                        class_=Session) # type: Session
+    HololinkedHostInMemoryTableBase.metadata.create_all(mem_engine)
+
+    CORS = server_settings.pop("CORS", [])
+    if not isinstance(CORS, (str, list)):
+        raise TypeError("CORS should be a list of strings or a string")
+    if isinstance(CORS, str):
+        CORS 
= [CORS] + kwargs = dict( + CORS=CORS, + disk_session=disk_session, + mem_session=mem_session + ) + + system_host_compatible_handlers = [] + for handler in handlers: + system_host_compatible_handlers.append((handler[0], handler[1], kwargs)) + + app = Application([ + (r"/", MainHandler, dict(IP="https://localhost:8080", swagger=True, **kwargs)), + (r"/users", UsersHandler, kwargs), + (r"/pages", PagesHandler, kwargs), + (r"/pages/(.*)", PageHandler, kwargs), + (r"/app-settings", AppSettingsHandler, kwargs), + (r"/app-settings/(.*)", AppSettingHandler, kwargs), + (r"/subscribers", SubscribersHandler, kwargs), + # (r"/remote-objects", RemoteObjectsHandler), + (r"/login", LoginHandler, kwargs), + (r"/logout", LogoutHandler, kwargs), + (r"/swagger-ui", SwaggerUIHandler, kwargs), + *system_host_compatible_handlers, + (r"/(.*)", StaticFileHandler, dict(path=os.path.join(os.path.dirname(__file__), + f"assets{os.sep}hololinked-server-swagger-api{os.sep}system-host-api")) + ), + ], + cookie_secret=base64.b64encode(os.urandom(32)).decode('utf-8'), + **server_settings) + + return TornadoHTTP1Server(app, ssl_options=ssl_context) + + +def start_tornado_server(server : TornadoHTTP1Server, port : int = 8080): + server.listen(port) + event_loop = ioloop.IOLoop.current() + print("starting server") + event_loop.start() + + +def create_tables(engine): + with Session(engine) as session, session.begin(): + file = open(f"{os.path.dirname(os.path.abspath(__file__))}{os.sep}assets{os.sep}default_host_settings.json", 'r') + default_settings = JSONSerializer.generic_load(file) + for name, settings in default_settings.items(): + session.add(AppSettings( + field = name, + value = settings + )) + session.commit() + + +def create_credentials(sync_engine): + """ + create name and password for a new user in a database + """ + + print("Requested primary host seems to use a new database. 
Give username and password (not for the database server, but for client logins from hololinked-portal) : ")
+    email = input("email-id (not collected anywhere except in your own database) : ")
+    while True:
+        password = getpass.getpass("password : ")
+        password_confirm = getpass.getpass("repeat password : ")
+        if password != password_confirm:
+            print("password & repeated password do not match. Try again.")
+            continue
+        with Session(sync_engine) as session, session.begin():
+            ph = PasswordHasher(time_cost=global_config.PWD_HASHER_TIME_COST)
+            session.add(LoginCredentials(email=email, password=ph.hash(password)))
+            session.commit()
+        return
+
+
+def delete_database(db_config_file):
+    URL = BaseDB.create_URL(db_config_file, database="hololinked-host", use_dialect=False)
+    drop_database(URL)
+
+
+
+__all__ = ['create_system_host'] \ No newline at end of file diff --git a/hololinked/webdashboard/visualization_parameters.py b/hololinked/webdashboard/visualization_parameters.py new file mode 100644 index 0000000..53f76d4 --- /dev/null +++ b/hololinked/webdashboard/visualization_parameters.py @@ -0,0 +1,158 @@
+import os
+import typing
+from enum import Enum
+from ..param.parameterized import Parameterized
+from ..server.constants import USE_OBJECT_NAME, HTTP_METHODS
+from ..server.remote_parameter import RemoteParameter
+from ..server.events import Event
+
+try:
+    import plotly.graph_objects as go
+except ImportError:
+    go = None
+
+class VisualizationParameter(RemoteParameter):
+    # type shield from RemoteParameter
+    pass
+
+
+
+class PlotlyFigure(VisualizationParameter):
+
+    __slots__ = ['data_sources', 'update_event_name', 'refresh_interval', 'polled',
+                '_action_stub']
+
+    def __init__(self, default_figure, *,
+                data_sources : typing.Dict[str, typing.Union[RemoteParameter, typing.Any]],
+                polled : bool = False, 
refresh_interval : typing.Optional[int] = None,
+                update_event_name : typing.Optional[str] = None, doc: typing.Union[str, None] = None,
+                URL_path : str = USE_OBJECT_NAME) -> None:
+        super().__init__(default=default_figure, doc=doc, constant=True, readonly=True, URL_path=URL_path)
+        self.data_sources = data_sources
+        self.refresh_interval = refresh_interval
+        self.update_event_name = update_event_name
+        self.polled = polled
+
+    def _post_slot_set(self, slot : str, old : typing.Any, value : typing.Any) -> None:
+        if slot == 'owner' and self.owner is not None:
+            from ..webdashboard import RepeatedRequests, AxiosRequestConfig, EventSource
+            if self.polled:
+                if self.refresh_interval is None:
+                    raise ValueError(f'for PlotlyFigure {self.name}, set refresh_interval (ms) since it is polled')
+                request = AxiosRequestConfig(
+                    url=f'/parameters?{"&".join(f"{key}={value}" for key, value in self.data_sources.items())}',
+                    # Here is where graphQL is very useful
+                    method='get'
+                )
+                self._action_stub = RepeatedRequests(
+                    requests=request,
+                    interval=self.refresh_interval,
+                )
+            elif self.update_event_name:
+                if not isinstance(self.update_event_name, str):
+                    raise ValueError(f'update_event_name for PlotlyFigure {self.name} must be a string')
+                request = EventSource(f'/event/{self.update_event_name}')
+                self._action_stub = request
+            else:
+                raise ValueError(f'for PlotlyFigure {self.name}, either set polled=True or supply an update_event_name')
+
+            for field, source in self.data_sources.items():
+                if isinstance(source, RemoteParameter):
+                    if isinstance(request, EventSource):
+                        raise RuntimeError("Parameter data source not supported for event source, give str")
+                    self.data_sources[field] = request.response[source.name]
+                elif isinstance(source, str):
+                    if isinstance(request, RepeatedRequests) and source not in self.owner.parameters: # should be in remote parameters, not just parameters
+                        raise ValueError(f'data source {source} is not a parameter of {self.owner}')
+                    self.data_sources[field] = request.response[source]
+                else:
+                    raise ValueError(f'given source 
{source} invalid. Specify str for events or Parameter')
+
+        return super()._post_slot_set(slot, old, value)
+
+    def validate_and_adapt(self, value : typing.Any) -> typing.Any:
+        if self.allow_None and value is None:
+            return
+        if not go:
+            raise ImportError("plotly was not found/imported, install plotly to support the PlotlyFigure parameter")
+        if not isinstance(value, go.Figure):
+            raise TypeError(f"figure argument accepts only plotly.graph_objects.Figure, not type {type(value)}",
+                            self)
+        return value
+
+    @classmethod
+    def serialize(cls, value):
+        return value.to_json()
+
+
+
+class Image(VisualizationParameter):
+
+    __slots__ = ['event', 'streamable', '_action_stub', 'data_sources']
+
+    def __init__(self, default : typing.Any = None, *, streamable : bool = True, doc : typing.Optional[str] = None,
+            constant : bool = False, readonly : bool = False, allow_None : bool = False,
+            URL_path : str = USE_OBJECT_NAME,
+            http_method : typing.Tuple[typing.Optional[str], typing.Optional[str]] = (HTTP_METHODS.GET, HTTP_METHODS.PUT),
+            state : typing.Optional[typing.Union[typing.List, typing.Tuple, str, Enum]] = None,
+            db_persist : bool = False, db_init : bool = False, db_commit : bool = False,
+            class_member : bool = False, fget : typing.Optional[typing.Callable] = None,
+            fset : typing.Optional[typing.Callable] = None, fdel : typing.Optional[typing.Callable] = None,
+            deepcopy_default : bool = False, per_instance_descriptor : bool = False,
+            precedence : typing.Optional[float] = None) -> None:
+        super().__init__(default, doc=doc, constant=constant, readonly=readonly, allow_None=allow_None,
+                URL_path=URL_path, http_method=http_method, state=state,
+                db_persist=db_persist, db_init=db_init, db_commit=db_commit, class_member=class_member,
+                fget=fget, fset=fset, fdel=fdel, deepcopy_default=deepcopy_default,
+                per_instance_descriptor=per_instance_descriptor, precedence=precedence)
+        self.streamable = streamable
+
+    def __set_name__(self, owner : typing.Any, attrib_name : str) -> 
None:
+        super().__set_name__(owner, attrib_name)
+        self.event = Event(attrib_name)
+
+    def _post_value_set(self, obj : Parameterized, value : typing.Any) -> None:
+        super()._post_value_set(obj, value)
+        if value is not None:
+            self.event.push(value, serialize=False)
+
+    def _post_slot_set(self, slot : str, old : typing.Any, value : typing.Any) -> None:
+        if slot == 'owner' and self.owner is not None:
+            from ..webdashboard import SSEVideoSource
+            request = SSEVideoSource('/event/image')
+            self._action_stub = request
+            self.data_sources = request.response
+        return super()._post_slot_set(slot, old, value)
+
+
+
+
+class FileServer(RemoteParameter):
+
+    __slots__ = ['directory']
+
+    def __init__(self, directory : str, *, doc : typing.Optional[str] = None, URL_path : str = USE_OBJECT_NAME,
+            class_member: bool = False, per_instance_descriptor: bool = False) -> None:
+        self.directory = self.validate_and_adapt_directory(directory)
+        super().__init__(default=self.load_files(self.directory), doc=doc, URL_path=URL_path, constant=True,
+            class_member=class_member, per_instance_descriptor=per_instance_descriptor)
+
+    def validate_and_adapt_directory(self, value : str):
+        if not isinstance(value, str):
+            raise TypeError(f"FileServer parameter not a string, but type {type(value)}", self)
+        if not os.path.isdir(value):
+            raise ValueError(f"FileServer parameter directory '{value}' not a valid directory", self)
+        if not value.endswith(os.sep):
+            value += os.sep
+        return value
+
+    def load_files(self, directory : str):
+        return [f for f in os.listdir(directory) if os.path.isfile(os.path.join(directory, f))]
+
+class DocumentationFolder(FileServer):
+
+    def __init__(self, directory : str, *, doc : typing.Optional[str] = None, URL_path : str = '/documentation',
+            class_member: bool = False, per_instance_descriptor: bool = False) -> None:
+        super().__init__(directory=directory, doc=doc, URL_path=URL_path,
+            class_member=class_member, 
per_instance_descriptor=per_instance_descriptor) \ No newline at end of file diff --git a/licenses/starlette-LICENSE.md b/licenses/starlette-LICENSE.md deleted file mode 100644 index d16a60e..0000000 --- a/licenses/starlette-LICENSE.md +++ /dev/null @@ -1,27 +0,0 @@ -Copyright © 2018, [Encode OSS Ltd](https://www.encode.io/). -All rights reserved. - -Redistribution and use in source and binary forms, with or without -modification, are permitted provided that the following conditions are met: - -* Redistributions of source code must retain the above copyright notice, this - list of conditions and the following disclaimer. - -* Redistributions in binary form must reproduce the above copyright notice, - this list of conditions and the following disclaimer in the documentation - and/or other materials provided with the distribution. - -* Neither the name of the copyright holder nor the names of its - contributors may be used to endorse or promote products derived from - this software without specific prior written permission. - -THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" -AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE -DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE -FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL -DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR -SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER -CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, -OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
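The recursive merge used by `AppSettingHandler.patch` above can be exercised on its own; a minimal standalone sketch (the function is lifted out of the handler, the settings payloads are hypothetical and purely for illustration):

```python
import copy

def deepupdate_dict(d: dict, u: dict) -> dict:
    # recursively merge mapping u into mapping d, as AppSettingHandler.patch does
    for k, v in u.items():
        if isinstance(v, dict):
            d[k] = deepupdate_dict(d.get(k, {}), v)
        else:
            d[k] = v
    return d

# hypothetical stored setting and partial PATCH body
existing = {"dashboard": {"theme": "dark", "refresh": 5}, "login": {"expiry": 30}}
patch = {"dashboard": {"refresh": 10}}
merged = deepupdate_dict(copy.deepcopy(existing), patch)
print(merged["dashboard"])  # {'theme': 'dark', 'refresh': 10}
```

Note the `copy.deepcopy` before merging: the handler does the same so that SQLAlchemy sees a new object assigned to `setting.value` and marks the row dirty.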
diff --git a/requirements.txt b/requirements.txt new file mode 100644 index 0000000..2c03b5e --- /dev/null +++ b/requirements.txt @@ -0,0 +1,8 @@ +argon2==0.1.10 +ifaddr==0.2.0 +msgspec==0.18.6 +pyzmq==25.1.0 +SQLAlchemy==2.0.21 +SQLAlchemy_Utils==0.41.1 +tornado==6.3.3 + diff --git a/setup.py b/setup.py index 4dbdbad..fff0b2d 100644 --- a/setup.py +++ b/setup.py @@ -1,24 +1,56 @@ import setuptools -long_description=""" -A zmq-based RPC tool-kit with built-in HTTP support for instrument control/data acquisition -or controlling generic python objects. -""" +# read the contents of your README file +from pathlib import Path +this_directory = Path(__file__).parent +long_description = (this_directory / "README.md").read_text() + setuptools.setup( name="hololinked", - version="0.1.0", + version="0.1.2", author="Vignesh Vaidyanathan", - author_email="vignesh.vaidyanathan@physik.uni-muenchen.de", - description="A zmq-based RPC tool-kit with built-in HTTP support for instrument control/data acquisition or controlling generic python objects.", + author_email="vignesh.vaidyanathan@hololinked.dev", + description="A ZMQ-based Object Oriented RPC tool-kit with HTTP support for instrument control/data acquisition or controlling generic python objects.", long_description=long_description, long_description_content_type="text/markdown", - url="", - packages=['hololinked'], + url="https://hololinked.readthedocs.io/en/latest/index.html", + packages=[ + 'hololinked', + 'hololinked.server', + 'hololinked.rpc', + 'hololinked.client', + 'hololinked.param' + ], classifiers=[ "Programming Language :: Python :: 3", + "License :: OSI Approved :: BSD License", "Operating System :: OS Independent", + "Intended Audience :: Information Technology", + "Intended Audience :: Science/Research", + "Intended Audience :: Healthcare Industry", + "Intended Audience :: Manufacturing", + "Intended Audience :: Developers", + "Intended Audience :: End Users/Desktop", + "Intended Audience :: Education", + 
"Topic :: Internet :: WWW/HTTP", + "Topic :: Internet :: WWW/HTTP :: Browsers", + "Topic :: Scientific/Engineering :: Human Machine Interfaces", + "Topic :: System :: Hardware", + "Development Status :: 4 - Beta" ], python_requires='>=3.7', + install_requires=[ + "argon2-cffi>=0.1.10", + "ifaddr>=0.2.0", + "msgspec>=0.18.6", + "pyzmq>=25.1.0", + "SQLAlchemy>=2.0.21", + "SQLAlchemy_Utils>=0.41.1", + "tornado>=6.3.3" + ], + license="BSD-3-Clause", + license_files=('license.txt', 'licenses/param-LICENSE.txt', 'licenses/pyro-LICENSE.txt'), + keywords=["data-acquisition", "zmq-rpc", "SCADA/IoT", "Web of Things"] ) \ No newline at end of file