
Enabling Hardware-In-The-Loop

2019/07/04

This post introduces the implementation of an HTTP communication interface into the Lightstage application, explains the new dependencies required and documents the initial API v0.1. The final section provides a How-To to get started.

The key merit of adding an HTTP communication interface is that it enables Hardware-In-The-Loop operation. Low-level controllers can connect to the IP address of the application running the high-level control software and, with that, deliver new balanced lighting effects and capture control sequences backed by the high-level frame and light evaluation modelling.

The Example How-To gets us started serving baseline sets of intensities, including an example set that improves the default balance for the AU (icosahedron 3v) light stage lighting position design.

Introducing the HTTP REST Interface

The CherryPy micro web-service framework provides the RESTful communication interface to which client controllers connect. The framework helps us keep the interface simple to integrate and simple to decouple. This tutorial gives an idea of how to integrate it.
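
To make the shape of the interface concrete, here is a minimal sketch of a CherryPy endpoint; the class name, method bodies and returned values are hypothetical stand-ins, not the Lightstage application’s own code:

import cherrypy

# Hypothetical, minimal CherryPy service exposing two plain-text endpoints.
class LightstageAPI(object):
    @cherrypy.expose
    def status(self):
        # Report, in text format, that the web service is up.
        return "Web service is up."

    @cherrypy.expose
    def baseline_intensity(self):
        # Return LED index/intensity pairs as text, e.g. [[1, 1.0], [2, 1.0]].
        return str([[1, 1.0], [2, 1.0]])

if __name__ == "__main__":
    cherrypy.config.update({"server.socket_host": "127.0.0.1",
                            "server.socket_port": 8080})
    cherrypy.quickstart(LightstageAPI(), "/")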

The web service needs its own daemon process and is initialised via subprocess.Popen() on the local IP address, on port 8080 by default. Once started, a web browser tab is opened at the web service (http://127.0.0.1:8080) to browse and to verify API data responses.
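
A minimal sketch of that start-up flow, assuming a hypothetical web_service.py entry script:

import subprocess
import sys
import webbrowser

# Launch the web service in its own process (script name is hypothetical).
web_service = subprocess.Popen([sys.executable, "web_service.py"])
# Open a browser tab at the service to verify API data responses.
webbrowser.open("http://127.0.0.1:8080")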

Before the web service can be started, a data service needs to be running. The main application (and real-time visualisation) process’s runtime state data is transferred to the data service. The web service process connects to the initialised data service in order to respond to HTTP clients with that real-time state data.

The Redis NoSQL key-value database, using the DockerHub redis image (95 MB), is a suitable provider. The key decision factors for choosing Redis are that it is relatively simple, its development has matured, it’s fast (in-memory), it handles concurrent access, it offers (non-guaranteed) persistent state, it permits semi-structured data, and it allows connection via the Redis client API. The CherryPy authors also have a useful set of recipes for using Redis. The cost of choosing Redis is that the service requires a separate install and a running process.
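
A minimal sketch of that round trip via the Redis-Py client API, with a placeholder host IP and key name:

import redis

# Connect to the redis instance (the host IP is a placeholder; see the
# Docker container’s assigned address below).
r = redis.StrictRedis(host="172.17.0.2", port=6379, db=0)
# The main application writes runtime state data...
r.set("baseline_intensity", str([[1, 1.0], [2, 1.0]]))
# ...and the web service process reads it back for HTTP clients.
print(r.get("baseline_intensity"))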

Therefore, to automate execution of Redis, we use the Docker containerisation system to integrate it as a third-party tool. Docker’s official image for redis provides the software and environment to start a redis instance. The PyDocker client API lets us download, build and run the image. The running instance (in a Docker container) is assigned a locally registered network IP address, through which we connect to redis via the Redis-Py client API. Using the PyDocker client we can manage, monitor and repair errors in the Docker container, making the integration more or less seamless.
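
A minimal sketch of that container lifecycle, using the Docker SDK for Python (the image tag and container handling here are illustrative):

import docker

client = docker.from_env()
# Download the official redis image and run it detached in a container.
client.images.pull("redis:latest")
container = client.containers.run("redis:latest", detach=True)
# Refresh the container metadata, then read its assigned network IP address.
container.reload()
redis_ip = container.attrs["NetworkSettings"]["IPAddress"]
print("Connect redis-py to %s:6379" % redis_ip)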

The Dockerfile that builds our Docker image imports DockerHub’s official redis image; the image build file is located at /vendor/Redis/Simple/Dockerfile. The official image assures that the redis instance retains ongoing support via testing, maintenance and provisioning of the latest releases of redis.
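
In its simplest form, a Dockerfile of this kind needs little more than the base-image import (the exact contents of the repo’s Dockerfile may differ):

# Import DockerHub’s official redis image as the base.
FROM redis:latest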

Importantly, the main application has super-user-level control over the data service, which includes the initial data writes and the start and shutdown of the redis instance (Docker container). The web service has user-level control over the data service, where it predominantly only reads.

A batch of test cases in the /test/ directory exercises Docker image pull, build and removal; Docker container run, stop, removal and metadata extraction; and redis set, get, flush and variants. These can be run via python /test/test_runner_db_services.py.

Added Dependencies

  1. New Python pip dependencies are installed via pip install -r requirements.txt.
  2. Docker must be installed to run the hardware-in-the-loop communications interface (-m4).
  3. On first run of -m4, the official Redis image will be pulled from DockerHub (95 MB uncompressed); this can take 30 seconds or so.
    • This feature is in beta. Occasionally, state-dependent bugs deny access to the container’s metadata; restarting the application will fix this. Most errors come with a handy workaround or recommendation. Please feel free to report any new bugs or offer improvements on the repo’s GitHub Issues pages.

LS Web Service API v0.1

The initial release of this feature delivers:

  • GET /baseline_intensity returns, in text format, baseline intensity values paired with their LED index, from the [LightOutput] section of the default.properties configuration.
    • E.g. default light intensities: [[1,1.0],[2,1.0],..]
    • E.g. tuned light intensities: [[2,0.82],[5,0.93],..]
  • GET /config returns configuration state data in HTML format. It shows command-line option arguments and defaulted option data, as well as runtime configuration data originally loaded from the default.properties configuration.
  • GET /status reports, in text format, whether the web service is up. (A quick client check of these endpoints is sketched below.)
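
As a quick check of these endpoints, here is a client sketch assuming the requests library (the printed responses follow the formats described above):

import requests

base = "http://127.0.0.1:8080"
print(requests.get(base + "/status").text)              # text: service is up
print(requests.get(base + "/baseline_intensity").text)  # e.g. [[1,1.0],[2,1.0],..]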

Upcoming APIs will expand the entry points to deliver real-time and sequential light intensity data for a specified frame.

Example How-To:

This section provides example how-to instructions to get the API and application running:

  1. Install git, Python (e.g. Anaconda 2.7).
  2. Ensure your environment is running Python 2.7.
  3. Run git clone https://github.com/LightStage-Aber/LightStage-Repo.git to download the repo.
  4. Change directory into the repo.
  5. Run pip install -r requirements.txt to install the Python dependencies.
  6. Install Docker, either via their website installer or via the /vendor/install_docker.sh script in the repo.
  7. Run python run.py -m4 -e3 to run the main visualisation application. This will:
    1. download the redis image (95 MB uncompressed) and execute it via Docker.
    2. start the API web service.
    3. open a browser tab at the web service (http://127.0.0.1:8080).
  8. Next, go to http://127.0.0.1:8080/baseline_intensity to view the default intensities data.
  9. Close the visualisation (or Ctrl+C on the CLI) to shut down the web service and data service. (NB: the Docker image will remain downloaded.)

This gets the application into a ready state for delivering the data to your lighting controller on the same network via an HTTP client library.
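
As a minimal controller-side sketch, assuming the requests library and a hypothetical set_led_level() stand-in for your hardware driver call:

import ast
import requests

def set_led_level(index, intensity):
    # Hypothetical driver hook; replace with real hardware I/O.
    print("LED %d -> %.2f" % (index, intensity))

# Fetch the baseline text response and parse the [[index, intensity], ..] pairs.
response = requests.get("http://127.0.0.1:8080/baseline_intensity")
for led_index, intensity in ast.literal_eval(response.text):
    set_led_level(led_index, intensity)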

Next stages: How to get it right

The next stage is to configure the application to ensure correct and accurate data is returned. This involves (1) configuring a frame, (2) evaluating different lighting sets, (3) tuning a lighting set and (4) adding the tuned lighting set to the config file. Here’s a quick run-down of how to achieve the minimum of these tasks; each is set in the default.properties configuration file.

  • Set the frame (obj file).

For example, the AU Lightstage frame’s obj file can be configured in:

[FrameModel]
frame.objfilename=../models/dome/dome_c.obj
frame.scale=8
frame.withsupportaccess=True
frame.support_access_vertices_to_remove=1

  • Set the frame’s mapping from the simulated light position indexes (from the obj file) to the actual hardware controller’s light position indexes.

For example, the AU Lightstage mapping can be enabled using the following option. In future, mappings will be defined using a mapping file, e.g. (a=b),(b=c)…

[FrameModel]
frame.indexes_are_important=True

  • Set the tuned intensities data to be returned by the API.

[LightOutput]
light.output_intensity_from_index.enforce_default=False
light.output_intensity_from_index.column_number=1
light.output_intensity_from_index.skip_header=True
light.output_intensity_from_index.filename_path=<path/to/file.csv>

The input file format is column-based. The precise format exported from the brightness control tuning tool (-m3) can be used directly, as specified. A working tuned output file for the AU Lightstage is as follows:

../results/Control_91-92_March2017/Results_Illuminance__Tuned_VertexIndexPositionEvaluator_44_92_0.0147788148161_2019-06-29-22-11-26.csv
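
As a loose illustration of the column-based layout only (the header names and column order here are hypothetical; the config above specifies just that a header row is skipped and intensities are read from column_number=1, assumed 0-indexed):

led_index,intensity
1,0.82
2,1.0
3,0.93
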
  • Set the correct light set positioning data.

For example, the AU Lightstage uses the python run.py -mX -e3 initialisation:

[LightIndexPositions]
results_file.column_number=3
results_file.number_of_leds=44
results_file.csvfilename=../results/installed_aos+rod_July2016/installed_newlines_removed.csv

Because we use the -e3 switch, we do not need to configure the edge mounting; therefore the following can be any value:

[FrameModel]
frame.number_of_vertices_per_edge=2

Having completed these configurations, we can return to Step 7 of the Example How-To. The web service API will then be serving the tuned baseline_intensity information for the AU Lightstage.

Conclusion

This first version of the communication interface API has been presented. It responds with tuned intensities to improve lighting (illumination) balance in a light stage. The tuned intensity result data can be fed in directly via the config file and supplied through the HTTP API to a low-level controller. Setting up that config is a little tricky, so an example how-to has been described above.

There is a little more complexity in the whole pipeline procedure, which includes (1) configuring a frame, (2) evaluating different lighting sets, (3) tuning a lighting set and (4) adding the tuned lighting set to the config file. Step (4) lets the HTTP interface communicate it back to a controller. It looks like a pipeline video tutorial is in order to help with this.

Future Todos

  • Add X-axis gradient (spherical) lighting sequence, from intensity baseline.
  • Frame mapping specification via a mapping file, with a format similar to [(a=b),(b=c),..].
  • Specify pre-made properties file as a command line argument, to more easily run repeating use cases.
  • A pipeline video tutorial or a GUI to simplify and explain each step, from light stage design ideation to guided lighting and capture sequences.

Credits for feature photo to Dominik Vanyi on Unsplash.

By Pete Scully PhD (UK)

