BIS spaceflight simulators – part 1

by Fabrizio Bernardini, FBIS

Introduction

First prototype of the Soyuz simulator with wireframe graphics.

The development of our first spaceflight simulators started in 2013 as an experiment to see how it could be possible to give the public a good sense of flying a spacecraft. We wanted the user to experience piloting with six degrees of freedom, the need to act slowly and deliberately while thinking ahead of the vehicle, and to understand the procedural aspects involved; of course we also wanted to provide some intelligent fun. More importantly, we wanted to be technically accurate within the limits of practical constraints, which led these projects to evolve into a high-level educational experience as well, well suited to helping train future engineers.

When it comes to hands-on flying, the phase of a spaceflight in which the pilot-astronaut really uses his or her spacecraft-handling skills is the rendezvous and docking with another vehicle in orbit. The ascent to orbit, while monitored by the pilot, is totally automatic. For re-entry and landing there are a number of alternatives, but they all involve aerodynamics, except for landing on the Moon or another airless celestial body, which is more complex to handle. In terms of complexity we therefore decided to address initially only orbital flight.

The Soyuz-TMA approach and docking simulator

When we started the project, the Soyuz TMA spacecraft was the only crewed vehicle in operation (the Space Shuttle having been relegated to museums and commercial space options still to be demonstrated). The availability of a full set of Soyuz training documents, which let us understand the characteristics of the vehicle and of its control system, and the fact that more often than not pilot-astronauts conducted the final approach to docking manually rather than automatically, led us to start working on that vehicle.

Our very first test aimed at selecting the right software environment and was based on a generic vehicle with NASA-style flight controls (RHC, Rotational Hand Controller, and THC, Translational Hand Controller, the latter built from scratch). While for the RHC we used a standard three-axis joystick, for the THC we had to develop our own mechanical contraption, starting from an arcade-game two-axis joystick and adding a third axis to it. For the external view, the simulator software implemented a very crude wireframe rendering of the target station: very basic, but still interesting enough to present to the public.

ESA astronaut Roberto Vittori helping kids perform a Soyuz docking during ERN 2013 at ESA/ESRIN

This simulator was also tested, in different situations, by ESA astronauts Roberto Vittori and Paolo Nespoli, who both provided encouragement and technical advice.

The dynamics of the vehicle were simulated through the direct application of the equations of motion. Without solving the entire problem of orbital flight, we simulated only proximity operations, when the dynamical behaviour can be linearly approximated and modelled using the Clohessy-Wiltshire equations. With the idea of relying on the simplest possible software implementation we used Python as the main programming language, even if its performance is barely acceptable for a real-time simulation. Even with this limitation, a lot of functionality is part of the core application, including the management of flight controls (for rotation and translation), a user interface that mimics the Soyuz control panel, and the computation of user data to display where appropriate.
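As a rough illustration of what the proximity-operations dynamics involve, here is a minimal Python sketch that propagates the Clohessy-Wiltshire equations with a simple semi-implicit Euler step. The orbit altitude, update rate and variable names are assumptions made for this example, not values taken from the simulator code.

import numpy as np

# Target in a circular orbit; x is radial (up), y along-track, z cross-track.
MU_EARTH = 3.986004418e14                  # m^3/s^2
R_ORBIT = 6_778_000.0                      # ~400 km altitude, assumed
N = np.sqrt(MU_EARTH / R_ORBIT**3)         # target mean motion, rad/s

def cw_accel(pos, vel, thrust_accel):
    """Chaser acceleration relative to the target (Clohessy-Wiltshire)."""
    x, y, z = pos
    vx, vy, vz = vel
    ax = 3.0 * N**2 * x + 2.0 * N * vy + thrust_accel[0]
    ay = -2.0 * N * vx + thrust_accel[1]
    az = -N**2 * z + thrust_accel[2]
    return np.array([ax, ay, az])

def step(pos, vel, thrust_accel, dt):
    """Advance the relative state by one frame (semi-implicit Euler)."""
    acc = cw_accel(pos, vel, thrust_accel)
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel

# Example: chaser starting 200 m behind the station, thrusters off.
pos = np.array([0.0, -200.0, 0.0])
vel = np.array([0.0, 0.0, 0.0])
for _ in range(600):                       # one minute at 10 Hz
    pos, vel = step(pos, vel, np.zeros(3), dt=0.1)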

After the first experiments, which demonstrated the validity and utility of the simulator, we developed a new external view using a serious rendering engine. We used the Unity engine for a basic view of a generic space station based on the Russian section of the ISS (the representation is similar to the early stages of ISS integration). The external view is controlled by the main simulator software and receives data continuously from it via LAN. This view also displays some user data superimposed on the rendered scene (a bit like a head-up display), in a fashion very similar to the Soyuz camera view. A dedicated computer is needed to manage this view.
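To give an idea of how such a link could work, the following sketch streams the chaser state from the main simulator to the external-view computer as UDP packets over the LAN. The address, port and message fields are purely illustrative assumptions, not the protocol actually used by the simulator.

import json
import socket

VIEW_PC_ADDR = ("192.168.0.20", 5005)      # assumed address of the Unity view PC
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_state(position, velocity, attitude_quat, range_m, range_rate):
    """Send one frame of chaser state to the external-view renderer."""
    msg = {
        "pos": list(position),             # relative position, m
        "vel": list(velocity),             # relative velocity, m/s
        "att": list(attitude_quat),        # chaser attitude quaternion
        "range": range_m,                  # for the HUD-style overlay
        "range_rate": range_rate,
    }
    sock.sendto(json.dumps(msg).encode("utf-8"), VIEW_PC_ADDR)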

A new 3D rendered view is added to the simulator, with a separate computer to run it

Starting from this version, the full simulator architecture was redefined and made more modular. The main simulator software does not require a powerful computer if the graphics engine for the external view runs on a separate machine. A networked architecture also makes it possible to expand functionality without disrupting the existing architecture too much, which simplifies testing and troubleshooting.

The software architecture was also improved to make it easier to implement modifications. The main logic is an endless loop (called the "simulation loop") which has to execute a variable number of tasks within a given deadline. If the deadline is not met, the simulator still works, but with reduced performance.
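In outline, such a loop can be sketched as follows; the frame rate, task names and overrun handling are assumptions for illustration only, not the simulator's actual code.

import time

FRAME_DT = 0.1                             # 10 Hz frame, assumed

def simulation_loop(tasks, state):
    """Run every task once per frame and warn if the deadline is missed."""
    while state["running"]:
        frame_start = time.monotonic()
        for task in tasks:                 # e.g. read controls, propagate dynamics,
            task(state, FRAME_DT)          # update the panel UI, send data to the views
        elapsed = time.monotonic() - frame_start
        if elapsed < FRAME_DT:
            time.sleep(FRAME_DT - elapsed)
        else:
            # Deadline missed: the simulation still advances, just late.
            print(f"frame overrun: {elapsed * 1000:.1f} ms")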

To make the simulator more similar to the Soyuz spacecraft we had to use Soyuz-like flight controllers. These are so particular that we had to build them from scratch, using an Arduino microcontroller that implements the HID interface (that is, any computer recognizes it as if it were a standard joystick) connected to the simulator software. There are two flight controllers: one for translations and one for rotations.
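Because the controllers enumerate as standard HID joysticks, the simulator can read them through any generic joystick API. As an example, this is roughly how it could be done with the pygame library; the choice of library, the device indices and the axis mapping are assumptions, not the simulator's actual code.

import pygame

pygame.init()
pygame.joystick.init()

rhc = pygame.joystick.Joystick(0)          # rotational hand controller (assumed index)
thc = pygame.joystick.Joystick(1)          # translational hand controller (assumed index)

def read_controls():
    """Return (roll, pitch, yaw) and (x, y, z) commands in the range -1..1."""
    pygame.event.pump()                    # refresh the joystick state
    rotation = (rhc.get_axis(0), rhc.get_axis(1), rhc.get_axis(2))
    translation = (thc.get_axis(0), thc.get_axis(1), thc.get_axis(2))
    return rotation, translation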

Overall architecture of the simulator

The resulting hardware looks similar to the original units, even if it is still far from perfect. Some aspects of the flight controllers are simulated (we do not have variable-force springs or detents) but, overall, they feel pretty good. The electromechanical parts come from a Chinese producer of various kinds of hand controls, the rotational controller being the most difficult one to build.

Soyuz-like flight controllers, developed from scratch

While the whole simulator could run as described, as a collection of computers, monitors and the flight controllers' hardware, the whole experience assumes a different dimension when everything is integrated behind a full-size photographic replica of the Soyuz main control panel. This was the next step, and it gave the simulator an additional dimension of realism that every user appreciated.

ESA astronaut Paolo Nespoli provided many personal inputs to verify the dynamical models of the simulator

To build this we found and bought an undistorted high-resolution photo of a Soyuz TMA main control panel (which is called Neptune). The Soyuz TMA introduced digital displays and many very interesting updates to the spacecraft avionics while preserving its onboard-systems heritage. The photo was printed on adhesive paper at real size (thanks to ESA astronaut Paolo Nespoli, who found the dimensions for us) and stuck to a wooden panel with a supporting frame. The final result is outstanding: seen from a distance it looks real!

Finding LCD displays of the right size was very difficult, but with some tricks we were able to fit them into the holes of the panel while more or less respecting their original size. The left LCD is the console of the simulator and also shows a Soyuz-inspired user interface. The centre LCD shows the camera view full screen, with the numerical data.

Another view is provided by an additional display, positioned under the main panel, to replicate the Soyuz periscope view, which is actually the main view used by the pilot-astronaut during manual docking operations. A dedicated mini-PC is required for each external view, so the entire simulator now runs with:

  • one Raspberry Pi computer, or any small Linux computer, for the main simulator
  • two miniPCs for the external views
  • one LAN hub to interconnect everything

It should be noted that the camera view is not aligned with the dedicated camera docking target on the ISS, because the 3D model we were using did not include it and we had no way to update it (one of the problems of working with volunteers who disappear over time). So the camera view is actually aligned with the same docking target used for the periscope, a minor compromise that few members of the public would notice.

The current version of the simulation software replicates the different control modes (some not present on the Soyuz have been added for research purposes) and the dynamic responses are computed from original data on the spacecraft and its thrusters. The overall behaviour has been tested by a few astronauts, in particular by Paolo Nespoli, who spent a few hours helping on different occasions, even using his own training notes to verify our work.
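To illustrate what one of the simpler control modes might look like, the sketch below implements a basic rate-damping logic, where thrusters fire to null the measured angular rates whenever they exceed a deadband. The deadband value and the thruster command convention are made-up figures for the example, not the original Soyuz data used by the simulator.

RATE_DEADBAND = 0.05                       # deg/s, assumed value

def rate_damping(angular_rates_deg_s):
    """Return a per-axis thruster command (-1, 0 or +1) that opposes rotation."""
    commands = []
    for rate in angular_rates_deg_s:
        if rate > RATE_DEADBAND:
            commands.append(-1)            # fire to slow a positive rotation
        elif rate < -RATE_DEADBAND:
            commands.append(+1)            # fire to slow a negative rotation
        else:
            commands.append(0)             # inside the deadband: no firing
    return commands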

Soyuz docking simulator, one of two existing copies.

In a future version of the Soyuz simulator we would like to better replicate the display formats shown on the two main displays. The tricky one is the central display, which during rendezvous and docking operations shows the spacecraft's Format 44, mixing an external view (from the camera) with interface elements to control and monitor the vehicle. At the same time we would like to redesign the external views, with a more complete model of the ISS and its full dynamic simulation.

Wait, there is more …

BIS spaceflight simulators are always under continuous development, and they are a good proving ground for students and enthusiasts who want to acquire skills or contribute their professional experience to help us improve them. Please contact the BIS if you want to contribute.

We also have two other simulators in development, one of which has already been used at many events: the Apollo CSM Flight Control System simulator. You will be able to read about that project in the next article of this short series!

2024-04 (submitted 01/04/2024)