more useful docs

This commit is contained in:
Robin Müller 2024-02-08 16:13:03 +01:00
parent 094c9502b2
commit f8f3d69402


@@ -20,24 +20,15 @@ Some additional explanation is provided for the various components:
The example includes a UDP and TCP server to receive telecommands and poll telemetry from. This
might be an optional component for an OBSW which is only used during the development phase on
ground. The UDP server is strongly based on the
[UDP TC server](https://docs.rs/satrs-core/0.1.0-alpha.1/satrs_core/hal/std/udp_server/struct.UdpTcServer.html).
This server component is wrapped by a TMTC server which forwards all telemetry to the last
connected client.

The TCP server is based on the
[TCP Spacepacket Server](https://docs.rs/satrs-core/0.1.0-alpha.1/satrs_core/hal/std/tcp_server/struct.TcpSpacepacketsServer.html)
class. It parses space packets by using the CCSDS space packet ID as the packet start delimiter.
All available telemetry will be sent back to a client after all telecommands have been read from
that client.
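The packet-ID delimiting used by the TCP server relies on the first two bytes of the CCSDS space packet primary header (version, type, secondary header flag and APID). A minimal sketch of extracting that packet ID is shown below; the helper name is hypothetical and not part of the satrs-core API.

```rust
/// Hypothetical helper (not part of satrs-core): extracts the 16-bit CCSDS
/// packet ID (version, type, secondary header flag and APID) from the first
/// two bytes of a space packet primary header.
fn ccsds_packet_id(raw: &[u8]) -> Option<u16> {
    if raw.len() < 2 {
        return None;
    }
    Some(u16::from_be_bytes([raw[0], raw[1]]))
}

fn main() {
    // 0x18 0x73: version 0, type TC (1), secondary header flag set, APID 0x073.
    let packet = [0x18u8, 0x73, 0xC0, 0x00, 0x00, 0x00];
    assert_eq!(ccsds_packet_id(&packet), Some(0x1873));
    println!("packet ID: {:#06x}", ccsds_packet_id(&packet).unwrap());
}
```

A server can scan an incoming byte stream for one of its known packet IDs and treat it as the start of the next space packet.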
### TMTC Infrastructure
@@ -48,6 +39,37 @@ The most important components of the TMTC infrastructure include the following components:
- A TM sink component which is the target of all sent telemetry and sends it to downlink
  handlers like the UDP and TCP server.
You can read the [Communications chapter](#communication-with-sat-rs-based-software) for more
background information on the chosen TMTC infrastructure approach.
### PUS Service Components
A PUS service stack is provided which exposes some functionality conformant with the ECSS PUS
service. This currently includes the following services:
- Service 1 for telecommand verification. Verification is handled locally: each
component which generates verification telemetry in some shape or form receives a
[reporter](https://docs.rs/satrs-core/0.1.0-alpha.1/satrs_core/pus/verification/struct.VerificationReporterWithSender.html)
object which can be used to send PUS 1 verification telemetry to the TM funnel.
- Service 3 for housekeeping telemetry handling.
- Service 5 for management and downlink of on-board events.
- Service 8 for handling on-board actions.
- Service 11 for scheduling telecommands to be released at a specific time. This component
uses the [PUS scheduler class](https://docs.rs/satrs-core/0.1.0-alpha.1/satrs_core/pus/scheduler/alloc_mod/struct.PusScheduler.html)
which performs the core logic of scheduling telecommands. All telecommands released by the
scheduler are sent to the central TC source via a message.
- Service 17 for test purposes like pings.
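The core idea of the Service 11 scheduler can be illustrated with a time-ordered map: telecommands are stored sorted by release time, and all commands whose release time has passed are drained and forwarded. This is only a sketch; the real `PusScheduler` in satrs-core has a different API.

```rust
use std::collections::BTreeMap;

/// Illustrative sketch only, not the satrs-core PusScheduler API.
struct TcScheduler {
    pending: BTreeMap<u64, Vec<Vec<u8>>>, // release time (s) -> raw TCs
}

impl TcScheduler {
    fn new() -> Self {
        Self { pending: BTreeMap::new() }
    }

    fn insert_tc(&mut self, release_time: u64, tc: Vec<u8>) {
        self.pending.entry(release_time).or_default().push(tc);
    }

    /// Releases all telecommands due at `now`, forwarding each via `send`
    /// (in the example application this would be a message to the TC source).
    fn release_due(&mut self, now: u64, mut send: impl FnMut(Vec<u8>)) -> usize {
        let due: Vec<u64> = self.pending.range(..=now).map(|(t, _)| *t).collect();
        let mut released = 0;
        for t in due {
            for tc in self.pending.remove(&t).unwrap() {
                send(tc);
                released += 1;
            }
        }
        released
    }
}

fn main() {
    let mut sched = TcScheduler::new();
    sched.insert_tc(10, vec![0x18, 0x73]);
    sched.insert_tc(20, vec![0x18, 0x74]);
    let mut sent = Vec::new();
    // At time 15, only the first TC is due.
    let n = sched.release_due(15, |tc| sent.push(tc));
    assert_eq!(n, 1);
    assert_eq!(sent[0], vec![0x18, 0x73]);
}
```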
### Event Management Component
An [event manager](https://docs.rs/satrs-core/0.1.0-alpha.1/satrs_core/event_man/index.html)
is provided to handle the event IPC and FDIR mechanism. The event messages are converted to PUS 5
telemetry by the
telemetry by the
[PUS event dispatcher](https://docs.rs/satrs-core/0.1.0-alpha.1/satrs_core/pus/event_man/alloc_mod/struct.PusEventDispatcher.html).
You can read the [events](#events) chapter for more in-depth information about event management.
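The routing role of an event manager can be sketched with standard channels: components publish event IDs to one channel, and the manager forwards each event to all subscribed listeners (such as a PUS 5 handler). The types below are illustrative assumptions, not the satrs-core `EventManager` API.

```rust
use std::collections::HashMap;
use std::sync::mpsc;

type EventId = u32;

/// Simplified sketch of the event manager pattern (not the satrs-core API).
struct EventManager {
    rx: mpsc::Receiver<EventId>,
    subscribers: HashMap<EventId, Vec<mpsc::Sender<EventId>>>,
}

impl EventManager {
    fn subscribe(&mut self, event: EventId, listener: mpsc::Sender<EventId>) {
        self.subscribers.entry(event).or_default().push(listener);
    }

    /// Drains all queued events and routes each to its subscribers.
    fn try_process_all(&mut self) {
        while let Ok(event) = self.rx.try_recv() {
            if let Some(listeners) = self.subscribers.get(&event) {
                for listener in listeners {
                    listener.send(event).unwrap();
                }
            }
        }
    }
}

fn main() {
    let (event_tx, event_rx) = mpsc::channel();
    let mut mgr = EventManager { rx: event_rx, subscribers: HashMap::new() };

    // A hypothetical PUS 5 handler subscribes to event 42.
    let (pus5_tx, pus5_rx) = mpsc::channel();
    mgr.subscribe(42, pus5_tx);

    event_tx.send(42).unwrap();
    mgr.try_process_all();
    assert_eq!(pus5_rx.try_recv(), Ok(42));
}
```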
### Sample Application Components
These components are mission-specific examples. They provide an idea of how mission-specific modules
@@ -60,3 +82,29 @@ would look like in the sat-rs context. It currently includes the following components:
The interaction of the various components is provided in the following diagram:
![satrs-example dataflow diagram](images/satrs-example/satrs-example-dataflow.png)
An explanation for the most important component groups is given below.
#### TMTC component group
This group is the primary interface for clients to communicate with the on-board software
using a standardized TMTC protocol. The example uses the
[ECSS PUS protocol](https://ecss.nl/standard/ecss-e-st-70-41c-space-engineering-telemetry-and-telecommand-packet-utilization-15-april-2016/).
In the future, this might be extended with the
[CCSDS File Delivery Protocol](https://public.ccsds.org/Pubs/727x0b5.pdf).
A client can connect to the UDP or TCP server to send these PUS packets to the on-board software.
These servers then forward the telecommands to a centralized TC source component using a PUS TC
message.
This TC source component then demultiplexes the message and forwards it to the relevant component.
Right now, it forwards all PUS requests to the respective PUS service handlers, which run in a
separate thread. In the future, additional forwarding to components like a CFDP handler might be
added as well.
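The demultiplexing step performed by the TC source can be sketched as a dispatch on the PUS service type, which for a PUS TC sits in the second byte of the secondary header, right after the 6-byte CCSDS primary header. All names and the channel layout below are illustrative assumptions, not the satrs-example API.

```rust
use std::sync::mpsc;

/// Illustrative helper: reads the PUS service type of a telecommand.
/// The secondary header starts at offset 6; its second byte is the service.
fn pus_service(tc: &[u8]) -> Option<u8> {
    tc.get(7).copied()
}

fn main() {
    let (svc17_tx, svc17_rx) = mpsc::channel::<Vec<u8>>();
    let (svc11_tx, _svc11_rx) = mpsc::channel::<Vec<u8>>();

    // Minimal fake ping TC: 6-byte primary header followed by a
    // secondary header [version/ack flags, service 17, subservice 1].
    let ping_tc = vec![0x18, 0x73, 0xC0, 0x00, 0x00, 0x03, 0x2F, 17, 1];

    // Demultiplex based on the PUS service number and forward to the
    // matching service handler channel.
    match pus_service(&ping_tc) {
        Some(17) => svc17_tx.send(ping_tc.clone()).unwrap(),
        Some(11) => svc11_tx.send(ping_tc.clone()).unwrap(),
        _ => eprintln!("unknown service"),
    }
    assert_eq!(svc17_rx.try_recv().unwrap(), ping_tc);
}
```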
All telemetry generated by the on-board software is sent to a centralized TM funnel. This component
also performs a demultiplexing step to forward all telemetry to the relevant TM recipients.
In the example case, this is the last connected UDP client or a connected TCP client. In the future,
a forwarding to a persistent telemetry store and a simulated communication component might be
added here as well. The centralized TM funnel also takes care of some packet processing steps which
need to be applied for each ECSS PUS packet.
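The funnel pattern above can be sketched as follows: all telemetry arrives on a single channel, a common processing step is applied to each packet (here, as an assumed example, stamping the CCSDS sequence flags and count into bytes 2..4 of the primary header), and the result is forwarded to every registered downlink sink. The types are illustrative, not the satrs-example implementation.

```rust
use std::sync::mpsc;

/// Illustrative TM funnel sketch (not the satrs-example types).
struct TmFunnel {
    rx: mpsc::Receiver<Vec<u8>>,
    sinks: Vec<mpsc::Sender<Vec<u8>>>,
    seq_count: u16,
}

impl TmFunnel {
    /// Processes one queued TM packet, if any: applies the common
    /// processing step and forwards the packet to all sinks.
    fn process_one(&mut self) {
        if let Ok(mut tm) = self.rx.try_recv() {
            // Common processing: sequence flags 0b11 (unsegmented) plus a
            // 14-bit sequence count in bytes 2..4 of the CCSDS header.
            let word = 0xC000 | (self.seq_count & 0x3FFF);
            tm[2..4].copy_from_slice(&word.to_be_bytes());
            self.seq_count = self.seq_count.wrapping_add(1);
            for sink in &self.sinks {
                sink.send(tm.clone()).unwrap();
            }
        }
    }
}

fn main() {
    let (tm_tx, tm_rx) = mpsc::channel();
    let (udp_tx, udp_rx) = mpsc::channel();
    let mut funnel = TmFunnel { rx: tm_rx, sinks: vec![udp_tx], seq_count: 5 };

    tm_tx.send(vec![0x08, 0x73, 0x00, 0x00, 0x00, 0x00]).unwrap();
    funnel.process_one();
    let out = udp_rx.try_recv().unwrap();
    assert_eq!(&out[2..4], &[0xC0, 0x05]);
}
```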