
Revised Documentation for Accuracy #25

Merged 17 commits on May 20, 2024.

Commits:
- `3d02392` dmccoystephenson (May 8, 2024): Fixed a typo in the README (Metatdata -> Metadata)
- `9f6e025` dmccoystephenson (May 8, 2024): Adjusted headers in README to improve clarity
- `e46229a` dmccoystephenson (May 8, 2024): Removed outdated 'Project Management' section in README
- `e321d24` dmccoystephenson (May 8, 2024): Moved 'Confluent Cloud Integration' section to bottom of README
- `401ffe5` dmccoystephenson (May 8, 2024): Adjusted headers & added table of contents to `installation.md`
- `223c81c` dmccoystephenson (May 8, 2024): Moved 'Docker Installation' section to top of `installation.md`
- `65bf902` dmccoystephenson (May 8, 2024): Revised docker & manual installation documentation in `installation.md`
- `84d39ea` dmccoystephenson (May 8, 2024): Adjusted headers in `interface.md`
- `509c835` dmccoystephenson (May 8, 2024): Fixed typo in `interface.md` (heirarchy -> hierarchy)
- `07da37c` dmccoystephenson (May 8, 2024): Fixed a typo in `configuration.md` (Metatdata -> Metadata)
- `9ac4e23` dmccoystephenson (May 8, 2024): Adjusted headers in `configuration.md`
- `727bfc7` dmccoystephenson (May 8, 2024): Revised 'ACM Testing with Kafka' section of `configuration.md`
- `1982b76` dmccoystephenson (May 8, 2024): Reorganized `testing.md` and added docker info
- `5ed3d97` dmccoystephenson (May 8, 2024): Added more instructions on env file handling to `installation.md`
- `03332b3` dmccoystephenson (May 9, 2024): Revised blurb in 'ACM Testing with Kafka' section of `configuration.md`
- `6d5ff25` dmccoystephenson (May 9, 2024): Added section on environment variables to `configuration.md`
- `57d1c7c` dmccoystephenson (May 9, 2024): Added link to 'Environment Variables' section of `configuration.md` t…
README.md (51 changes: 24 additions, 27 deletions)
@@ -6,7 +6,7 @@ one of two functions depending on how it is started:

1. **Decode**: This function is used to process messages *from* the connected
vehicle environment *to* ODE subscribers. Specifically, the ACM extracts binary
data from consumed messages (ODE Metadata Messages) and decodes the binary
ASN.1 data into a structure that is subsequently encoded into an alternative
format more suitable for ODE subscribers (currently XML using XER).

@@ -17,22 +17,25 @@ is subsequently *encoded into ASN.1 binary data*.

![ASN.1 Codec Operations](docs/graphics/asn1codec-operations.png)

## Table of Contents

**README.md**
1. [Release Notes](#release-notes)
1. [Getting Involved](#getting-involved)
1. [Documentation](#documentation)
1. [Generating C Files from ASN.1 Definitions](#generating-c-files-from-asn1-definitions)
1. [Confluent Cloud Integration](#confluent-cloud-integration)

**Other Documents**
1. [Installation](docs/installation.md)
1. [Configuration and Operation](docs/configuration.md)
1. [Interface](docs/interface.md)
1. [Testing](docs/testing.md)

## Release Notes
The current version and release history of the asn1_codec: [asn1_codec Release Notes](<docs/Release_notes.md>)

## Getting Involved

This project is sponsored by the U.S. Department of Transportation and supports Operational Data Environment data type
conversions. Here are some ways you can start getting involved in this project:
@@ -42,7 +45,7 @@ conversions. Here are some ways you can start getting involved in this project:
- If you would like to improve this code base or the documentation, [fork the project](https://github.com/usdot-jpo-ode/asn1_codec#fork-destination-box) and submit a pull request.
- If you find a problem with the code or the documentation, please submit an [issue](https://github.com/usdot-jpo-ode/asn1_codec/issues/new).

### Introduction

This project uses the [Pull Request Model](https://help.github.com/articles/using-pull-requests). This involves the following project components:

@@ -56,7 +59,7 @@ request to the organization asn1_codec project. One of the project's main developer
or, if there are issues, discuss them with the submitter. This will ensure that the developers have a better
understanding of the code base *and* we catch problems before they enter `master`. The following process should be followed:

### Initial Setup

1. If you do not have one yet, create a personal (or organization) account on GitHub (assume your account name is `<your-github-account-name>`).
1. Log into your personal (or organization) account.
@@ -66,19 +69,19 @@ understanding of the code base *and* we catch problems before they enter `master
$ git clone https://github.com/<your-github-account-name>/asn1_codec.git
```

### Additional Resources for Initial Setup

- [About Git Version Control](http://git-scm.com/book/en/v2/Getting-Started-About-Version-Control)
- [First-Time Git Setup](http://git-scm.com/book/en/Getting-Started-First-Time-Git-Setup)
- [Article on Forking](https://help.github.com/articles/fork-a-repo)

## Documentation

This documentation is in the `README.md` file. Additional information can be found using the links in the [Table of
Contents](#table-of-contents). All stakeholders are invited to provide input to these documents. Stakeholders should
direct all input on this document to the JPO Product Owner at DOT, FHWA, or JPO.

### Code Documentation

Code documentation can be generated using [Doxygen](https://www.doxygen.org) by following the commands below:

@@ -89,36 +92,30 @@ $ doxygen
```

The documentation is in HTML and is written to the `<install root>/asn1_codec/docs/html` directory. Open `index.html` in a
browser.

## Generating C Files from ASN.1 Definitions
Check here for instructions on how to generate C files from ASN.1 definitions: [ASN.1 C File Generation](asn1c_combined/README.md)

This should only be necessary if the ASN.1 definitions change. The generated files are already included in the repository.

## Confluent Cloud Integration
Rather than using a local kafka instance, the ACM can utilize an instance of kafka hosted by Confluent Cloud via SASL.

### Environment variables
#### Purpose & Usage
- The DOCKER_HOST_IP environment variable is used to communicate with the bootstrap server that the instance of Kafka is running on.
- The KAFKA_TYPE environment variable specifies what type of kafka connection will be attempted and is used to check if Confluent should be utilized.
- The CONFLUENT_KEY and CONFLUENT_SECRET environment variables are used to authenticate with the bootstrap server.

#### Values
- DOCKER_HOST_IP must be set to the bootstrap server address (excluding the port)
- KAFKA_TYPE must be set to "CONFLUENT"
- CONFLUENT_KEY must be set to the API key being utilized for CC
- CONFLUENT_SECRET must be set to the API secret being utilized for CC

### CC Docker Compose File
There is a provided docker-compose file (docker-compose-confluent-cloud.yml) that passes the above environment variables into the container that gets created. Further, this file doesn't spin up a local kafka instance since it is not required.

### Note
This has only been tested with Confluent Cloud but technically all SASL authenticated Kafka brokers can be reached using this method.
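To make the Confluent wiring above concrete, here is a small shell sketch; every value is an invented placeholder (the `pkc-...` bootstrap address, key, and secret are illustrative, not working credentials):

```shell
# Placeholder values only; substitute your own Confluent Cloud bootstrap
# address (without the port), API key, and API secret.
export DOCKER_HOST_IP="pkc-xxxxx.us-east-1.aws.confluent.cloud"
export KAFKA_TYPE="CONFLUENT"
export CONFLUENT_KEY="my-api-key"
export CONFLUENT_SECRET="my-api-secret"

# Per the docs above, the Confluent path is only taken when KAFKA_TYPE
# is set to "CONFLUENT"; otherwise a local kafka instance is assumed.
if [ "$KAFKA_TYPE" = "CONFLUENT" ]; then
  echo "Confluent Cloud mode: bootstrap server $DOCKER_HOST_IP"
fi
```

With these exported, the provided `docker-compose-confluent-cloud.yml` is described as passing the same four variables into the container it creates.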
docs/configuration.md (61 changes: 40 additions, 21 deletions)
@@ -1,12 +1,12 @@
# ACM Configuration & Operation

The ASN.1 Codec Module (ACM) processes Kafka data streams that present [ODE
Metadata](http://github.com/usdot-jpo-ode/jpo-ode/blob/develop/docs/Metadata_v3.md) wrapped ASN.1 data. It can perform
one of two functions depending on how it is started:

1. **Decode**: This function is used to process messages *from* the connected
vehicle environment *to* ODE subscribers. Specifically, the ACM extracts binary
data from consumed messages (ODE Metadata Messages) and decodes the binary
ASN.1 data into a structure that is subsequently encoded into an alternative
format more suitable for ODE subscribers (currently XML using XER).

@@ -15,7 +15,7 @@ the connected vehicle environment. Specifically, the ACM extracts
human-readable data from ODE Metadata and decodes it into a structure that
is subsequently *encoded into ASN.1 binary data*.

![ASN.1 Codec Operations](graphics/asn1codec-operations.png)

## ACM Command Line Options

@@ -41,7 +41,23 @@ line options override parameters specified in the configuration file.** The foll
-v | --log-level : The info log level [trace,debug,info,warning,error,critical,off]
```

## Environment Variables
The following environment variables are used by the ACM:

| Variable | Description |
| --- | --- |
| `DOCKER_HOST_IP` | The IP address of the machine running the kafka cluster. |
| `ACM_LOG_TO_CONSOLE` | Whether or not to log to the console. |
| `ACM_LOG_TO_FILE` | Whether or not to log to a file. |
| `ACM_LOG_LEVEL` | The log level to use. Valid values are: "DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL", "OFF" |
| `KAFKA_TYPE` | If unset, a local kafka broker will be targeted. If set to "CONFLUENT", the application will target a Confluent Cloud cluster. |
| `CONFLUENT_KEY` | Confluent Cloud Integration (if KAFKA_TYPE is set to "CONFLUENT") |
| `CONFLUENT_SECRET` | Confluent Cloud Integration (if KAFKA_TYPE is set to "CONFLUENT") |

The `sample.env` file contains the default values for some of these environment variables. To use these values, copy the `sample.env` file to `.env` and modify the values as needed.
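For illustration, a filled-in `.env` might look like the following; the values below are invented placeholders rather than the actual contents of `sample.env`:

```shell
# Hypothetical .env contents; every value here is a placeholder.
DOCKER_HOST_IP=192.168.1.10
ACM_LOG_TO_CONSOLE=true
ACM_LOG_TO_FILE=false
ACM_LOG_LEVEL=INFO
# Leave KAFKA_TYPE unset (or empty) to target a local kafka broker.
KAFKA_TYPE=
```

Only set `KAFKA_TYPE=CONFLUENT` (together with `CONFLUENT_KEY` and `CONFLUENT_SECRET`) when targeting a Confluent Cloud cluster, as described in the table above.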

## ACM Deployment

Once the ACM is [installed and configured](installation.md) it operates as a background service. The ACM can be started
before or after other services. If started before the other services, it may produce some error messages while it waits
@@ -55,7 +71,7 @@ Multiple ACM processes can be started. Each ACM will use its own configuration f
deploy a decoder and an encoder, start two separate ACM services with different `-T` options. In this case different
topics should be specified in the configuration files.

## ACM ASN.1 Data Sources

The ACM receives XML data from the ODE; the schema for this XML message is described in
[Metadata.md](https://github.com/usdot-jpo-ode/jpo-ode/blob/develop/docs/Metadata_v3.md) on the ODE. Currently, the ACM
@@ -84,23 +100,23 @@ can handle:

\* Denotes the message should already contain hex data, according to the ASN.1 specification for that message.
For instance, IEEE 1609.2 must contain hex data in its `unsecuredData` tag. If the hex data is missing or invalid,
the ACM will likely generate an error when doing constraint checking.

- After ENCODING this text is changed to: `us.dot.its.jpo.ode.model.OdeHexByteArray`
- After DECODING this text is changed to: `us.dot.its.jpo.ode.model.OdeXml`

Both the ENCODER and DECODER will check the ASN.1 constraints for the C structures that are built as data passes through
the module.

## ACM Kafka Limitations

With regard to the Apache Kafka architecture, each ACM process does **not** provide a way to take advantage of Kafka's scalable
architecture. In other words, each ACM process will consume data from a single Kafka topic and a single partition within
that topic. One way to consume topics with multiple partitions is to launch one ACM process for each partition; the
configuration file will allow you to designate the partition. In the future, the ACM may be updated to automatically
handle multiple partitions within a single topic.

## ACM Logging

ACM operations are optionally logged to the console and/or to a file. The file is a rotating log file, i.e., a set number of log files will
be used to record the ACM's information. By default, the file is in a `logs` directory from where the ACM is launched and the file is
@@ -136,7 +152,7 @@ preceded with a date and time stamp and the level of the log message.
[171011 18:25:55.221442] [trace] ending configure()
```

## ACM Configuration

The ACM configuration file is a text file with a prescribed format. It can be used to configure Kafka as well as the ACM.
Comments can be added to the configuration file by starting a line with the '#' character. Configuration lines consist
@@ -171,7 +187,7 @@ Example configuration files can be found in the [asn1_codec/config](../config) d

The details of the settings and how they affect the function of the ACM follow:

### ODE Kafka Interface

- `asn1.topic.producer` : The Kafka topic name where the ACM will write its output. **The name is case sensitive.**

Expand All @@ -194,20 +210,18 @@ The details of the settings and how they affect the function of the ACM follow:

- `compression.type` : The type of compression to use for writing to Kafka topics. Currently, this should be set to none.

## ACM Testing with Kafka

The necessary services for testing the ACM with Kafka are provided in the `docker-compose.yml` file. The following steps will guide you through the process of testing the ACM with Kafka.

1. Start the Kafka & ACM services via the provided `docker-compose.yml` file.
```
$ docker compose up --build -d
```

1. Exec into the Kafka container to gain access to the Kafka command line tools.
```
$ docker exec -it asn1_codec_kafka_1 /bin/bash
```

1. Use the provided `kafka-console-producer.sh` script (provided with the Apache Kafka installation) to send XML
@@ -225,4 +239,9 @@ $ ./bin/kafka-console-consumer.sh --bootstrap-server ${SERVER_IP} --topic ${ACM_
```

The log files will provide more information about the details of the processing that is taking place and document any
errors. To view log files, exec into the ACM container and use the `tail` command to view the log file.

```bash
$ docker exec -it asn1_codec_acm_1 /bin/bash
$ tail -f logs/log
```
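Since the compose file reads its settings from the environment, a small pre-flight check before running `docker compose up` can catch a missing variable early. This sketch is illustrative only; the exported values are placeholders, and the list of variables to check is an assumption based on the Environment Variables section above:

```shell
# Illustrative pre-flight check: verify the variables the compose file reads
# are present before bringing the stack up. Values below are placeholders.
export DOCKER_HOST_IP=192.168.1.10
export ACM_LOG_LEVEL=INFO

missing=""
for var in DOCKER_HOST_IP ACM_LOG_LEVEL; do
  # Indirectly read the variable named by $var.
  eval "val=\${$var:-}"
  if [ -z "$val" ]; then
    missing="$missing $var"
  fi
done

if [ -n "$missing" ]; then
  echo "Missing required variables:$missing"
  exit 1
fi
echo "Environment looks OK"
```

If a variable is missing, the script exits nonzero before any containers are started, which is usually easier to debug than a half-started stack.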