diff --git a/README.md b/README.md
index 1c2237946..71bfb35c6 100644
--- a/README.md
+++ b/README.md
@@ -74,13 +74,13 @@ You can link against this library in your program at the following coordinates:
-groupId: za.co.absa.cobrix artifactId: spark-cobol_2.11 version: 2.6.10
+groupId: za.co.absa.cobrix artifactId: spark-cobol_2.11 version: 2.6.11
-groupId: za.co.absa.cobrix artifactId: spark-cobol_2.12 version: 2.6.10
+groupId: za.co.absa.cobrix artifactId: spark-cobol_2.12 version: 2.6.11
-groupId: za.co.absa.cobrix artifactId: spark-cobol_2.13 version: 2.6.10
+groupId: za.co.absa.cobrix artifactId: spark-cobol_2.13 version: 2.6.11
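+
+For example, the same coordinates expressed as an sbt dependency (a minimal sketch; `%%` appends the `_2.11`/`_2.12`/`_2.13` suffix matching your Scala version):
+```scala
+libraryDependencies += "za.co.absa.cobrix" %% "spark-cobol" % "2.6.11"
+```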
@@ -91,17 +91,17 @@ This package can be added to Spark using the `--packages` command line option. F
### Spark compiled with Scala 2.11
```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.6.10
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.6.11
```
### Spark compiled with Scala 2.12
```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.6.10
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.6.11
```
### Spark compiled with Scala 2.13
```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.6.10
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.6.11
```
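+
+Once the package resolves, the `cobol` data source can be used straight from the shell. A minimal sketch, with placeholder paths for the copybook and the data file:
+```scala
+// Read mainframe records described by a COBOL copybook into a DataFrame
+val df = spark.read
+  .format("cobol")
+  .option("copybook", "/path/to/copybook.cpy") // placeholder path
+  .load("/path/to/data")                       // placeholder path
+
+df.printSchema()
+```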
## Usage
@@ -238,17 +238,17 @@ to decode various binary formats.
The jars that you need are:
-* spark-cobol_2.12-2.6.10.jar
-* cobol-parser_2.12-2.6.10.jar
+* spark-cobol_2.12-2.6.11.jar
+* cobol-parser_2.12-2.6.11.jar
* scodec-core_2.12-1.10.3.jar
* scodec-bits_2.12-1.1.4.jar
* antlr4-runtime-4.8.jar
After that, you can specify these jars on the `spark-shell` command line. Here is an example:
```
-$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.6.10
+$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.6.11
or
-$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.6.10.jar,cobol-parser_2.12-2.6.10.jar,scodec-core_2.12-1.10.3.jar,scodec-bits_2.12-1.1.4.jar,antlr4-runtime-4.8.jar
+$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.6.11.jar,cobol-parser_2.12-2.6.11.jar,scodec-core_2.12-1.10.3.jar,scodec-bits_2.12-1.1.4.jar,antlr4-runtime-4.8.jar
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
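+
+scala> // Optional smoke test: verify the Cobrix jars are on the classpath
+scala> sc.listJars().filter(_.contains("cobol")).foreach(println)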
@@ -315,11 +315,11 @@ Creating an uber jar for Cobrix is very easy. Steps to build:
You can collect the uber jar of `spark-cobol` either at
`spark-cobol/target/scala-2.11/` or at `spark-cobol/target/scala-2.12/`, depending on the Scala version you used.
-The fat jar will have '-bundle' suffix. You can also download pre-built bundles from https://github.com/AbsaOSS/cobrix/releases/tag/v2.6.10
+The fat jar will have the '-bundle' suffix. You can also download pre-built bundles from https://github.com/AbsaOSS/cobrix/releases/tag/v2.6.11
Then, run `spark-shell` or `spark-submit` adding the fat jar as the option.
```sh
-$ spark-shell --jars spark-cobol_2.12_3.3-2.6.11-SNAPSHOT-bundle.jar
+$ spark-shell --jars spark-cobol_2.12_3.3-2.6.12-SNAPSHOT-bundle.jar
```
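+
+With the bundle jar loaded you can also reference the data source by its fully qualified name, which doubles as a check that the jar is actually on the classpath. A minimal sketch (paths are placeholders):
+```scala
+// Equivalent to .format("cobol"), but bypasses the short-name registration
+val df = spark.read
+  .format("za.co.absa.cobrix.spark.cobol.source")
+  .option("copybook", "/path/to/copybook.cpy")
+  .load("/path/to/data")
+```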
> A note for building and running tests on Windows
@@ -1737,6 +1737,9 @@ A: Update hadoop dll to version 3.2.2 or newer.
Older versions
+- #### 2.6.11 released 8 January 2024.
+ - [#659](https://github.com/AbsaOSS/cobrix/issues/659) Fixed record length option when record id generation is turned on.
+
- #### 2.4.10 released 8 April 2022.
- [#481](https://github.com/AbsaOSS/cobrix/issues/481) ASCII control characters are now ignored instead of being replaced with spaces.
A new string trimming policy (`keep_all`) allows keeping all control characters in strings (including `0x00`).