official/nlp/README.md: 2 additions & 3 deletions
@@ -55,8 +55,7 @@ Layers are the fundamental building blocks for NLP models. They can be used to a
 ### Networks
 
 Networks are combinations of `tf.keras` layers (and possibly other networks).
-They are `tf.keras` models that would not be trained alone.
-They are `tf.keras` models that would not be trained alone.It encapsulates common network structures like a transformer encoder into an easily handled object with a standardized configuration.
+They are `tf.keras` models that would not be trained alone. It encapsulates common network structures like a transformer encoder into an easily handled object with a standardized configuration.
 
 | Networks |
 | -------------- |
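
To make the hunk above concrete, here is a minimal sketch of how such an encoder network might be instantiated. It assumes the `BertEncoder` class under `official.nlp.modeling.networks` with the constructor arguments shown; exact names and defaults vary across versions of the Model Garden.

```python
# Minimal sketch (assumed API): instantiating one of the "networks" described
# in the hunk above. `BertEncoder` and its argument names are taken from
# official.nlp.modeling.networks and may differ between library versions.
import tensorflow as tf
from official.nlp.modeling import networks

# A transformer encoder packaged as a standalone tf.keras model with a
# standardized configuration.
encoder = networks.BertEncoder(
    vocab_size=30522,       # WordPiece vocabulary size (illustrative value)
    num_layers=4,           # number of transformer layers (small, for the sketch)
    hidden_size=256,
    num_attention_heads=4,
)

# It is a tf.keras.Model, but it is not trained alone; a task head
# (see the Models hunk below) is attached to it for training.
assert isinstance(encoder, tf.keras.Model)
```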
@@ -66,7 +65,7 @@ They are `tf.keras` models that would not be trained alone.It encapsulates commo
 
 ### Models
 
-Models are combinations of `tf.keras` layers and models that can be trained.Several pre-built canned models are provided to train encoder networks. These models are intended as both convenience functions and canonical examples.
+Models are combinations of `tf.keras` layers and models that can be trained. Several pre-built canned models are provided to train encoder networks. These models are intended as both convenience functions and canonical examples.
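
As an illustration of the "canned models" mentioned in this hunk, the sketch below wraps an encoder network in a classification head. It assumes the `BertClassifier` class under `official.nlp.modeling.models`; the exact signature depends on the library version.

```python
# Minimal sketch (assumed API): a canned model that combines an encoder
# network with a classification head and can be trained end to end.
# `BertClassifier` is taken from official.nlp.modeling.models; check your
# version of the Model Garden for the exact constructor arguments.
import tensorflow as tf
from official.nlp.modeling import models, networks

encoder = networks.BertEncoder(
    vocab_size=30522, num_layers=4, hidden_size=256, num_attention_heads=4)

# Unlike the bare encoder, this model is meant to be compiled and trained.
classifier = models.BertClassifier(network=encoder, num_classes=2)
classifier.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
```

From here, standard `tf.keras` training such as `classifier.fit(...)` applies, given appropriately tokenized inputs.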