"""Persist the metadata and materialize the feature group to the feature store
or insert data from a dataframe into the existing feature group.
@@ -2932,6 +2933,7 @@ def insert(
Shortcut for read_options `{"wait_for_job": False}`.
transformation_context: `Dict[str, Any]` A dictionary mapping variable names to objects that will be provided as contextual information to the transformation function at runtime.
These variables must be explicitly defined as parameters in the transformation function to be accessible during execution. If no context variables are provided, this parameter defaults to `None`.
+    transform: `bool`. When set to `False`, the dataframe is inserted without applying any on-demand transformations. In this case, all required on-demand features must already exist in the provided dataframe. Defaults to `True`.
# Returns
(`Job`, `ValidationReport`) A tuple with job information if python engine is used and the validation report if validation is enabled.
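The effect of `transform=False` can be sketched in plain pandas: when on-demand transformations are skipped, the caller is responsible for materializing those feature columns before calling `insert`. Here `amount_usd` and the conversion rate are hypothetical, purely for illustration:

```python
import pandas as pd

# Hypothetical source data; `amount_usd` stands in for an on-demand feature.
df = pd.DataFrame({"id": [1, 2], "amount_eur": [10.0, 20.0]})

# With transform=False the engine will not compute on-demand features, so
# the caller precomputes them in the dataframe beforehand:
EUR_TO_USD = 1.1  # illustrative constant, not part of the hsfs API
df["amount_usd"] = df["amount_eur"] * EUR_TO_USD

# fg.insert(df, transform=False)  # requires a live feature store connection
```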
to control whether the expectation suite of the feature group should be fetched before every insert.
transformation_context: `Dict[str, Any]` A dictionary mapping variable names to objects that will be provided as contextual information to the transformation function at runtime.
These variables must be explicitly defined as parameters in the transformation function to be accessible during execution. If no context variables are provided, this parameter defaults to `None`.
+    transform: `bool`. When set to `False`, the dataframe is inserted without applying any on-demand transformations. In this case, all required on-demand features must already exist in the provided dataframe. Defaults to `True`.
# Returns
(`Job`, `ValidationReport`) A tuple with job information if python engine is used and the validation report if validation is enabled.
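The `transformation_context` contract described above — context variables must be declared as explicit parameters of the transformation function — can be illustrated with plain Python. The function and variable names here are hypothetical, not part of the hsfs API:

```python
# A transformation function that declares `scaling_factor` as a parameter,
# so it can receive that value from transformation_context at runtime.
def scale_amount(amount: float, scaling_factor: float) -> float:
    return amount * scaling_factor

# Conceptually, the engine looks up each extra parameter by name in the
# context dict and passes it through when the transformation runs.
transformation_context = {"scaling_factor": 2.0}
result = scale_amount(10.0, scaling_factor=transformation_context["scaling_factor"])
```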
@@ -3117,6 +3122,7 @@ def multi_part_insert(
write_options or {},
validation_options or {},
transformation_context,
+    transform=transform,
)
def finalize_multi_part_insert(self) -> None:
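The `multi_part_insert` / `finalize_multi_part_insert` pair suggests a chunked-upload pattern on the caller's side. A minimal sketch assuming a pandas dataframe — the hsfs calls are commented out because they need a live feature store, and the chunk size is arbitrary:

```python
import pandas as pd

df = pd.DataFrame({"id": range(10), "value": range(10)})
chunk_size = 4  # arbitrary, for illustration only

# Split the dataframe into fixed-size chunks.
chunks = [df.iloc[i : i + chunk_size] for i in range(0, len(df), chunk_size)]

# for chunk in chunks:
#     fg.multi_part_insert(chunk, transform=False)  # reuses one ingestion job
# fg.finalize_multi_part_insert()  # wait for the job to finish and clean up
```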
@@ -3160,6 +3166,7 @@ def insert_stream(
checkpoint_dir: Optional[str] = None,
write_options: Optional[Dict[str, Any]] = None,
transformation_context: Dict[str, Any] = None,
+    transform: bool = True,
) -> TypeVar("StreamingQuery"):
"""Ingest a Spark Structured Streaming Dataframe to the online feature store.
@@ -3215,6 +3222,7 @@ def insert_stream(
Defaults to `{}`.
transformation_context: `Dict[str, Any]` A dictionary mapping variable names to objects that will be provided as contextual information to the transformation function at runtime.
These variables must be explicitly defined as parameters in the transformation function to be accessible during execution. If no context variables are provided, this parameter defaults to `None`.
+    transform: `bool`. When set to `False`, the dataframe is inserted without applying any on-demand transformations. In this case, all required on-demand features must already exist in the provided dataframe. Defaults to `True`.
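Putting the streaming signature together, a caller-side sketch of passing the new flag to `insert_stream`. Only the argument dict is built here, since the actual call needs a Spark session and a Hopsworks connection; the checkpoint path and the dataframe/feature-group names are hypothetical:

```python
# Arguments a caller might pass, matching the insert_stream signature above.
stream_kwargs = dict(
    checkpoint_dir="/tmp/checkpoints/fg_v1",  # hypothetical checkpoint path
    write_options={},
    transformation_context=None,
    transform=False,  # on-demand features must already be in the stream df
)

# query = fg.insert_stream(streaming_df, **stream_kwargs)  # needs a cluster
# query.awaitTermination()
```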