docs/user_guides/fs/compute_engines.md

Hopsworks Feature Store APIs are built around dataframes: feature data is inserted into the Feature Store from a dataframe and, likewise, when data is read from the Feature Store, it is returned as a dataframe.

As such, Hopsworks supports four computational engines:

1. [Apache Spark](https://spark.apache.org): Spark Dataframes and Spark Structured Streaming Dataframes are supported, both from Python environments (PySpark) and from Scala environments.
2. [Pandas](https://pandas.pydata.org/): For pure Python environments without dependencies on Spark, Hopsworks supports [Pandas Dataframes](https://pandas.pydata.org/).
3. [Apache Flink](https://flink.apache.org): Flink Data Streams are currently supported as an experimental feature from Java/Scala environments.
4. [Apache Beam](https://beam.apache.org/) *experimental*: Beam Data Streams are currently supported as an experimental feature from Java/Scala environments.
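
For example, with the Python engine, feature data moves in and out of the Feature Store as Pandas dataframes. The following is a minimal sketch using the Hopsworks Python client; the feature group name, primary key, and example columns are hypothetical:

```python
import pandas as pd
import hopsworks

# Log in and get a reference to the project's feature store
# (no arguments are needed when running inside the platform).
project = hopsworks.login()
fs = project.get_feature_store()

# Hypothetical example data
df = pd.DataFrame({
    "account_id": [1, 2, 3],
    "amount": [10.5, 20.0, 7.25],
})

# Create the feature group on first use (or get it if it already exists),
# then insert the dataframe.
fg = fs.get_or_create_feature_group(
    name="transactions",        # hypothetical feature group name
    version=1,
    primary_key=["account_id"],
)
fg.insert(df)

# Reading the feature group returns a dataframe again.
df_read = fg.read()
```
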
Hopsworks supports running [compute on the platform itself](../../concepts/dev/inside.md) in the form of [Jobs](../projects/jobs/pyspark_job.md) or in [Jupyter Notebooks](../projects/jupyter/python_notebook.md).
Alternatively, you can also connect to Hopsworks using Python or Spark from [external environments](../../concepts/dev/outside.md), given that there is network connectivity.
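
When connecting from an external Python environment, the same client is used, but the cluster host and an API key are passed explicitly. A minimal sketch; the hostname, project name, and API key below are placeholders:

```python
import hopsworks

# Connect to a remote Hopsworks cluster (placeholder values).
project = hopsworks.login(
    host="my-cluster.hopsworks.ai",   # placeholder hostname
    port=443,
    project="my_project",             # placeholder project name
    api_key_value="MY_API_KEY",       # placeholder API key
)
fs = project.get_feature_store()
```
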
Apache Beam integration with the Hopsworks Feature Store has only been tested using the Dataflow runner.
For more details head over to the [Getting Started Guide](https://github.com/logicalclocks/hopsworks-tutorials/tree/master/integrations/java/beam).
## Java
It is also possible to interact with the Hopsworks Feature Store from pure Java environments, without dependencies on Spark, Flink, or Beam.
However, this is limited to retrieval of feature vector(s) from the online Feature Store.
For more details head over to the [Getting Started Guide](https://github.com/logicalclocks/hopsworks-tutorials/tree/master/java).