Commit 340e157

added new image (#1645)

* added new image
* Updated the Fabric instructions
* new images added
* Image titles added

1 parent 5bbed56 commit 340e157

File tree

6 files changed: +28 −13 lines changed

5 image files changed (157 KB, 161 KB, 179 KB, 207 KB, 78.3 KB)

docs/en/licensed_install.md

Lines changed: 28 additions & 13 deletions
@@ -1589,36 +1589,36 @@ Navigate to [MS Fabric](https://app.fabric.microsoft.com/) and sign in with your
</div><div class="h3-box" markdown="1">

### Step 2: Create a Lakehouse

- Go to the **Data Science** section.
- Navigate to the **Create** section.
- Create a new lakehouse (for instance, name it `jsl_workspace`).

![Create a Lakehouse](/assets/images/installation/Fabric_1.png)

</div><div class="h3-box" markdown="1">

### Step 3: Create a Notebook

- Similarly, create a new notebook (for instance, name it `JSL_Notebook`).

![Create a Notebook in Fabric](/assets/images/installation/Fabric_2.png)

</div><div class="h3-box" markdown="1">

### Step 4: Attach the Lakehouse

Attach the newly created lakehouse (`jsl_workspace`) to your notebook.

![Attach the Lakehouse](/assets/images/installation/355921285-63996c40-4cd6-4aa2-925f-a1ad886914f4.webp)

![Attach the Lakehouse](/assets/images/installation/355921392-b711eef6-55ed-4073-b974-14b565cd40be.webp)

</div><div class="h3-box" markdown="1">

### Step 5: Upload Files

Upload the necessary `.jar` and `.whl` files to the attached lakehouse.

![Upload Files to Fabric](/assets/images/installation/355921637-a275d80d-768f-4402-bdab-d95864e73690.webp)

![Upload Files to Fabric](/assets/images/installation/Fabric_3.png)

After uploading is complete, you can configure and run the notebook.
16241624
@@ -1631,21 +1631,30 @@ Configure the session within the notebook as follows:
%%configure -f
{
    "conf": {
        "spark.jsl.settings.aws.credentials.access_key_id": {
            "parameterName": "awsAccessKey",
            "defaultValue": "<AWS_ACCESS_KEY_ID>"
        },
        "spark.jsl.settings.aws.credentials.secret_access_key": {
            "parameterName": "awsSecretKey",
            "defaultValue": "<AWS_SECRET_ACCESS_KEY>"
        },
        "spark.yarn.appMasterEnv.SPARK_NLP_LICENSE": {
            "parameterName": "sparkNlpLicense",
            "defaultValue": "<SPARK_NLP_LICENSE>"
        },
        "spark.jars": {
            "parameterName": "sparkJars",
            "defaultValue": "abfss://&&&&&&/Files/spark-nlp-assembly-5.5.0.jar, abfss://&&&&&&/Files/spark-nlp-jsl-5.5.0.jar"
        },
        "spark.jsl.settings.pretrained.cache_folder": {
            "parameterName": "cacheFolder",
            "defaultValue": "abfss://&&&&&&/Files/unzip_files"
        },
        "spark.extraListeners": {
            "parameterName": "extraListener",
            "defaultValue": "com.johnsnowlabs.license.LicenseLifeCycleManager"
        }
    }
}
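Before submitting the `%%configure` block, it can help to sanity-check that every Spark key carries a `parameterName`/`defaultValue` pair and that no placeholder was left unreplaced. The sketch below is a hypothetical, plain-Python helper (not part of any Fabric or Spark NLP API); the dict mirrors the configuration above, with generic placeholders standing in for your own values:

```python
# Illustrative sketch: the same session configuration as a Python dict.
# Placeholder values in angle brackets are stand-ins you must replace.
session_conf = {
    "conf": {
        "spark.jsl.settings.aws.credentials.access_key_id": {
            "parameterName": "awsAccessKey",
            "defaultValue": "<AWS_ACCESS_KEY_ID>",
        },
        "spark.jsl.settings.aws.credentials.secret_access_key": {
            "parameterName": "awsSecretKey",
            "defaultValue": "<AWS_SECRET_ACCESS_KEY>",
        },
        "spark.yarn.appMasterEnv.SPARK_NLP_LICENSE": {
            "parameterName": "sparkNlpLicense",
            "defaultValue": "<SPARK_NLP_LICENSE>",
        },
        "spark.jars": {
            "parameterName": "sparkJars",
            "defaultValue": "<abfss-path-to-assembly-jar>,<abfss-path-to-jsl-jar>",
        },
        "spark.jsl.settings.pretrained.cache_folder": {
            "parameterName": "cacheFolder",
            "defaultValue": "<abfss-path-to-cache-folder>",
        },
        "spark.extraListeners": {
            "parameterName": "extraListener",
            "defaultValue": "com.johnsnowlabs.license.LicenseLifeCycleManager",
        },
    }
}

def unresolved_placeholders(conf: dict) -> list:
    """Return the Spark keys whose defaultValue still looks like a <placeholder>."""
    return [
        key
        for key, entry in conf["conf"].items()
        if entry["defaultValue"].startswith("<")
    ]

# Lists every key you still need to fill in before running the notebook.
print(unresolved_placeholders(session_conf))
```

Running the check on the template above flags all five placeholder entries; once you substitute real values, the list should come back empty.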
@@ -1658,6 +1667,7 @@ Configure the session within the notebook as follows:
Install the required Spark NLP libraries using pip commands:

```bash
%pip install <johnsnowlabs whl File API path>
%pip install <spark-nlp whl File API path>
%pip install <spark-nlp-jsl whl File API path>
```
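The `<... whl File API path>` placeholders refer to the ABFS paths of the wheels you uploaded in Step 5. As an aside, OneLake file paths generally follow a predictable `abfss://` layout; the helper below is a hypothetical sketch of that layout (the workspace and wheel names are illustrative, and the authoritative source remains the "File API path" you can copy from the file's context menu in the Fabric UI):

```python
# Hypothetical helper: assemble an ABFS path for a file uploaded to a
# lakehouse's Files section. The layout below follows the commonly documented
# OneLake pattern and is an assumption; prefer the path copied from the UI.
def onelake_file_path(workspace: str, lakehouse: str, filename: str) -> str:
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Files/{filename}"
    )

# Example with the lakehouse name used in this guide (jsl_workspace, Step 2);
# the workspace and wheel filename are made up for illustration.
path = onelake_file_path(
    "my_fabric_workspace", "jsl_workspace",
    "spark_nlp_jsl-5.5.0-py3-none-any.whl",
)
print(path)
```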
@@ -1754,4 +1764,9 @@ result = pipeline.annotate(text)
![Load the Model and Make Predictions](/assets/images/installation/355924362-f62b4bc5-96ee-41d5-a80b-887766b252c9.webp)

### Step 12: Run the pipeline with `.pretrained()` method

You can also run the pipelines without using the `.load()` or `.from_disk()` methods.

![Run the pipeline with `.pretrained()` method](/assets/images/installation/Fabric_4.png)

![Run the pipeline with `.pretrained()` method](/assets/images/installation/Fabric_5.png)
</div>
