[FEATURE] Shims or type stubs for pyspark.databricks #971

Open
@tigerhawkvok

Description

Problem Statement
When developing locally, importing pyspark.databricks provides no type hints or code completion.

Proposed Solution
Type stubs, ideally ones that redirect to the real pyspark.databricks functions, would be very helpful.

So databricks.sdk.pyspark_databricks would (recursively) shim access to members of pyspark.databricks.

In other words, something like

from databricks.sdk.pyspark_databricks.sql import functions as dbf

and

from pyspark.databricks.sql import functions as dbf

would be identical at runtime, except that the former would provide type hints locally whereas the latter does not.
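For illustration, here is a minimal sketch of what such a shim could look like, assuming the package layout proposed above (a databricks/sdk/pyspark_databricks/sql/functions.py module plus a sibling .pyi stub). The stubbed function name below is a hypothetical placeholder, not an actual member of pyspark.databricks:

```python
# databricks/sdk/pyspark_databricks/sql/functions.py  (hypothetical runtime shim)
# Re-export everything from the real module when it is available, i.e. when
# running on a Databricks cluster, so both import paths behave identically.
try:
    from pyspark.databricks.sql.functions import *  # noqa: F401,F403
except ImportError:
    # Local development without the Databricks runtime: names still resolve
    # for editors and type checkers via the accompanying .pyi stub.
    pass
```

```python
# databricks/sdk/pyspark_databricks/sql/functions.pyi  (hypothetical type stub)
# Signatures here are illustrative placeholders, not the real API surface.
from pyspark.sql.column import Column

def example_databricks_function(col: Column) -> Column: ...
```

With stubs like these shipped in the SDK, the local import would resolve for tooling while deferring to the real pyspark.databricks implementation on a cluster.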
