Commit

Added docstrings
Ark-kun committed Oct 19, 2021
1 parent 023c688 commit d346c1d
Showing 1 changed file with 346 additions and 0 deletions.
346 changes: 346 additions & 0 deletions google/cloud/aiplatform/models.py
@@ -2453,6 +2453,120 @@ def upload_xgboost_model_file(
staging_bucket: Optional[str] = None,
sync=True,
):
"""Uploads a model and returns a Model representing the uploaded Model
resource.
Note: This function is *experimental* and can be changed in the future.
Example usage::
my_model = Model.upload_xgboost_model_file(
model_file_path="iris.xgboost_model.bst"
)
Args:
model_file_path (str): Required. Local file path of the model.
xgboost_version (str): Optional. The version of the XGBoost serving container.
Supported versions: ["0.82", "0.90", "1.1", "1.2", "1.3", "1.4"].
If the version is not specified, the latest version is used.
display_name (str):
Optional. The display name of the Model. The name can be up to 128
characters long and can consist of any UTF-8 characters.
description (str):
The description of the model.
instance_schema_uri (str):
Optional. Points to a YAML file stored on Google Cloud
Storage describing the format of a single instance, which
is used in
``PredictRequest.instances``,
``ExplainRequest.instances``
and
``BatchPredictionJob.input_config``.
The schema is defined as an OpenAPI 3.0.2 `Schema
Object <https://tinyurl.com/y538mdwt#schema-object>`__.
AutoML Models always have this field populated by AI
Platform. Note: The URI given on output will be immutable
and probably different, including the URI scheme, than the
one given on input. The output URI will point to a location
where the user only has read access.
parameters_schema_uri (str):
Optional. Points to a YAML file stored on Google Cloud
Storage describing the parameters of prediction and
explanation via
``PredictRequest.parameters``,
``ExplainRequest.parameters``
and
``BatchPredictionJob.model_parameters``.
The schema is defined as an OpenAPI 3.0.2 `Schema
Object <https://tinyurl.com/y538mdwt#schema-object>`__.
AutoML Models always have this field populated by AI
Platform; if no parameters are supported, it is set to an
empty string. Note: The URI given on output will be
immutable and probably different, including the URI scheme,
than the one given on input. The output URI will point to a
location where the user only has read access.
prediction_schema_uri (str):
Optional. Points to a YAML file stored on Google Cloud
Storage describing the format of a single prediction
produced by this Model, which is returned via
``PredictResponse.predictions``,
``ExplainResponse.explanations``,
and
``BatchPredictionJob.output_config``.
The schema is defined as an OpenAPI 3.0.2 `Schema
Object <https://tinyurl.com/y538mdwt#schema-object>`__.
AutoML Models always have this field populated by AI
Platform. Note: The URI given on output will be immutable
and probably different, including the URI scheme, than the
one given on input. The output URI will point to a location
where the user only has read access.
explanation_metadata (explain.ExplanationMetadata):
Optional. Metadata describing the Model's input and output for explanation.
Both `explanation_metadata` and `explanation_parameters` must be
passed together when used. For more details, see
`Ref docs <http://tinyurl.com/1igh60kt>`__
explanation_parameters (explain.ExplanationParameters):
Optional. Parameters to configure explaining for Model's predictions.
For more details, see `Ref docs <http://tinyurl.com/1an4zake>`__
project (str):
Optional. Project to upload this model to. Overrides project set in
aiplatform.init.
location (str):
Optional. Location to upload this model to. Overrides location set in
aiplatform.init.
credentials (auth_credentials.Credentials):
Optional. Custom credentials to use to upload this model. Overrides
credentials set in aiplatform.init.
labels (Dict[str, str]):
Optional. The labels with user-defined metadata to
organize your Models.
Label keys and values can be no longer than 64
characters (Unicode codepoints), can only
contain lowercase letters, numeric characters,
underscores and dashes. International characters
are allowed.
See https://goo.gl/xmQnxf for more information
and examples of labels.
encryption_spec_key_name (Optional[str]):
Optional. The Cloud KMS resource identifier of the customer
managed encryption key used to protect the model. Has the
form:
``projects/my-project/locations/my-region/keyRings/my-kr/cryptoKeys/my-key``.
The key needs to be in the same region as where the compute
resource is created.
If set, this Model and all sub-resources of this Model will be secured by this key.
Overrides encryption_spec_key_name set in aiplatform.init.
staging_bucket (str):
Optional. Bucket to stage local model artifacts. Overrides
staging_bucket set in aiplatform.init.
Returns:
model: Instantiated representation of the uploaded model resource.
Raises:
ValueError: If only `explanation_metadata` or `explanation_parameters`
is specified.
"""
# https://cloud.google.com/vertex-ai/docs/predictions/pre-built-containers#xgboost
XGBOOST_SUPPORTED_VERSIONS = ["0.82", "0.90", "1.1", "1.2", "1.3", "1.4"]
XGBOOST_CONTAINER_IMAGE_URI_TEMPLATE = (
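The docstring states that when `xgboost_version` is omitted, the latest supported serving container version is used. A minimal sketch of that selection rule, assuming the helper name is hypothetical and only the version list comes from the diff:

```python
# Supported XGBoost serving container versions, copied from the diff above.
SUPPORTED_XGBOOST_VERSIONS = ["0.82", "0.90", "1.1", "1.2", "1.3", "1.4"]

def pick_xgboost_serving_version(requested=None):
    """Return the requested serving version, defaulting to the latest supported one."""
    if requested is None:
        # Documented behavior: no version specified -> latest version is used.
        return SUPPORTED_XGBOOST_VERSIONS[-1]
    if requested not in SUPPORTED_XGBOOST_VERSIONS:
        raise ValueError(f"Unsupported XGBoost version: {requested}")
    return requested

# Hypothetical upload call (requires GCP credentials, so it is not run here):
# from google.cloud import aiplatform
# model = aiplatform.Model.upload_xgboost_model_file(
#     model_file_path="iris.xgboost_model.bst",
#     xgboost_version=pick_xgboost_serving_version("1.4"),
# )
```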
@@ -2547,6 +2661,121 @@ def upload_scikit_learn_model_file(
staging_bucket: Optional[str] = None,
sync=True,
):
"""Uploads a model and returns a Model representing the uploaded Model
resource.
Note: This function is *experimental* and can be changed in the future.
Example usage::
my_model = Model.upload_scikit_learn_model_file(
model_file_path="iris.sklearn_model.joblib"
)
Args:
model_file_path (str): Required. Local file path of the model.
sklearn_version (str):
Optional. The version of the Scikit-learn serving container.
Supported versions: ["0.20", "0.22", "0.23", "0.24"].
If the version is not specified, the latest version is used.
display_name (str):
Optional. The display name of the Model. The name can be up to 128
characters long and can consist of any UTF-8 characters.
description (str):
The description of the model.
instance_schema_uri (str):
Optional. Points to a YAML file stored on Google Cloud
Storage describing the format of a single instance, which
is used in
``PredictRequest.instances``,
``ExplainRequest.instances``
and
``BatchPredictionJob.input_config``.
The schema is defined as an OpenAPI 3.0.2 `Schema
Object <https://tinyurl.com/y538mdwt#schema-object>`__.
AutoML Models always have this field populated by AI
Platform. Note: The URI given on output will be immutable
and probably different, including the URI scheme, than the
one given on input. The output URI will point to a location
where the user only has read access.
parameters_schema_uri (str):
Optional. Points to a YAML file stored on Google Cloud
Storage describing the parameters of prediction and
explanation via
``PredictRequest.parameters``,
``ExplainRequest.parameters``
and
``BatchPredictionJob.model_parameters``.
The schema is defined as an OpenAPI 3.0.2 `Schema
Object <https://tinyurl.com/y538mdwt#schema-object>`__.
AutoML Models always have this field populated by AI
Platform; if no parameters are supported, it is set to an
empty string. Note: The URI given on output will be
immutable and probably different, including the URI scheme,
than the one given on input. The output URI will point to a
location where the user only has read access.
prediction_schema_uri (str):
Optional. Points to a YAML file stored on Google Cloud
Storage describing the format of a single prediction
produced by this Model, which is returned via
``PredictResponse.predictions``,
``ExplainResponse.explanations``,
and
``BatchPredictionJob.output_config``.
The schema is defined as an OpenAPI 3.0.2 `Schema
Object <https://tinyurl.com/y538mdwt#schema-object>`__.
AutoML Models always have this field populated by AI
Platform. Note: The URI given on output will be immutable
and probably different, including the URI scheme, than the
one given on input. The output URI will point to a location
where the user only has read access.
explanation_metadata (explain.ExplanationMetadata):
Optional. Metadata describing the Model's input and output for explanation.
Both `explanation_metadata` and `explanation_parameters` must be
passed together when used. For more details, see
`Ref docs <http://tinyurl.com/1igh60kt>`__
explanation_parameters (explain.ExplanationParameters):
Optional. Parameters to configure explaining for Model's predictions.
For more details, see `Ref docs <http://tinyurl.com/1an4zake>`__
project (str):
Optional. Project to upload this model to. Overrides project set in
aiplatform.init.
location (str):
Optional. Location to upload this model to. Overrides location set in
aiplatform.init.
credentials (auth_credentials.Credentials):
Optional. Custom credentials to use to upload this model. Overrides
credentials set in aiplatform.init.
labels (Dict[str, str]):
Optional. The labels with user-defined metadata to
organize your Models.
Label keys and values can be no longer than 64
characters (Unicode codepoints), can only
contain lowercase letters, numeric characters,
underscores and dashes. International characters
are allowed.
See https://goo.gl/xmQnxf for more information
and examples of labels.
encryption_spec_key_name (Optional[str]):
Optional. The Cloud KMS resource identifier of the customer
managed encryption key used to protect the model. Has the
form:
``projects/my-project/locations/my-region/keyRings/my-kr/cryptoKeys/my-key``.
The key needs to be in the same region as where the compute
resource is created.
If set, this Model and all sub-resources of this Model will be secured by this key.
Overrides encryption_spec_key_name set in aiplatform.init.
staging_bucket (str):
Optional. Bucket to stage local model artifacts. Overrides
staging_bucket set in aiplatform.init.
Returns:
model: Instantiated representation of the uploaded model resource.
Raises:
ValueError: If only `explanation_metadata` or `explanation_parameters`
is specified.
"""
# https://cloud.google.com/vertex-ai/docs/predictions/pre-built-containers#scikit-learn
SKLEARN_SUPPORTED_VERSIONS = ["0.20", "0.22", "0.23", "0.24"]
SKLEARN_CONTAINER_IMAGE_URI_TEMPLATE = (
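Each of these docstrings documents a `ValueError` when only one of `explanation_metadata` and `explanation_parameters` is specified. A standalone sketch of that pairing check, assuming a hypothetical helper name (the constraint itself is taken from the Raises section):

```python
def check_explanation_args(explanation_metadata=None, explanation_parameters=None):
    """Raise ValueError unless both explanation arguments are given, or neither."""
    # XOR on "is None": exactly one of the two was supplied.
    if (explanation_metadata is None) != (explanation_parameters is None):
        raise ValueError(
            "`explanation_metadata` and `explanation_parameters` "
            "must be passed together."
        )

# Hypothetical upload call (requires GCP credentials, so it is not run here):
# from google.cloud import aiplatform
# model = aiplatform.Model.upload_scikit_learn_model_file(
#     model_file_path="iris.sklearn_model.joblib",
#     sklearn_version="0.24",
# )
```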
@@ -2640,6 +2869,123 @@ def upload_tensorflow_saved_model(
staging_bucket: Optional[str] = None,
sync=True,
):
"""Uploads a model and returns a Model representing the uploaded Model
resource.
Note: This function is *experimental* and can be changed in the future.
Example usage::
my_model = Model.upload_tensorflow_saved_model(
saved_model_dir="iris.tensorflow_model.SavedModel"
)
Args:
saved_model_dir (str): Required.
Local directory of the TensorFlow SavedModel.
tensorflow_version (str):
Optional. The version of the TensorFlow serving container.
Supported versions: ["0.15", "2.1", "2.2", "2.3", "2.4", "2.5", "2.6"].
If the version is not specified, the latest version is used.
use_gpu (bool): Whether to use GPU for model serving.
display_name (str):
Optional. The display name of the Model. The name can be up to 128
characters long and can consist of any UTF-8 characters.
description (str):
The description of the model.
instance_schema_uri (str):
Optional. Points to a YAML file stored on Google Cloud
Storage describing the format of a single instance, which
is used in
``PredictRequest.instances``,
``ExplainRequest.instances``
and
``BatchPredictionJob.input_config``.
The schema is defined as an OpenAPI 3.0.2 `Schema
Object <https://tinyurl.com/y538mdwt#schema-object>`__.
AutoML Models always have this field populated by AI
Platform. Note: The URI given on output will be immutable
and probably different, including the URI scheme, than the
one given on input. The output URI will point to a location
where the user only has read access.
parameters_schema_uri (str):
Optional. Points to a YAML file stored on Google Cloud
Storage describing the parameters of prediction and
explanation via
``PredictRequest.parameters``,
``ExplainRequest.parameters``
and
``BatchPredictionJob.model_parameters``.
The schema is defined as an OpenAPI 3.0.2 `Schema
Object <https://tinyurl.com/y538mdwt#schema-object>`__.
AutoML Models always have this field populated by AI
Platform; if no parameters are supported, it is set to an
empty string. Note: The URI given on output will be
immutable and probably different, including the URI scheme,
than the one given on input. The output URI will point to a
location where the user only has read access.
prediction_schema_uri (str):
Optional. Points to a YAML file stored on Google Cloud
Storage describing the format of a single prediction
produced by this Model, which is returned via
``PredictResponse.predictions``,
``ExplainResponse.explanations``,
and
``BatchPredictionJob.output_config``.
The schema is defined as an OpenAPI 3.0.2 `Schema
Object <https://tinyurl.com/y538mdwt#schema-object>`__.
AutoML Models always have this field populated by AI
Platform. Note: The URI given on output will be immutable
and probably different, including the URI scheme, than the
one given on input. The output URI will point to a location
where the user only has read access.
explanation_metadata (explain.ExplanationMetadata):
Optional. Metadata describing the Model's input and output for explanation.
Both `explanation_metadata` and `explanation_parameters` must be
passed together when used. For more details, see
`Ref docs <http://tinyurl.com/1igh60kt>`__
explanation_parameters (explain.ExplanationParameters):
Optional. Parameters to configure explaining for Model's predictions.
For more details, see `Ref docs <http://tinyurl.com/1an4zake>`__
project (str):
Optional. Project to upload this model to. Overrides project set in
aiplatform.init.
location (str):
Optional. Location to upload this model to. Overrides location set in
aiplatform.init.
credentials (auth_credentials.Credentials):
Optional. Custom credentials to use to upload this model. Overrides
credentials set in aiplatform.init.
labels (Dict[str, str]):
Optional. The labels with user-defined metadata to
organize your Models.
Label keys and values can be no longer than 64
characters (Unicode codepoints), can only
contain lowercase letters, numeric characters,
underscores and dashes. International characters
are allowed.
See https://goo.gl/xmQnxf for more information
and examples of labels.
encryption_spec_key_name (Optional[str]):
Optional. The Cloud KMS resource identifier of the customer
managed encryption key used to protect the model. Has the
form:
``projects/my-project/locations/my-region/keyRings/my-kr/cryptoKeys/my-key``.
The key needs to be in the same region as where the compute
resource is created.
If set, this Model and all sub-resources of this Model will be secured by this key.
Overrides encryption_spec_key_name set in aiplatform.init.
staging_bucket (str):
Optional. Bucket to stage local model artifacts. Overrides
staging_bucket set in aiplatform.init.
Returns:
model: Instantiated representation of the uploaded model resource.
Raises:
ValueError: If only `explanation_metadata` or `explanation_parameters`
is specified.
"""
# https://cloud.google.com/vertex-ai/docs/predictions/pre-built-containers#tensorflow
TENSORFLOW_SUPPORTED_VERSIONS = [
"0.15",
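The `labels` argument in each docstring constrains keys and values to at most 64 Unicode codepoints containing lowercase letters, digits, underscores, and dashes. A simplified, ASCII-only sketch of that check (the helper name is hypothetical, and the real service additionally allows international characters, which this sketch omits):

```python
import re

# At most 64 characters; lowercase letters, digits, underscores, dashes only.
_LABEL_PART = re.compile(r"^[a-z0-9_-]{1,64}$")

def labels_look_valid(labels):
    """Return True if every label key and value satisfies the simplified rules."""
    return all(
        _LABEL_PART.match(key) and _LABEL_PART.match(value)
        for key, value in labels.items()
    )

# Hypothetical upload call (requires GCP credentials, so it is not run here):
# from google.cloud import aiplatform
# model = aiplatform.Model.upload_tensorflow_saved_model(
#     saved_model_dir="iris.tensorflow_model.SavedModel",
#     labels={"env": "prod", "team": "ml-serving"},
# )
```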
