Register and serve an OSS embedding model

This notebook sets up the open source text embedding model e5-small-v2 on a Model Serving endpoint that can be used for vector search. The notebook:

  • Downloads the model from the Hugging Face Hub.
  • Registers it in the MLflow Model Registry.
  • Starts a Model Serving endpoint to serve the model.

The e5-small-v2 model is available at https://huggingface.co/intfloat/e5-small-v2.
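Note that, per the e5-small-v2 model card, inputs should be prefixed with `query: ` or `passage: ` for best retrieval quality. A small sketch of preparing inputs this way (the sample strings are illustrative only):

```python
# Per the e5-small-v2 model card, prepend "query: " to search queries and
# "passage: " to documents before encoding.
queries = ["What is MLflow?"]
passages = ["MLflow is an open source platform for the machine learning lifecycle."]

prefixed = [f"query: {q}" for q in queries] + [f"passage: {p}" for p in passages]
print(prefixed)
```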

For a list of the library versions included in Databricks Runtime, see the release notes for your Databricks Runtime version.

Install the Databricks Python SDK

This notebook uses the SDK's Python client to work with the serving endpoint.

%pip install -U databricks-sdk python-snappy
%pip install sentence-transformers
dbutils.library.restartPython()

Download the model

# Download model using the sentence_transformers library.
from sentence_transformers import SentenceTransformer

source_model_name = 'intfloat/e5-small-v2'  # model name on Hugging Face Hub
model = SentenceTransformer(source_model_name)
# Test the model, just to show it works.
sentences = ["This is an example sentence", "Each sentence is converted"]
embeddings = model.encode(sentences)
print(embeddings)
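The printed embeddings are dense vectors that are typically compared by cosine similarity. A minimal sketch of that comparison, using small placeholder vectors in place of the `model.encode` output:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of vector lengths.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Placeholder vectors standing in for two rows of `embeddings` above.
v1 = [0.1, 0.3, 0.5]
v2 = [0.2, 0.1, 0.4]
print(cosine_similarity(v1, v1))  # identical vectors score 1.0
print(cosine_similarity(v1, v2))
```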

Register the model to MLflow

import mlflow
mlflow.set_registry_uri("databricks-uc")

# Specify the catalog and schema to use. You must have USE_CATALOG privilege on the catalog and USE_SCHEMA and CREATE_MODEL privileges on the schema.
# Change the catalog and schema here if necessary.
catalog = "main"
schema = "default"
model_name = "e5-small-v2"
# MLflow model name. The Model Registry uses this name for the model.
registered_model_name = f"{catalog}.{schema}.{model_name}"
# Compute input and output schema.
signature = mlflow.models.signature.infer_signature(sentences, embeddings)
print(signature)
model_info = mlflow.sentence_transformers.log_model(
  model,
  artifact_path="model",
  signature=signature,
  input_example=sentences,
  registered_model_name=registered_model_name)
inference_test = ["I enjoy pies of both apple and cherry.", "I prefer cookies."]

# Load the custom model by providing the URI for where the model was logged.
loaded_model_pyfunc = mlflow.pyfunc.load_model(model_info.model_uri)

# Perform a quick test to ensure that the loaded model generates the correct output.
embeddings_test = loaded_model_pyfunc.predict(inference_test)
embeddings_test
# Extract the version of the model you just registered.
mlflow_client = mlflow.MlflowClient()

def get_latest_model_version(model_name):
  # Versions are returned as strings; compare them numerically.
  model_version_infos = mlflow_client.search_model_versions(f"name = '{model_name}'")
  return max(int(info.version) for info in model_version_infos)

model_version = get_latest_model_version(registered_model_name)
model_version
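The max-over-versions logic above can be exercised with plain data, no MLflow client needed (the sample version strings are hypothetical):

```python
def latest_version(version_strings):
    # Mirrors get_latest_model_version above: the registry returns version
    # numbers as strings, so compare them as integers.
    if not version_strings:
        raise ValueError("model has no registered versions")
    return max(int(v) for v in version_strings)

print(latest_version(["1", "3", "2"]))  # 3
```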

Create the model serving endpoint

For more details, see Create foundation model serving endpoints.

Note: This example creates a small CPU endpoint that scales down to zero, which is suited to quick, small tests. For more realistic use cases, consider using a GPU endpoint for faster embedding computation, and disabling scale-to-zero if the endpoint is queried frequently, since Model Serving endpoints incur some cold-start latency.
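For illustration, a GPU-backed configuration for a heavier workload might look like the sketch below. The `workload_type` value ("GPU_SMALL") is an assumption here; check the Model Serving documentation for the workload types available in your workspace.

```python
# Hedged sketch of a GPU endpoint config; the workload_type value is an
# assumption -- verify the supported types for your workspace.
gpu_endpoint_config_dict = {
    "served_entities": [
        {
            "entity_name": "main.default.e5-small-v2",
            "entity_version": 1,
            "workload_type": "GPU_SMALL",
            "workload_size": "Small",
            # Keep the endpoint warm for frequently queried workloads to
            # avoid cold-start latency.
            "scale_to_zero_enabled": False,
        }
    ]
}
print(gpu_endpoint_config_dict)
```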

endpoint_name = "e5-small-v2"  # Name of endpoint to create
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import EndpointCoreConfigInput

w = WorkspaceClient()
endpoint_config_dict = {
    "served_entities": [
        {
            "name": f'{registered_model_name.replace(".", "_")}_{1}',
            "entity_name": registered_model_name,
            "entity_version": model_version,
            "workload_type": "CPU",
            "workload_size": "Small",
            "scale_to_zero_enabled": True,
        }
    ]
}

endpoint_config = EndpointCoreConfigInput.from_dict(endpoint_config_dict)

# The endpoint may take several minutes to get ready.
w.serving_endpoints.create_and_wait(name=endpoint_name, config=endpoint_config)

Query the endpoint

The create_and_wait command above waits until the endpoint is ready. You can also check the status of the serving endpoint in the Databricks UI.

For more information, see Query foundation models.

# Only run this command after the Model Serving endpoint is in the Ready state.
import time

start = time.time()

# If the endpoint is not yet ready, you might get a timeout error. If so, wait and then rerun the command.
endpoint_response = w.serving_endpoints.query(name=endpoint_name, dataframe_records=['Hello world', 'Good morning'])

end = time.time()

print(endpoint_response)
print(f'Time taken for querying endpoint in seconds: {end-start}')
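Besides the SDK, the endpoint can also be queried over REST. A sketch of building the invocation URL and JSON payload; the workspace host and token are placeholders, not real values:

```python
import json

workspace_url = "https://<workspace-host>"  # placeholder -- your workspace URL
endpoint_name = "e5-small-v2"
invocations_url = f"{workspace_url}/serving-endpoints/{endpoint_name}/invocations"

# Same records as the SDK query above, wrapped in the JSON body that
# Model Serving accepts.
payload = json.dumps({"dataframe_records": ["Hello world", "Good morning"]})
print(invocations_url)
print(payload)
# POST the payload with an "Authorization: Bearer <token>" header,
# e.g. via the requests library.
```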
