
feat(vertex-ai) convert "example syntax" markdown samples to python #12980

Open · wants to merge 4 commits into main

Conversation

Valeriy-Burlaka (Member)

Description

Convert several Markdown samples to Python. The goal is to preserve the "spirit" of the original samples (i.e., to provide a very short snippet that only shows how to operate an API at a very high level) while making them actually copyable, testable, and working. That's why I even moved the init calls outside of the sample but added all the imports necessary for the sample to function.
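
To illustrate the pattern described above, here is a minimal, hypothetical sketch (the file layout, region tag name, model name, and prompt are assumptions for this example, not taken from the actual diff):

import os

import vertexai
from vertexai.language_models import TextEmbeddingModel

# Initialization stays outside the copyable sample, as described above.
PROJECT_ID = os.getenv("GOOGLE_CLOUD_PROJECT")
vertexai.init(project=PROJECT_ID, location="us-central1")


# [START hypothetical_region_tag]
def embed_text() -> list[float]:
    """Very short, high-level snippet that is still copyable and testable."""
    model = TextEmbeddingModel.from_pretrained("text-embedding-004")
    embeddings = model.get_embeddings(["What is life?"])
    return embeddings[0].values
# [END hypothetical_region_tag]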


Valeriy-Burlaka self-assigned this on Dec 10, 2024
Valeriy-Burlaka requested review from a team as code owners on December 10, 2024, 15:51
product-auto-label bot added the samples label on Dec 10, 2024
vertexai.init(project=PROJECT_ID, location="us-central1")


def generate_response() -> TextEmbedding:
Valeriy-Burlaka (Member Author) commented on this diff:

Combines the samples for streaming and non-streaming responses. Adds imports and fixes the syntax so the sample can be copied and run, but leaves out the details to keep the sample very concise.
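
For context, a hedged sketch of what a combined streaming/non-streaming snippet can look like with the vertexai SDK (the model name and prompt here are placeholders, not the PR's actual code):

from vertexai.generative_models import GenerativeModel

# Assumes vertexai.init(...) has already been called, as in the excerpt above.
model = GenerativeModel("gemini-1.5-flash-002")

# Non-streaming: the full response is returned at once.
response = model.generate_content("Why is the sky blue?")
print(response.text)

# Streaming: chunks are printed as they are generated.
for chunk in model.generate_content("Why is the sky blue?", stream=True):
    print(chunk.text, end="")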


snippet-bot commented Dec 10, 2024

Here is the summary of changes.

You are about to add 4 region tags.

This comment is generated by snippet-bot.
If you find problems with this result, please file an issue at:
https://github.com/googleapis/repo-automation-bots/issues.
To update this comment, add the snippet-bot:force-run label.

Valeriy-Burlaka added the kokoro:force-run and api: vertex-ai labels on Dec 11, 2024
kokoro-team removed the kokoro:force-run label on Dec 11, 2024
Valeriy-Burlaka added the kokoro:force-run label on Dec 11, 2024
kokoro-team removed the kokoro:force-run label on Dec 11, 2024

Valeriy-Burlaka commented Dec 11, 2024

https://btx.cloud.google.com/invocations/27f9454f-a97e-4e74-a5ce-0612115802bd/log :

test_embeddings_examples.py::test_multimodal_embedding_image_video_text PASSED [ 11%]
test_embeddings_examples.py::test_multimodal_embedding_video PASSED      [ 22%]
test_embeddings_examples.py::test_multimodal_embedding_image PASSED      [ 33%]
test_embeddings_examples.py::test_generate_embeddings_with_lower_dimension PASSED [ 44%]
test_embeddings_examples.py::test_create_embeddings PASSED               [ 55%]
test_embeddings_examples.py::test_create_text_embeddings PASSED          [ 66%]
test_embeddings_examples.py::test_text_embed_text PASSED                 [ 77%]
test_embeddings_examples.py::test_code_embed_text PASSED                 [ 88%]
test_embeddings_examples.py::test_tune_embedding_model FAILED            [100%]

=================================== FAILURES ===================================
__________________________ test_tune_embedding_model ___________________________
Traceback (most recent call last):
   
...
# Val: Skipping the pytest/runner stacktrace.
...

    result = testfunction(**testargs)
  File "/workspace/generative_ai/embeddings/test_embeddings_examples.py", line 133, in test_tune_embedding_model
    tuning_job = model_tuning_example.tune_embedding_model(
  File "/workspace/generative_ai/embeddings/model_tuning_example.py", line 42, in tune_embedding_model
    tuning_job = base_model.tune_model(
  File "/workspace/generative_ai/embeddings/.nox/py-3-9/lib/python3.9/site-packages/vertexai/language_models/_language_models.py", line 2344, in tune_model
    return super().tune_model(
  File "/workspace/generative_ai/embeddings/.nox/py-3-9/lib/python3.9/site-packages/vertexai/language_models/_language_models.py", line 367, in tune_model
    return self._tune_model(
  File "/workspace/generative_ai/embeddings/.nox/py-3-9/lib/python3.9/site-packages/vertexai/language_models/_language_models.py", line 422, in _tune_model
    if _is_text_embedding_tuning_pipeline(model_info.tuning_pipeline_uri):
  File "/workspace/generative_ai/embeddings/.nox/py-3-9/lib/python3.9/site-packages/vertexai/language_models/_language_models.py", line 4010, in _is_text_embedding_tuning_pipeline
    return pipeline_uri.startswith(
AttributeError: 'NoneType' object has no attribute 'startswith'
---- generated xml file: /workspace/generative_ai/embeddings/sponge_log.xml ----
=========================== short test summary info ============================
FAILED test_embeddings_examples.py::test_tune_embedding_model - AttributeErro...
========================= 1 failed, 8 passed in 21.42s =========================

The test runs perfectly fine locally. I'm trying to understand why it fails on CI.


The CI failure in test_tune_embedding_model seems to stem from an AttributeError: 'NoneType' object has no attribute 'startswith' within the vertexai library. This suggests that a variable expected to be a string (likely pipeline_uri) is instead None. Since the test passes locally, the issue might be related to environment differences between your local setup and the CI environment. Possible causes include:

  • Missing environment variables: Ensure that all necessary environment variables (like GOOGLE_CLOUD_PROJECT) are correctly set in the CI environment. Double-check their values for accuracy.
  • Library version mismatch: Verify that the vertexai library version used in CI is the same as your local version. Inconsistencies can lead to unexpected behavior.
  • Network connectivity: The CI environment might have network restrictions that prevent the code from accessing necessary resources (like the Vertex AI model). Check the CI logs for any network-related errors.
  • Resource availability: Confirm that the required Vertex AI resources (models, pipelines) are properly provisioned and accessible in the project specified by GOOGLE_CLOUD_PROJECT within the CI environment. Check for quota issues or resource limits.
  • Permissions: Ensure that the service account used by the CI system has the necessary permissions to access the Vertex AI resources.

Debugging steps:

  1. Print pipeline_uri: Add a print(pipeline_uri) statement just before the startswith call in _is_text_embedding_tuning_pipeline to see its value in the CI logs; a throwaway sketch of this is shown after the list. This will pinpoint whether it's actually None and provide a clue about the root cause.
  2. Check CI logs: Carefully examine the complete CI logs for any error messages or warnings that might indicate the problem. Look for clues about environment variables, network issues, or resource limitations.
  3. Simplify the test: Create a minimal, isolated test case that only calls tune_embedding_model with the simplest possible inputs. If this minimal test also fails, it will help isolate the problem to the vertexai library itself or the CI environment.
  4. Isolate the vertexai library: Try running the test locally in a clean virtual environment with only the vertexai library installed to rule out conflicts with other dependencies.
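
As a concrete starting point for step 1, a throwaway diagnostic patch could be dropped into the test module. Note that _is_text_embedding_tuning_pipeline is private to the vertexai SDK; the sketch below relies only on the name visible in the traceback above and is strictly for debugging:

# Temporary diagnostic wrapper around the private helper named in the traceback.
# Remove it once the source of the None pipeline URI is understood.
from vertexai.language_models import _language_models

_original = _language_models._is_text_embedding_tuning_pipeline


def _logged_is_text_embedding_tuning_pipeline(pipeline_uri, *args, **kwargs):
    # Surfaces the value in the CI logs right before the failing startswith call.
    print(f"_is_text_embedding_tuning_pipeline received: {pipeline_uri!r}")
    return _original(pipeline_uri, *args, **kwargs)


_language_models._is_text_embedding_tuning_pipeline = (
    _logged_is_text_embedding_tuning_pipeline
)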

Valeriy-Burlaka added the kokoro:force-run label on Dec 12, 2024
kokoro-team removed the kokoro:force-run label on Dec 12, 2024

Valeriy-Burlaka commented Dec 13, 2024

I created a new issue to track the embedding test failure on CI. Note that the failure is not related to this PR: it is reproducible on main with no code changes.

Labels
api: vertex-ai (Issues related to the Vertex AI API), samples (Issues that are directly related to samples)