Labels
api: alloydb (issues related to the googleapis/langchain-google-alloydb-pg-python API), priority: p1, type: feature request
Description
Environment details
- OS type and version: Colab Enterprise
- Python version: 3.10 (from the reasoning-engine image)
- pip version: unsure
- langchain-google-alloydb-pg version: reproduced with both 0.7.0 and 0.8.0
Steps to reproduce
- Run the "Deploying a RAG Application with AlloyDB to LangChain on Vertex AI" notebook
- Use the suggested IAM-based authentication in similarity_search
- Get a ValueError
Code example
Directly from the notebook:
def similarity_search(query: str) -> list[Document]:
    engine = AlloyDBEngine.from_instance(
        PROJECT_ID,
        REGION,
        CLUSTER,
        INSTANCE,
        DATABASE,
        # Uncomment to use built-in authentication instead of IAM authentication
        # user="postgres",
        # password=PASSWORD,
    )
    vector_store = AlloyDBVectorStore.create_sync(
        engine,
        table_name=TABLE_NAME,
        embedding_service=VertexAIEmbeddings(
            model_name="textembedding-gecko@latest", project=PROJECT_ID
        ),
    )
    retriever = vector_store.as_retriever()
    return retriever.invoke(query)
vertexai.init(project=PROJECT_ID, location="us-central1", staging_bucket=STAGING_BUCKET)

remote_app = reasoning_engines.ReasoningEngine.create(
    reasoning_engines.LangchainAgent(
        model="gemini-pro",
        tools=[similarity_search],
        model_kwargs={
            "temperature": 0.1,
        },
    ),
    requirements=[
        "google-cloud-aiplatform[reasoningengine,langchain]",
        "langchain-google-alloydb-pg",
        "langchain-google-vertexai",
    ],
    display_name="PrebuiltAgent",
)

response = remote_app.query(input="Find movies about engineers")
print(response["output"])
Stack trace
...
File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/usr/local/lib/python3.10/site-packages/langchain_google_alloydb_pg/async_vectorstore.py", line 168, in create
raise ValueError(f"Id column, {id_column}, does not exist.")
ValueError: Id column, langchain_id, does not exist.
Debugging
The example schema from movies.csv uses an id column; I wasn't sure whether I should have renamed it (that turned out not to be the problem). After reading the code in this package, it looks like the columns variable in async_vectorstore.py's create() (reached via AlloyDBVectorStore.create_sync) ends up empty, hence the error.
If I switched from IAM authentication to the built-in username/password authentication (uncommenting the user/password arguments, as shown below), everything worked.
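For reference, the workaround is just the built-in-auth variant of the notebook's engine setup; PASSWORD here is whatever the notebook configures for the postgres user:

engine = AlloyDBEngine.from_instance(
    PROJECT_ID,
    REGION,
    CLUSTER,
    INSTANCE,
    DATABASE,
    user="postgres",    # built-in database user instead of IAM
    password=PASSWORD,  # password configured for that user in the notebook
)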
Possible fix: before the # Check columns step in create(), check whether any columns were returned at all and, if not, raise an authentication/permission-oriented error. This would at least surface the actual issue rather than a misleading schema issue. A rough sketch follows.
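A minimal sketch of what that guard could look like. The helper name and the shape of columns are illustrative assumptions, not the package's actual internals; only the id_column error message matches the current code from the stack trace:

from typing import Mapping

def check_columns(columns: Mapping[str, str], table_name: str, id_column: str) -> None:
    # Hypothetical guard mirroring the "# Check columns" step in create().
    if not columns:
        # An empty result usually means the connected (IAM) database user
        # cannot see the table at all, not that the schema is wrong.
        raise ValueError(
            f"No columns found for table '{table_name}'. Verify that the "
            "authenticated database user exists and has been granted access "
            "to the table (often an IAM authentication/permission issue)."
        )
    if id_column not in columns:
        # Existing behavior: the error reported above.
        raise ValueError(f"Id column, {id_column}, does not exist.")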
b/356000377