Commit 689c1cc

Merge branch 'main' into tnorth/fix
2 parents f0db33e + 2d42884 commit 689c1cc

File tree

70 files changed (+26419, -3435 lines)


CHANGELOG.md

Lines changed: 32 additions & 0 deletions
@@ -11,6 +11,38 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## Unreleased
 
+### Added
+
+- `opentelemetry-instrumentation-aiohttp-client`: add support for url exclusions via `OTEL_PYTHON_EXCLUDED_URLS` / `OTEL_PYTHON_AIOHTTP_CLIENT_EXCLUDED_URLS`
+  ([#3850](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3850))
+- `opentelemetry-instrumentation-httpx`: add support for url exclusions via `OTEL_PYTHON_EXCLUDED_URLS` / `OTEL_PYTHON_HTTPX_EXCLUDED_URLS`
+  ([#3837](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3837))
+- `opentelemetry-instrumentation-flask`: improve readthedocs for sqlcommenter configuration.
+  ([#3883](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3883))
+- `opentelemetry-instrumentation-sqlalchemy`: improve readthedocs for sqlcommenter configuration.
+  ([#3886](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3886))
+- `opentelemetry-instrumentation-mysql`, `opentelemetry-instrumentation-mysqlclient`, `opentelemetry-instrumentation-pymysql`: improve readthedocs for sqlcommenter configuration.
+  ([#3885](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3885))
+- `opentelemetry-instrumentation-django`: improve readthedocs for sqlcommenter configuration.
+  ([#3884](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3884))
+- `opentelemetry-instrumentation-aiohttp-server`: add support for custom header captures via `OTEL_INSTRUMENTATION_HTTP_CAPTURE_HEADERS_SERVER_REQUEST` and `OTEL_INSTRUMENTATION_HTTP_CAPTURE_HEADERS_SERVER_RESPONSE`
+  ([#3916](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3916))
+
+### Fixed
+
+- `opentelemetry-instrumentation-botocore`: bedrock: only decode JSON input buffer in Anthropic Claude streaming
+  ([#3875](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3875))
+- `opentelemetry-instrumentation-aiohttp-client`, `opentelemetry-instrumentation-aiohttp-server`: Fix readme links and text
+  ([#3902](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3902))
+- `opentelemetry-instrumentation-aws-lambda`: Fix ImportError with slash-delimited handler paths
+  ([#3894](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3894))
+- `opentelemetry-exporter-richconsole`: Prevent deadlock when parent span is not part of the batch
+  ([#3900](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3900))
+- `opentelemetry-instrumentation-psycopg2`, `opentelemetry-instrumentation-psycopg`: improve readthedocs for sqlcommenter configuration.
+  ([#3882](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3882))
+- `opentelemetry-instrumentation-aiohttp-server`: delay initialization of tracer, meter and excluded urls to instrumentation for testability
+  ([#3836](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3836))
+
 ## Version 1.38.0/0.59b0 (2025-10-16)
 
 ### Fixed
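The new url-exclusion entries above follow the pattern already used by other HTTP instrumentations: a comma-separated list of regexes read from an environment variable, with a request skipped when any pattern matches its URL. A minimal stdlib-only sketch of that matching logic (illustrative function names, not the real `opentelemetry.util.http` API):

```python
# Hedged sketch: comma-separated OTEL_PYTHON_EXCLUDED_URLS-style patterns
# matched against request URLs with a regex search. Function names here
# are illustrative, not the contrib package's actual API.
import re


def parse_excluded_urls(env_value: str) -> list:
    # Split the env var on commas and compile each entry as a regex.
    return [re.compile(p.strip()) for p in env_value.split(",") if p.strip()]


def url_disabled(url: str, patterns: list) -> bool:
    # A URL is excluded when any pattern matches anywhere in it.
    return any(p.search(url) for p in patterns)


patterns = parse_excluded_urls("healthcheck,client/.*/info")
print(url_disabled("https://example.com/healthcheck", patterns))  # True
print(url_disabled("https://example.com/data", patterns))         # False
```

Requests to matching URLs would then produce no client span at all, which is the usual way to keep health checks out of traces.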

RELEASING.md

Lines changed: 1 addition & 0 deletions
@@ -37,6 +37,7 @@ to pick a specific package to release. It follows the same versioning strategy a
 Long-term package release branch follows `package-release/{package-name}/v{major}.{minor}.x` (or `package-release/{package-name}/v{major}.{minor}bx`) naming pattern.
 
 The workflow will create two pull requests, one against the `main` and one against the `package-release/` branch; both should be merged in order to proceed with the release.
+To keep the process lightweight, it's OK to approve the PRs you generate and merge without additional reviews.
 
 ## Preparing a new patch release

exporter/opentelemetry-exporter-richconsole/src/opentelemetry/exporter/richconsole/__init__.py

Lines changed: 6 additions & 3 deletions
@@ -173,14 +173,17 @@ def spans_to_tree(spans: typing.Sequence[ReadableSpan]) -> Dict[str, Tree]:
     trees = {}
     parents = {}
     spans = list(spans)
+    span_ids = {s.context.span_id for s in spans}
     while spans:
         for span in spans:
-            if not span.parent:
+            if not span.parent or span.parent.span_id not in span_ids:
                 trace_id = opentelemetry.trace.format_trace_id(
                     span.context.trace_id
                 )
-                trees[trace_id] = Tree(label=f"Trace {trace_id}")
-                child = trees[trace_id].add(
+                tree = trees.setdefault(
+                    trace_id, Tree(label=f"Trace {trace_id}")
+                )
+                child = tree.add(
                     label=Text.from_markup(
                         f"[blue][{_ns_to_time(span.start_time)}][/blue] [bold]{span.name}[/bold], span {opentelemetry.trace.format_span_id(span.context.span_id)}"
                     )
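The diff above terminates the `while spans` loop by treating any span whose parent id is absent from the batch as a root. A self-contained sketch of that idea, using simplified stand-in span objects rather than the real SDK types:

```python
# Hedged sketch of the fix: spans whose parent id is not in the batch are
# promoted to roots, so a lone child span can no longer loop forever.
# FakeSpan is a simplified stand-in, not the OpenTelemetry ReadableSpan.
from dataclasses import dataclass
from typing import Optional


@dataclass
class FakeSpan:
    span_id: int
    parent_id: Optional[int]


def roots_and_children(spans):
    spans = list(spans)
    span_ids = {s.span_id for s in spans}
    roots, remaining = [], []
    for span in spans:
        # The old check was only `span.parent_id is None`; an orphan child
        # whose parent was never exported stayed in `remaining` forever.
        if span.parent_id is None or span.parent_id not in span_ids:
            roots.append(span.span_id)
        else:
            remaining.append(span)
    return roots, remaining


# A child exported without its parent now becomes a root instead of hanging:
roots, rest = roots_and_children([FakeSpan(span_id=2, parent_id=1)])
print(roots)  # [2]
```

This is exactly the situation a `BatchSpanProcessor` can produce: a batch is flushed containing a child span whose parent has not ended yet.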

exporter/opentelemetry-exporter-richconsole/test-requirements.txt

Lines changed: 1 addition & 0 deletions
@@ -9,6 +9,7 @@ pluggy==1.5.0
 py-cpuinfo==9.0.0
 Pygments==2.17.2
 pytest==7.4.4
+pytest-timeout==2.3.1
 rich==13.7.1
 tomli==2.0.1
 typing_extensions==4.12.2

exporter/opentelemetry-exporter-richconsole/tests/test_rich_exporter.py

Lines changed: 12 additions & 0 deletions
@@ -96,3 +96,15 @@ def test_multiple_traces(tracer_provider):
         parent_2.name in child.label
         for child in trees[traceid_1].children[0].children
     )
+
+
+@pytest.mark.timeout(30)
+def test_no_deadlock(tracer_provider):
+    # non-regression test for https://github.com/open-telemetry/opentelemetry-python-contrib/issues/3254
+
+    tracer = tracer_provider.get_tracer(__name__)
+    with tracer.start_as_current_span("parent"):
+        with tracer.start_as_current_span("child") as child:
+            pass
+
+    RichConsoleSpanExporter.spans_to_tree((child,))

instrumentation-genai/opentelemetry-instrumentation-google-genai/CHANGELOG.md

Lines changed: 2 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -7,6 +7,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
77

88
## Unreleased
99

10+
- Minor change to check LRU cache in Completion Hook before acquiring semaphore/thread ([#3907](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3907)).
11+
1012
## Version 0.4b0 (2025-10-16)
1113

1214
- Implement the new semantic convention changes made in https://github.com/open-telemetry/semantic-conventions/pull/2179.
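The #3907 entry describes a common fast-path optimization: consult the LRU cache before acquiring the semaphore that bounds expensive work, so cache hits never contend for a slot. A hedged, self-contained sketch of that pattern (illustrative class and names, not the google-genai hook's real API):

```python
# Hedged sketch of the check-cache-before-acquire pattern from #3907.
# CachedHook and its names are illustrative, not the real Completion Hook.
import threading
from collections import OrderedDict


class CachedHook:
    def __init__(self, maxsize=128, workers=4):
        self._cache = OrderedDict()          # insertion-ordered dict as an LRU
        self._maxsize = maxsize
        self._sem = threading.Semaphore(workers)
        self.expensive_calls = 0

    def process(self, key):
        if key in self._cache:               # fast path: no semaphore needed
            self._cache.move_to_end(key)
            return self._cache[key]
        with self._sem:                      # slow path: bounded concurrency
            self.expensive_calls += 1
            result = f"processed:{key}"
            self._cache[key] = result
            if len(self._cache) > self._maxsize:
                self._cache.popitem(last=False)  # evict least recently used
            return result


hook = CachedHook()
hook.process("a")
hook.process("a")                            # second call is a cache hit
print(hook.expensive_calls)  # 1
```

Note the trade-off: two threads racing on the same cold key can both take the slow path, so this pattern suits work that is safe (merely wasteful) to duplicate.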

instrumentation-genai/opentelemetry-instrumentation-langchain/examples/manual/README.rst

Lines changed: 3 additions & 3 deletions
@@ -1,4 +1,4 @@
-OpenTelemetry Langcahin Instrumentation Example
+OpenTelemetry Langchain Instrumentation Example
 ===============================================
 
 This is an example of how to instrument Langchain when configuring OpenTelemetry SDK and instrumentations manually.
@@ -8,14 +8,14 @@ Traces include details such as the span name and other attributes.
 
 Note: `.env <.env>`_ file configures additional environment variables:
 - :code:`OTEL_LOGS_EXPORTER=otlp` to specify exporter type.
-- :code:`OPENAI_API_KEY` open AI key for accessing the OpenAI API.
+- :code:`OPENAI_API_KEY` key for accessing the OpenAI API.
 - :code:`OTEL_EXPORTER_OTLP_ENDPOINT` to specify the endpoint for exporting traces (default is http://localhost:4317).
 
 Setup
 -----
 
 Minimally, update the `.env <.env>`_ file with your :code:`OPENAI_API_KEY`.
-An OTLP compatible endpoint should be listening for traces http://localhost:4317.
+An OTLP compatible endpoint should be listening for traces at http://localhost:4317.
 If not, update :code:`OTEL_EXPORTER_OTLP_ENDPOINT` as well.
 
 Next, set up a virtual environment like this:

instrumentation-genai/opentelemetry-instrumentation-openai-v2/CHANGELOG.md

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## Unreleased
 
+- Added support for OpenAI embeddings instrumentation
+  ([#3461](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3461))
 - Record prompt and completion events regardless of span sampling decision.
   ([#3226](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3226))
 - Migrate off the deprecated events API to use the logs API

instrumentation-genai/opentelemetry-instrumentation-openai-v2/README.rst

Lines changed: 8 additions & 1 deletion
@@ -56,7 +56,7 @@ Check out the `manual example <examples/manual>`_ for more details.
 Instrumenting all clients
 *************************
 
-When using the instrumentor, all clients will automatically trace OpenAI chat completion operations.
+When using the instrumentor, all clients will automatically trace OpenAI operations including chat completions and embeddings.
 You can also optionally capture prompts and completions as log events.
 
 Make sure to configure OpenTelemetry tracing, logging, and events to capture all telemetry emitted by the instrumentation.
@@ -68,12 +68,19 @@ Make sure to configure OpenTelemetry tracing, logging, and events to capture all
     OpenAIInstrumentor().instrument()
 
     client = OpenAI()
+    # Chat completion example
     response = client.chat.completions.create(
         model="gpt-4o-mini",
         messages=[
             {"role": "user", "content": "Write a short poem on open telemetry."},
         ],
     )
+
+    # Embeddings example
+    embedding_response = client.embeddings.create(
+        model="text-embedding-3-small",
+        input="Generate vector embeddings for this text"
+    )
 
 Enabling message content
 *************************
Lines changed: 22 additions & 0 deletions
@@ -0,0 +1,22 @@
+# Update this with your real OpenAI API key
+OPENAI_API_KEY=sk-YOUR_API_KEY
+
+# Uncomment to use Ollama instead of OpenAI
+# OPENAI_BASE_URL=http://localhost:11434/v1
+# OPENAI_API_KEY=unused
+# CHAT_MODEL=qwen2.5:0.5b
+
+# Uncomment and change to your OTLP endpoint
+# OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
+# OTEL_EXPORTER_OTLP_PROTOCOL=grpc
+
+OTEL_SERVICE_NAME=opentelemetry-python-openai
+
+# Change to 'false' to disable collection of python logging logs
+OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true
+
+# Uncomment if your OTLP endpoint doesn't support logs
+# OTEL_LOGS_EXPORTER=console
+
+# Change to 'false' to hide prompt and completion content
+OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
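Example `.env` files like this one are typically loaded with a helper such as `python-dotenv`; the format itself is just `KEY=VALUE` lines with `#` comments. A minimal stdlib-only parse, for illustration only (real loaders handle quoting, export prefixes, and more):

```python
# Hedged sketch: minimal parsing of KEY=VALUE lines like the .env above,
# skipping blanks and '#' comments. Not a replacement for python-dotenv.
def parse_env(text: str) -> dict:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # ignore comments, blanks, and malformed lines
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


sample = """
# comment
OTEL_SERVICE_NAME=opentelemetry-python-openai
OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
"""
print(parse_env(sample)["OTEL_SERVICE_NAME"])  # opentelemetry-python-openai
```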
