Update new package README template #2707
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Open

mjwolf wants to merge 6 commits into elastic:main from mjwolf:docs-template
+332 −484
Commits (6)
* 9c51732 Update new package README template (mjwolf)
* c2323d2 Only create new files in integration packages (mjwolf)
* 65c93b4 Fix whitespace (mjwolf)
* 70e4bfd Use new template for the test apache package readme (mjwolf)
* ecf9bed Merge remote-tracking branch 'upstream/main' into docs-template (mjwolf)
* 89b34ec Merge remote-tracking branch 'upstream/main' into docs-template (mjwolf)
internal/packages/archetype/_static/package-docs-readme.md.tmpl (60 additions, 52 deletions)

```diff
@@ -1,84 +1,92 @@
-<!-- Use this template language as a starting point, replacing {placeholder text} with details about the integration. -->
-<!-- Find more detailed documentation guidelines in https://github.com/elastic/integrations/blob/main/docs/documentation_guidelines.md -->
+<!-- This template can be used as a starting point for writing documentation for your new integration. For each section, fill in the details
+described in the comments.
 
-# {{.Manifest.Title}}
+Find more detailed documentation guidelines in https://www.elastic.co/docs/extend/integrations/documentation-guidelines
+-->
 
-<!-- The {{.Manifest.Title}} integration allows you to monitor {name of service}. {name of service} is {describe service}.
+<!-- Do not remove header -->
+{{ `{{header}}` }}
+# {{.Manifest.Title}} Integration for Elastic
 
-Use the {{.Manifest.Title}} integration to {purpose}. Then visualize that data in Kibana, create alerts to notify you if something goes wrong, and reference {data stream type} when troubleshooting an issue.
+## Overview
 
-For example, if you wanted to {sample use case} you could {action}. Then you can {visualize|alert|troubleshoot} by {action}. -->
+<!-- Complete this section with a short summary of what data this integration collects and what use cases it enables -->
+The {{.Manifest.Title}} integration for Elastic enables collection of ...
+This integration facilitates ...
 
-## Data streams
+### Compatibility
 
-<!-- The {{.Manifest.Title}} integration collects {one|two} type{s} of data streams: {logs and/or metrics}. -->
+<!-- Complete this section with information on what 3rd party software or hardware versions this integration is compatible with -->
+This integration is compatible with ...
 
-<!-- If applicable -->
-<!-- **Logs** help you keep a record of events happening in {service}.
-Log data streams collected by the {name} integration include {sample data stream(s)} and more. See more details in the [Logs](#logs-reference). -->
+### How it works
 
-<!-- If applicable -->
-<!-- **Metrics** give you insight into the state of {service}.
-Metric data streams collected by the {name} integration include {sample data stream(s)} and more. See more details in the [Metrics](#metrics-reference). -->
+<!-- Add a high level overview on how this integration works. For example, does it collect data from API calls or by receiving data from a network or file. -->
 
-<!-- Optional: Any additional notes on data streams -->
+## What data does this integration collect?
 
-## Requirements
+<!-- Complete this section with information on what types of data the integration collects, and link to reference documentation if available -->
+The {{.Manifest.Title}} integration collects log messages of the following types:
+* ...
 
-You need Elasticsearch for storing and searching your data and Kibana for visualizing and managing it.
-You can use our hosted Elasticsearch Service on Elastic Cloud, which is recommended, or self-manage the Elastic Stack on your own hardware.
+### Supported use cases
 
-<!--
-Optional: Other requirements including:
-* System compatibility
-* Supported versions of third-party products
-* Permissions needed
-* Anything else that could block a user from successfully using the integration
--->
+<!-- Add details on the use cases that can be enabled by using this integration. Explain why a user would want to install and use this integration. -->
 
+## What do I need to use this integration?
 
-## Setup
+Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md). You can install only one Elastic Agent per host.
 
-<!-- Any prerequisite instructions -->
+Elastic Agent is required to stream data from the syslog or log file receiver and ship the data to Elastic, where the events will then be processed via the integration's ingest pipelines.
 
-For step-by-step instructions on how to set up an integration, see the
-[Getting started](https://www.elastic.co/guide/en/welcome-to-elastic/current/getting-started-observability.html) guide.
+<!-- List any other vendor-specific prerequisites needed before starting to install the integration. -->
 
-<!-- Additional set up instructions -->
+## How do I deploy this integration?
 
-<!-- If applicable -->
-<!-- ## Logs reference -->
+### Onboard / configure
 
+<!-- List the steps that will need to be followed in order to completely set up a working integration.
+For integrations that support multiple input types, be sure to add steps for all inputs.
+-->
 
-<!-- Repeat for each data stream of the current type -->
-<!-- ### {Data stream name}
+### Validation
 
-The `{data stream name}` data stream provides events from {source} of the following types: {list types}. -->
+<!-- How can the user test whether the integration is working? Include example commands or test files if applicable -->
 
-<!-- Optional -->
-<!-- #### Example
+## Troubleshooting
 
+For help with Elastic ingest tools, check [Common problems](https://www.elastic.co/docs/troubleshoot/ingest/fleet/common-problems).
 
+<!-- Add any vendor specific troubleshooting here.
 
+Are there common issues or “gotchas” for deploying this integration? If so, how can they be resolved?
+If applicable, links to the third-party software’s troubleshooting documentation.
+-->
 
-An example event for `{data stream name}` looks as following:
+## Scaling
 
-{code block with example} -->
+For more information on architectures that can be used for scaling this integration, check the [Ingest Architectures](https://www.elastic.co/docs/manage-data/ingest/ingest-reference-architectures) documentation.
 
-<!-- #### Exported fields
+<!-- Add any vendor specific scaling information here -->
 
-{insert table} -->
+## Reference
 
-<!-- If applicable -->
-<!-- ## Metrics reference -->
+### ECS field Reference
 
-<!-- Repeat for each data stream of the current type -->
-<!-- ### {Data stream name}
+{{ `{{fields}}` }}
 
-The `{data stream name}` data stream provides events from {source} of the following types: {list types}. -->
+### Sample Event
 
-<!-- Optional -->
-<!-- #### Example
+{{ `{{event}}` }}
 
-An example event for `{data stream name}` looks as following:
+### Inputs used
 
-{code block with example} -->
+<!-- List inputs used in this integration, and link to the documentation -->
+These inputs can be used with this integration:
+* ...
 
-<!-- #### Exported fields
+### API usage
 
-{insert table} -->
+<!-- For integrations that use APIs to collect data, document all the APIs that are used, and link to relevant information -->
+These APIs are used with this integration:
+* ...
```
internal/packages/archetype/_static/package-sample-event.json.tmpl (3 additions, 0 deletions)

```diff
@@ -0,0 +1,3 @@
+{
+  "description": "This is an example sample-event for {{.Manifest.Title}}. Replace it with a real sample event. Hint: If system tests exist, running `elastic-package test system --generate` will generate this file."
+}
```
test/packages/parallel/apache/_dev/build/docs/README.md (114 additions, 19 deletions)

```diff
@@ -1,34 +1,129 @@
-# Apache Integration
+{{ `{{header}}` }}
+# Apache Integration for Elastic
 
-This integration periodically fetches metrics from [Apache](https://httpd.apache.org/) servers. It can parse access and error
-logs created by the Apache server.
+## Overview
 
-## Compatibility
+The Apache integration for Elastic enables collection of access logs, error logs and metrics from [Apache](https://httpd.apache.org/) servers.
+This integration facilitates performance monitoring, understanding traffic patterns and gaining security insights from
+your Apache servers.
 
-The Apache datasets were tested with Apache 2.4.12 and 2.4.46 and are expected to work with
-all versions >= 2.2.31 and >= 2.4.16 (independent from operating system).
+### Compatibility
 
-## Logs
+This integration is compatible with all Apache server versions >= 2.4.16 and >= 2.2.31.
 
-### Access Logs
+### How it works
 
-Access logs collects the Apache access logs.
+The access and error log data streams read from a log file. You will configure the Apache server to write log files to a location that is readable by elastic-agent, and
+elastic-agent will monitor and ingest events written to these files.
 
-{{fields "access"}}
+## What data does this integration collect?
 
-### Error Logs
-
-Error logs collects the Apache error logs.
-
-{{fields "error"}}
+The Apache HTTP Server integration collects log messages of the following types:
 
+* [Error logs](https://httpd.apache.org/docs/current/logs.html#errorlog)
+* [Access logs](https://httpd.apache.org/docs/current/logs.html#accesslog)
+* Status metrics
 
-## Metrics
+### Supported use cases
 
-### Status Metrics
+The Apache integration for Elastic allows you to collect, parse, and analyze Apache web server logs and metrics within the Elastic Stack.
+It centralizes data like access logs, error logs, and performance metrics, transforming unstructured information into a structured, searchable format.
+This enables users to monitor performance, troubleshoot issues, gain security insights, and understand website traffic patterns through powerful visualizations in Kibana.
+Ultimately, it simplifies the management and analysis of critical data for maintaining a healthy and secure web infrastructure.
 
-The server status stream collects data from the Apache Status module. It scrapes the status data from the web page
-generated by the `mod_status` module.
+## What do I need to use this integration?
 
-{{event "status"}}
+Elastic Agent must be installed. For more details, check the Elastic Agent [installation instructions](docs-content://reference/fleet/install-elastic-agents.md). You can install only one Elastic Agent per host.
 
-{{fields "status"}}
+Elastic Agent is required to stream data from the syslog or log file receiver and ship the data to Elastic, where the events will then be processed via the integration's ingest pipelines.
+
+## How do I deploy this integration?
+
+### Onboard / configure
+
+#### Collect access and error logs
+
+Follow the [Apache server instructions](https://httpd.apache.org/docs/2.4/logs.html) to write error or access logs to a location readable by elastic-agent.
+Elastic-agent can run on the same system as your Apache server, or the logs can be forwarded to a different system running elastic-agent.
+
+#### Collect metrics
+
+Follow the [Apache server instructions](https://httpd.apache.org/docs/2.4/mod/mod_status.html) to enable the Status module.
+
+#### Enable the integration in Elastic
+
+1. In Kibana navigate to **Management** > **Integrations**.
+2. In the search bar, type **Apache HTTP Server**.
+3. Select the **Apache HTTP Server** integration and add it.
+4. If needed, install Elastic Agent on the systems which will receive error or access log files.
+5. Enable and configure only the collection methods which you will use.
+
+    * **To collect logs from Apache instances**, you'll need to add the log file path patterns elastic-agent will monitor.
+
+    * **To collect metrics**, you'll need to configure the Apache hosts which will be monitored.
+
+6. Press **Save Integration** to begin collecting logs.
+
+#### Anomaly Detection Configurations
+
+The Apache HTTP server integration also supports anomaly detection jobs.
+
+These anomaly detection jobs are available in the Machine Learning app in Kibana
+when you have data that matches the query specified in the
+[manifest](https://github.com/elastic/integrations/blob/main/packages/apache/kibana/ml_module/apache-Logs-ml.json#L11).
+
+##### Apache Access Logs
+
+Find unusual activity in HTTP access logs.
+
+| Job | Description |
+|---|---|
+| visitor_rate_apache | HTTP Access Logs: Detect unusual visitor rates |
+| status_code_rate_apache | HTTP Access Logs: Detect unusual status code rates |
+| source_ip_url_count_apache | HTTP Access Logs: Detect unusual source IPs - high distinct count of URLs |
+| source_ip_request_rate_apache | HTTP Access Logs: Detect unusual source IPs - high request rates |
+| low_request_rate_apache | HTTP Access Logs: Detect low request rates |
+
+### Validation
+
+The "[Logs Apache] Access and error logs" and "[Metrics Apache] Overview" dashboards will show activity for your Apache servers.
+After the integration is installed, view these dashboards in Kibana and verify that information for your servers is shown.
+
+## Troubleshooting
+
+For help with Elastic ingest tools, check [Common problems](https://www.elastic.co/docs/troubleshoot/ingest/fleet/common-problems).
+
+<!-- Add any vendor specific troubleshooting here.
+
+Are there common issues or “gotchas” for deploying this integration? If so, how can they be resolved?
+If applicable, links to the third-party software’s troubleshooting documentation.
+-->
+
+## Scaling
+
+For more information on architectures that can be used for scaling this integration, check the [Ingest Architectures](https://www.elastic.co/docs/manage-data/ingest/ingest-reference-architectures) documentation.
+
+## Reference
+
+### ECS field Reference
+
+{{ `{{fields}}` }}
+
+### Sample Event
+
+{{ `{{event}}` }}
+
+### Inputs used
+
+These inputs can be used with this integration:
+* [logfile](https://www.elastic.co/docs/reference/integrations/filestream)
+* [apache/metrics](https://www.elastic.co/docs/reference/beats/metricbeat/metricbeat-metricset-apache-status)
+
+### API usage
+
+These APIs are used with this integration:
+* Metrics are collected using the [mod_status](https://httpd.apache.org/docs/current/mod/mod_status.html) Apache module.
```
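As the Apache README notes, metrics come from the `mod_status` module, which (when queried with `?auto`) returns a machine-readable page of plain `Key: value` lines. The sketch below is only an illustration of that format, not the integration's actual collection code; the `parseServerStatus` helper and the sample body are made up for this example.

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// parseServerStatus parses the machine-readable output of Apache's
// mod_status endpoint (e.g. /server-status?auto), where each line
// is a "Key: value" pair, into a simple map.
func parseServerStatus(body string) map[string]string {
	out := map[string]string{}
	sc := bufio.NewScanner(strings.NewReader(body))
	for sc.Scan() {
		// strings.Cut splits on the first ": " only, so values
		// containing colons are preserved intact.
		k, v, ok := strings.Cut(sc.Text(), ": ")
		if ok {
			out[k] = v
		}
	}
	return out
}

func main() {
	sample := "Total Accesses: 129\nBusyWorkers: 1\nIdleWorkers: 74\n"
	st := parseServerStatus(sample)
	fmt.Println(st["BusyWorkers"]) // prints "1"
}
```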
Oops, something went wrong.
Add this suggestion to a batch that can be applied as a single commit.
This suggestion is invalid because no changes were made to the code.
Suggestions cannot be applied while the pull request is closed.
Suggestions cannot be applied while viewing a subset of changes.
Only one suggestion per line can be applied in a batch.
Add this suggestion to a batch that can be applied as a single commit.
Applying suggestions on deleted lines is not supported.
You must change the existing code in this line in order to create a valid suggestion.
Outdated suggestions cannot be applied.
This suggestion has been applied or marked resolved.
Suggestions cannot be applied from pending reviews.
Suggestions cannot be applied on multi-line comments.
Suggestions cannot be applied while the pull request is queued to merge.
Suggestion cannot be applied right now. Please check back later.
Conversation

mjwolf commented:

> This depends on the user leaving `{{header}}` in the readme in order to add the "Do not modify" string to the generated files.
>
> I could also inject the string into all generated READMEs, regardless of whether this {{header}} is in it, but I don't know if this would cause problems with other package types, so I didn't do that. If it won't cause problems, I could change to add to all generated files.
A reviewer replied:

> I definitely have no idea if there are any negative consequences, but I'd vote for the string being generated into all generated files without having to worry about including the header placeholder.
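The `{{ `{{header}}` }}` construct discussed above relies on Go text/template escaping: inside an action, a backquoted raw string is evaluated and printed, so rendering the archetype emits the literal placeholder `{{header}}` for elastic-package to substitute later. A minimal sketch of that behavior (the `renderReadme` helper and the sample title are illustrative, not part of elastic-package):

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// renderReadme executes a tiny archetype-style template. The action
// {{ `{{header}}` }} prints the raw string "{{header}}", leaving a
// placeholder in the rendered README rather than expanding it now.
func renderReadme(title string) string {
	const tmpl = "{{ `{{header}}` }}\n# {{.Title}} Integration for Elastic\n"
	t := template.Must(template.New("readme").Parse(tmpl))
	var buf bytes.Buffer
	if err := t.Execute(&buf, struct{ Title string }{title}); err != nil {
		panic(err)
	}
	return buf.String()
}

func main() {
	fmt.Print(renderReadme("Apache"))
}
```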