
Commit eb6a2e1

Peter Nagy (peter-quix) authored and committed
Update all connector references to use current paths
Replace all instances of ../connectors/index.md with the current path ../quix-connectors/templates/index.md across 15 documentation files. This avoids unnecessary redirects and ensures consistent linking.
1 parent d4b59ba commit eb6a2e1
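A bulk link update like this one is straightforward to script. The sketch below is illustrative only (it is not the tooling actually used for this commit); the old and new link suffixes come from the commit message, while the function name and the idea of walking the `docs/` tree are assumptions:

```python
from pathlib import Path

# Old and new link targets, as described in the commit message.
OLD_SUFFIX = "connectors/index.md"
NEW_SUFFIX = "quix-connectors/templates/index.md"

def update_links(root: Path) -> int:
    """Rewrite stale connector links in every Markdown file under `root`.

    Only the common suffix is replaced, so relative prefixes such as
    ../ or ../../../ are preserved. Returns the number of files changed.
    """
    changed = 0
    for md in sorted(root.rglob("*.md")):
        text = md.read_text(encoding="utf-8")
        updated = text.replace(OLD_SUFFIX, NEW_SUFFIX)
        if updated != text:
            md.write_text(updated, encoding="utf-8")
            changed += 1
    return changed
```

Note that the new suffix does not itself contain the old one, so the rewrite is idempotent: running it a second time changes nothing.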

File tree: 15 files changed, +18 −18 lines


docs/deploy/secrets-management.md

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 # Secrets management
 
-Sometimes you connect the [Quix Connectors](../connectors/index.md), or services you have created, to other services, such as AWS, Vonage, Twilio, Azure and so on. You usually need to provide credentials to access these third-party APIs and services, using environment variables.
+Sometimes you connect the [Quix Connectors](../quix-connectors/templates/index.md), or services you have created, to other services, such as AWS, Vonage, Twilio, Azure and so on. You usually need to provide credentials to access these third-party APIs and services, using environment variables.
 
 You do not want to expose these credentials through the use of environment variables in your YAML code, service code, Git repository, or even the UI, which may have shared access. Quix provides a feature to enable your credentials to be stored securely - secrets management.
 

docs/develop/integrate-data/overview.md

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ To publish data:
 
 The particular method you use depends on the nature of the service you're trying to interface with Quix. Each of these methods is described in the following sections.
 
-There are various ways to connect to Quix, and how you do so depends on the nature of the service and data you are connecting. In many cases Quix has a [suitable connector](../../connectors/index.md) you can use with minor configuration.
+There are various ways to connect to Quix, and how you do so depends on the nature of the service and data you are connecting. In many cases Quix has a [suitable connector](../../quix-connectors/templates/index.md) you can use with minor configuration.
 
 If you want some example code you can use as a starting point for connecting your own data, you can use the `External source` and `External destination` samples. Or use one of the existing connectors as a starting point, such as the `Starter Source`, or `Starter Destination`.
 

docs/develop/integrate-data/prebuilt-connector-destination.md

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@
 
 This is the easiest method, as no code needs to be written, and there is usually only minor configuration required to get a Quix connector up and running.
 
-You can review the list of connectors in the [connector documentation](../../connectors/index.md). The code for our connectors can be found in the [Quix Code Samples GitHub repository](https://github.com/quixio/quix-samples){target=_blank}.
+You can review the list of connectors in the [connector documentation](../../quix-connectors/templates/index.md). The code for our connectors can be found in the [Quix Code Samples GitHub repository](https://github.com/quixio/quix-samples){target=_blank}.
 
 Note there are two main types of connector:
 

docs/develop/integrate-data/prebuilt-connector.md

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@
 
 This is the easiest method, as no code needs to be written, and there is usually only minor configuration required to get a Quix connector up and running.
 
-You can review the list of connectors in the [connector documentation](../../connectors/index.md). The code for our connectors can be found in the [Quix Code Samples GitHub repository](https://github.com/quixio/quix-samples){target=_blank}.
+You can review the list of connectors in the [connector documentation](../../quix-connectors/templates/index.md). The code for our connectors can be found in the [Quix Code Samples GitHub repository](https://github.com/quixio/quix-samples){target=_blank}.
 
 Note there are two main types of connector:
 

docs/get-started/stream-processing-pipelines.md

Lines changed: 1 addition & 1 deletion
@@ -13,4 +13,4 @@ The applications (services) are connected in the pipeline by topics. The service
 
 ![Example pipeline](../images/example-pipeline.png)
 
-[Read more about connectors](../connectors/index.md).
+[Read more about connectors](../quix-connectors/templates/index.md).

docs/integrations/databases/influxdb/migrating-v2-v3.md

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@
 
 If you have data in a v2 InfluxDB database, and you want to migrate it to InfluxDB v3, then Quix can help.
 
-Quix provides the following InfluxDB [connectors](../../../connectors/index.md):
+Quix provides the following InfluxDB [connectors](../../../quix-connectors/templates/index.md):
 
 * InfluxDB v2 source
 * InfluxDB v3 source

docs/integrations/databases/influxdb/quickstart.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ search:
 
 # Quickstart
 
-This quickstart shows you how to integrate Quix with InfluxDB using our standard [connectors](../../../connectors/index.md).
+This quickstart shows you how to integrate Quix with InfluxDB using our standard [connectors](../../../quix-connectors/templates/index.md).
 
 In the first part of this quickstart, you'll read F1 car telemetry data, transform it, and then publish it to InfluxDB.
 

docs/integrations/databases/influxdb/replacing-kapacitor.md

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@ The following illustrates a typical processing pipeline running in Quix.
 | Scalability | Kapacitor is designed to be horizontally scalable, enabling it to handle large volumes of data and scale alongside the rest of the TICK Stack components. | Quix was designed to be both vertically and horizontally scalable. It is based on [Kafka](../../../kb/what-is-kafka.md), using either a Quix-hosted broker, or an externally hosted broker. This means all the horizontal scaling features of Kafka, such as consumer groups, is built into Quix. Quix also enables you to configure the number of replicas, RAM, and CPU resources allocated on a per-service (deployment) basis, for accurate vertical scaling. |
 | High availability | Kapacitor supports high availability setups to ensure uninterrupted data processing and alerting even in the case of node failures. | As Quix uses a Kafka broker (including Kafka-compatible brokers such as Redpanda), it has all the high availability features inherent in a Kafka-based solution. In addition, Quix uses a Kubernetes cluster to seamlessly distribute and manage containers that execute your service's Python code. |
 | Replay and backfilling | Kapacitor enables users to replay historical data or backfill missing data, enabling them to analyze past events or ensure data consistency. | Quix leverages Kafka's retention capabilities for data backfilling and analysis. You can process historical data stored in Kafka topics using standard Kafka consumer patterns. This is useful for testing and evaluating processing pipelines, and examining historical data. This is also enhanced by the ability to connect to external tools using Quix connectors. |
-| Extensibility | Kapacitor provides an extensible architecture, enabling users to develop and integrate custom functions, connectors, and integrations as per their specific requirements. | Quix is fully extensible using Python. Complex stream processing pipelines can be built out one service at a time, and then deployed with a single click. It is also possible to use a wide range of standard [connectors](../../../connectors/index.md) to connect to a range of third-party services. Powerful [integrations](../../overview.md) extend these capabilities. In addition, [REST and real-time APIs](../../../develop/apis-overview.md) are available for use with any language that supports REST or WebSockets. As Quix is designed around standard Git development workflows, it enables developers to collaborate on projects. |
+| Extensibility | Kapacitor provides an extensible architecture, enabling users to develop and integrate custom functions, connectors, and integrations as per their specific requirements. | Quix is fully extensible using Python. Complex stream processing pipelines can be built out one service at a time, and then deployed with a single click. It is also possible to use a wide range of standard [connectors](../../../quix-connectors/templates/index.md) to connect to a range of third-party services. Powerful [integrations](../../overview.md) extend these capabilities. In addition, [REST and real-time APIs](../../../develop/apis-overview.md) are available for use with any language that supports REST or WebSockets. As Quix is designed around standard Git development workflows, it enables developers to collaborate on projects. |
 
 ## See also
 

docs/integrations/overview.md

Lines changed: 1 addition & 1 deletion
@@ -15,4 +15,4 @@ This section of the documentation provides more detailed information on integrat
 | Upstash | Kafka broker | [Guide](./brokers/upstash.md) |
 | InfluxDB | Time series database | [Overview](./databases/influxdb/overview.md) |
 
-See also the [Quix connectors](../connectors/index.md).
+See also the [Quix connectors](../quix-connectors/templates/index.md).

docs/kb/what-is-kafka.md

Lines changed: 1 addition & 1 deletion
@@ -41,6 +41,6 @@ Kafka is extensively used in stream processing due to its ability to handle real
 
 * **Scalability**: Stream processing applications built with Kafka can scale horizontally by adding more instances of processing nodes. Kafka handles the distribution of data and load balancing across these instances, ensuring scalability without downtime.
 
-* **Integration with external systems**: Kafka integrates seamlessly with external systems, enabling stream processing applications to interact with various data sinks and sources. For example, processed data can be stored in databases, sent to analytics platforms, or used to trigger downstream actions. Quix features a wide variety of [connectors](../connectors/index.md) and [integrations](../integrations/overview.md) to enable this.
+* **Integration with external systems**: Kafka integrates seamlessly with external systems, enabling stream processing applications to interact with various data sinks and sources. For example, processed data can be stored in databases, sent to analytics platforms, or used to trigger downstream actions. Quix features a wide variety of [connectors](../quix-connectors/templates/index.md) and [integrations](../integrations/overview.md) to enable this.
 
 Overall, Quix's integration with the Kafka provides a powerful framework for building scalable, fault-tolerant, and real-time stream processing applications, making it a popular choice in the streaming data ecosystem.
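The commit states that all 15 files were updated, so a follow-up check for stragglers is natural. The sketch below is again illustrative (the function name and directory layout are assumptions); it relies on the fact that the new path, quix-connectors/templates/index.md, never contains the old connectors/index.md suffix, so a plain substring search reports only stale links:

```python
from pathlib import Path

def stale_link_files(root: Path) -> list[Path]:
    """Return Markdown files under `root` that still reference the retired
    connectors/index.md path. Already-migrated links are never reported,
    because the new path does not contain the old suffix."""
    return [
        md for md in sorted(root.rglob("*.md"))
        if "connectors/index.md" in md.read_text(encoding="utf-8")
    ]
```

An empty result over the repository's docs tree would confirm the commit's claim that no old-path references remain.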
