Commit b54a6df

docs: revise URL structure (libraries) (#1773)

ennru authored Oct 10, 2024
1 parent a05d6f8 commit b54a6df
Showing 33 changed files with 80 additions and 82 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/link-validator.yml
@@ -4,7 +4,7 @@ on:
workflow_dispatch:
pull_request:
schedule:
- - cron: '0 6 * * 1'
+ - cron: '10 6 1 * *'
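(Scheduling note: `0 6 * * 1` fires at 06:00 UTC every Monday, while `10 6 1 * *` fires at 06:10 UTC on the first day of each month, so the link-validation run moves from weekly to monthly.)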

permissions:
contents: read # allow actions/checkout
@@ -43,4 +43,4 @@ jobs:
run: sbt docs/makeSite

- name: Run Link Validator
- run: cs launch net.runne::site-link-validator:0.2.3 -- scripts/link-validator.conf
+ run: cs launch net.runne::site-link-validator:0.2.5 -- scripts/link-validator.conf
8 changes: 4 additions & 4 deletions README.md
@@ -7,7 +7,7 @@ Alpakka Kafka [![gh-actions-badge][]][gh-actions]

Systems don't come alone. In the modern world of microservices and cloud deployment, new components must interact with legacy systems, making integration an important key to success. Reactive Streams give us a technology-independent tool to let these heterogeneous systems communicate without overwhelming each other.

- The Alpakka project is an open source initiative to implement stream-aware, reactive, integration pipelines for Java and Scala. It is built on top of [Akka Streams](https://doc.akka.io/docs/akka/current/stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](https://www.reactive-streams.org/) and JDK 9+ [java.util.concurrent.Flow](https://docs.oracle.com/javase/10/docs/api/java/util/concurrent/Flow.html)-compliant implementation and therefore [fully interoperable](https://doc.akka.io/docs/akka/current/general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.
+ The Alpakka project is an open source initiative to implement stream-aware, reactive, integration pipelines for Java and Scala. It is built on top of [Akka Streams](https://doc.akka.io/libraries/akka-core/current/stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](https://www.reactive-streams.org/) and JDK 9+ [java.util.concurrent.Flow](https://docs.oracle.com/javase/10/docs/api/java/util/concurrent/Flow.html)-compliant implementation and therefore [fully interoperable](https://doc.akka.io/libraries/akka-core/current/general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.

This repository contains the sources for the **Alpakka Kafka connector**, which lets you connect [Apache Kafka](https://kafka.apache.org/) to Akka Streams. It was formerly known as **Akka Streams Kafka** and even **Reactive Kafka**.
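As a quick illustration of what the connector does, here is a minimal consumer sketch (broker address, group id, and topic name are placeholder assumptions, not part of this commit):

```scala
import akka.actor.ActorSystem
import akka.kafka.{ConsumerSettings, Subscriptions}
import akka.kafka.scaladsl.Consumer
import akka.stream.scaladsl.Sink
import org.apache.kafka.common.serialization.StringDeserializer

object Quickstart extends App {
  implicit val system: ActorSystem = ActorSystem("quickstart")

  // Placeholder connection settings; adjust for your cluster.
  val settings = ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("quickstart-group")

  // Each Kafka record flows through an Akka Streams pipeline,
  // with backpressure applied end to end.
  Consumer
    .plainSource(settings, Subscriptions.topics("example-topic"))
    .map(_.value)
    .runWith(Sink.foreach(println))
}
```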

@@ -17,9 +17,9 @@ Akka Stream connectors to other technologies are listed in the [Alpakka reposito
Documentation
-------------

- - [Alpakka reference](https://doc.akka.io/docs/alpakka/current/) documentation
+ - [Alpakka reference](https://doc.akka.io/libraries/alpakka/current/) documentation

- - **[Alpakka Kafka connector reference](https://doc.akka.io/docs/akka-stream-kafka/current/) documentation**
+ - **[Alpakka Kafka connector reference](https://doc.akka.io/libraries/alpakka-kafka/current/) documentation**

To keep up with the latest Alpakka releases check out [Alpakka releases](https://github.com/akka/alpakka/releases) and [Alpakka Kafka releases](https://github.com/akka/alpakka-kafka/releases).

@@ -67,4 +67,4 @@ License

Akka is licensed under the Business Source License 1.1; please see the [Akka License FAQ](https://www.lightbend.com/akka/license-faq).

- Tests and documentation are under a separate license, see the LICENSE file in each documentation and test root directory for details.
\ No newline at end of file
+ Tests and documentation are under a separate license, see the LICENSE file in each documentation and test root directory for details.
26 changes: 11 additions & 15 deletions build.sbt
@@ -75,7 +75,7 @@ val commonSettings = Def.settings(
organization := "com.typesafe.akka",
organizationName := "Lightbend Inc.",
organizationHomepage := Some(url("https://www.lightbend.com/")),
- homepage := Some(url("https://doc.akka.io/docs/alpakka-kafka/current")),
+ homepage := Some(url("https://doc.akka.io/libraries/alpakka-kafka/current")),
scmInfo := Some(ScmInfo(url("https://github.com/akka/alpakka-kafka"), "[email protected]:akka/alpakka-kafka.git")),
developers += Developer("contributors",
"Contributors",
@@ -129,19 +129,15 @@ val commonSettings = Def.settings(
"https://doc.akka.io/api/alpakka-kafka/current/"
) ++ {
if (scalaBinaryVersion.value.startsWith("3")) {
Seq("-skip-packages:akka.pattern") // different usage in scala3
Seq(s"-external-mappings:https://docs.oracle.com/en/java/javase/${JavaDocLinkVersion}/docs/api/java.base/",
"-skip-packages:akka.pattern")
} else {
Seq("-skip-packages", "akka.pattern") // for some reason Scaladoc creates this
Seq("-jdk-api-doc-base",
s"https://docs.oracle.com/en/java/javase/${JavaDocLinkVersion}/docs/api/java.base/",
"-skip-packages",
"akka.pattern")
}
},
- // make use of https://github.com/scala/scala/pull/8663
- Compile / doc / scalacOptions ++= {
- if (scalaBinaryVersion.value.startsWith("3")) {
- Seq(s"-external-mappings:https://docs.oracle.com/en/java/javase/${JavaDocLinkVersion}/docs/api/java.base/") // different usage in scala3
- } else if (scalaBinaryVersion.value.startsWith("2.13")) {
- Seq("-jdk-api-doc-base", s"https://docs.oracle.com/en/java/javase/${JavaDocLinkVersion}/docs/api/java.base/")
- } else Nil
- },
Compile / doc / scalacOptions -= "-Xfatal-warnings",
// show full stack traces and test case durations
testOptions += Tests.Argument(TestFrameworks.ScalaTest, "-oDF"),
@@ -185,7 +181,7 @@ lazy val `alpakka-kafka` =
| testkit - framework for testing the connector
|
|Other modules:
- | docs - the sources for generating https://doc.akka.io/docs/alpakka-kafka/current
+ | docs - the sources for generating https://doc.akka.io/libraries/alpakka-kafka/current
| benchmarks - compare direct Kafka API usage with Alpakka Kafka
|
|Useful sbt tasks:
@@ -342,7 +338,7 @@ lazy val docs = project
Preprocess / preprocessRules := Seq(
("https://javadoc\\.io/page/".r, _ => "https://javadoc\\.io/static/")
),
Paradox / siteSubdirName := s"docs/alpakka-kafka/${projectInfoVersion.value}",
Paradox / siteSubdirName := s"libraries/alpakka-kafka/${projectInfoVersion.value}",
paradoxGroups := Map("Language" -> Seq("Java", "Scala")),
paradoxProperties ++= Map(
"image.base_url" -> "images/",
@@ -352,11 +348,11 @@
"javadoc.akka.kafka.base_url" -> "",
// Akka
"akka.version" -> akkaVersion,
"extref.akka.base_url" -> s"https://doc.akka.io/docs/akka/$AkkaBinaryVersionForDocs/%s",
"extref.akka.base_url" -> s"https://doc.akka.io/libraries/akka-core/$AkkaBinaryVersionForDocs/%s",
"scaladoc.akka.base_url" -> s"https://doc.akka.io/api/akka/$AkkaBinaryVersionForDocs/",
"javadoc.akka.base_url" -> s"https://doc.akka.io/japi/akka/$AkkaBinaryVersionForDocs/",
"javadoc.akka.link_style" -> "direct",
"extref.akka-management.base_url" -> s"https://doc.akka.io/docs/akka-management/current/%s",
"extref.akka-management.base_url" -> s"https://doc.akka.io/libraries/akka-management/current/%s",
// Kafka
"kafka.version" -> kafkaVersion,
"extref.kafka.base_url" -> s"https://kafka.apache.org/$KafkaVersionForDocs/%s",
6 changes: 3 additions & 3 deletions core/src/main/resources/reference.conf
@@ -9,7 +9,7 @@ akka.kafka.producer {
discovery-method = akka.discovery

# Set a service name for use with Akka Discovery
- # https://doc.akka.io/docs/alpakka-kafka/current/discovery.html
+ # https://doc.akka.io/libraries/alpakka-kafka/current/discovery.html
service-name = ""

# Timeout for getting a reply from the discovery-method lookup
@@ -57,7 +57,7 @@ akka.kafka.consumer {
discovery-method = akka.discovery

# Set a service name for use with Akka Discovery
- # https://doc.akka.io/docs/alpakka-kafka/current/discovery.html
+ # https://doc.akka.io/libraries/alpakka-kafka/current/discovery.html
service-name = ""

# Timeout for getting a reply from the discovery-method lookup
@@ -156,7 +156,7 @@ akka.kafka.consumer {
# then causes the Kafka consumer to follow its normal 'auto.offset.reset' behavior. For 'earliest', these settings
# allow the client to detect and attempt to recover from this issue. For 'none' and 'latest', these settings will
# only add overhead. See
- # https://doc.akka.io/docs/alpakka-kafka/current/errorhandling.html#unexpected-consumer-offset-reset
+ # https://doc.akka.io/libraries/alpakka-kafka/current/errorhandling.html#unexpected-consumer-offset-reset
# for more information
offset-reset-protection {
# turns on reset protection
@@ -110,14 +110,14 @@ object ConsumerResetProtection {
log.warning(
s"Your last commit request $previouslyCommitted is more than the configured threshold from the last" +
s"committed offset ($committed) for $tp. See " +
"https://doc.akka.io/docs/alpakka-kafka/current/errorhandling.html#setting-offset-threshold-appropriately for more info."
"https://doc.akka.io/libraries/alpakka-kafka/current/errorhandling.html#setting-offset-threshold-appropriately for more info."
)
}
log.warning(
s"Dropping offsets for partition $tp - received an offset which is less than allowed $threshold " +
s"from the last requested offset (threshold: $threshold). Seeking to the latest known safe (committed " +
s"or assigned) offset: $committed. See " +
"https://doc.akka.io/docs/alpakka-kafka/current/errorhandling.html#unexpected-consumer-offset-reset" +
"https://doc.akka.io/libraries/alpakka-kafka/current/errorhandling.html#unexpected-consumer-offset-reset" +
"for more information."
)
consumer ! Seek(Map(tp -> committed.offset()))
4 changes: 2 additions & 2 deletions core/src/main/scala/akka/kafka/javadsl/Consumer.scala
@@ -173,7 +173,7 @@ object Consumer {
* This is useful when "at-least once delivery" is desired, as each message will likely be
* delivered one time but in failure cases could be duplicated.
*
- * It is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html)
+ * It is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html)
* and [[Producer.flowWithContext]].
*/
@ApiMayChange
@@ -198,7 +198,7 @@ object Consumer {
* This is useful when "at-least once delivery" is desired, as each message will likely be
* delivered one time but in failure cases could be duplicated.
*
- * It is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html)
+ * It is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html)
* and [[Producer.flowWithContext]].
*
* This variant makes it possible to add additional metadata (in the form of a string)
4 changes: 2 additions & 2 deletions core/src/main/scala/akka/kafka/javadsl/Producer.scala
@@ -217,7 +217,7 @@ object Producer {
*
* - [[akka.kafka.ProducerMessage.PassThroughMessage PassThroughMessage]] does not publish anything, and continues in the stream as [[akka.kafka.ProducerMessage.PassThroughResult PassThroughResult]]
*
- * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html).
+ * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html).
*
* @tparam C the flow context type
*/
@@ -286,7 +286,7 @@ object Producer {
*
* - [[akka.kafka.ProducerMessage.PassThroughMessage PassThroughMessage]] does not publish anything, and continues in the stream as [[akka.kafka.ProducerMessage.PassThroughResult PassThroughResult]]
*
- * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html).
+ * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html).
*
* Supports sharing a Kafka Producer instance.
*
4 changes: 2 additions & 2 deletions core/src/main/scala/akka/kafka/javadsl/Transactional.scala
@@ -39,7 +39,7 @@ object Transactional {
/**
* API MAY CHANGE
*
- * This source is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html)
+ * This source is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html)
* and [[Transactional.flowWithOffsetContext]].
*/
@ApiMayChange
@@ -151,7 +151,7 @@ object Transactional {
* carries [[ConsumerMessage.PartitionOffset]] as context. The flow requires a unique `transactional.id` across all app
* instances. The flow will override producer properties to enable Kafka exactly-once transactional support.
*
- * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html)
+ * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html)
* and [[Transactional.sourceWithOffsetContext]].
*/
@ApiMayChange
12 changes: 6 additions & 6 deletions core/src/main/scala/akka/kafka/scaladsl/Consumer.scala
@@ -26,7 +26,7 @@ object Consumer {
/**
* Materialized value of the consumer `Source`.
*
- * See [[https://doc.akka.io/docs/alpakka-kafka/current/consumer.html#controlled-shutdown Controlled shutdown]]
+ * See [[https://doc.akka.io/libraries/alpakka-kafka/current/consumer.html#controlled-shutdown Controlled shutdown]]
*/
trait Control {

@@ -36,7 +36,7 @@ object Consumer {
* already enqueued messages. It does not unsubscribe from any topics/partitions
* as that could trigger a consumer group rebalance.
*
- * See [[https://doc.akka.io/docs/alpakka-kafka/current/consumer.html#controlled-shutdown Controlled shutdown]]
+ * See [[https://doc.akka.io/libraries/alpakka-kafka/current/consumer.html#controlled-shutdown Controlled shutdown]]
*
* Call [[#shutdown]] to close consumer.
*/
@@ -49,7 +49,7 @@ object Consumer {
* from enqueued messages can be handled.
* The actor will wait for acknowledgements of the already sent offset commits from the Kafka broker before shutting down.
*
- * See [[https://doc.akka.io/docs/alpakka-kafka/current/consumer.html#controlled-shutdown Controlled shutdown]]
+ * See [[https://doc.akka.io/libraries/alpakka-kafka/current/consumer.html#controlled-shutdown Controlled shutdown]]
*/
def shutdown(): Future[Done]

@@ -93,7 +93,7 @@ object Consumer {
* one, so that the stream can be stopped in a controlled way without losing
* commits.
*
- * See [[https://doc.akka.io/docs/alpakka-kafka/current/consumer.html#controlled-shutdown Controlled shutdown]]
+ * See [[https://doc.akka.io/libraries/alpakka-kafka/current/consumer.html#controlled-shutdown Controlled shutdown]]
*/
final class DrainingControl[T] private (control: Control, val streamCompletion: Future[T]) extends Control {

@@ -198,7 +198,7 @@ object Consumer {
* This is useful when "at-least once delivery" is desired, as each message will likely be
* delivered one time but in failure cases could be duplicated.
*
- * It is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html),
+ * It is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html),
* [[Producer.flowWithContext]] and/or [[Committer.sinkWithOffsetContext]].
*/
@ApiMayChange
@@ -219,7 +219,7 @@ object Consumer {
* This is useful when "at-least once delivery" is desired, as each message will likely be
* delivered one time but in failure cases could be duplicated.
*
- * It is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html),
+ * It is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html),
* [[Producer.flowWithContext]] and/or [[Committer.sinkWithOffsetContext]].
*
* This variant makes it possible to add additional metadata (in the form of a string)
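The "flow with context" combination that these scaladocs refer to looks roughly like the following at-least-once sketch (broker address, group id, and topic names are assumptions for illustration):

```scala
import akka.actor.ActorSystem
import akka.kafka._
import akka.kafka.scaladsl.{Committer, Consumer, Producer}
import akka.kafka.scaladsl.Consumer.DrainingControl
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

object AtLeastOnce extends App {
  implicit val system: ActorSystem = ActorSystem("at-least-once")

  val consumerSettings = ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("example-group")
  val producerSettings = ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092")
  val committerSettings = CommitterSettings(system)

  // Offsets travel as stream context and are committed only after the
  // corresponding record has been produced to the target topic.
  val control = Consumer
    .sourceWithOffsetContext(consumerSettings, Subscriptions.topics("source-topic"))
    .map(record => ProducerMessage.single(new ProducerRecord("target-topic", record.key, record.value)))
    .via(Producer.flowWithContext(producerSettings))
    .toMat(Committer.sinkWithOffsetContext(committerSettings))(DrainingControl.apply)
    .run()
}
```

The materialized `DrainingControl` implements the controlled shutdown described in the scaladocs above: calling `control.drainAndShutdown()` stops polling, lets in-flight commits finish, and completes the stream.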
@@ -120,7 +120,7 @@ object DiscoverySupport {
system.dynamicAccess.getClassFor[AnyRef]("akka.discovery.Discovery$") match {
case Failure(_: ClassNotFoundException | _: NoClassDefFoundError) =>
throw new IllegalStateException(
s"Akka Discovery is being used but the `akka-discovery` library is not on the classpath, it must be added explicitly. See https://doc.akka.io/docs/alpakka-kafka/current/discovery.html"
s"Akka Discovery is being used but the `akka-discovery` library is not on the classpath, it must be added explicitly. See https://doc.akka.io/libraries/alpakka-kafka/current/discovery.html"
)
case _ =>
}
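For context, the documented way to combine these settings with Akka Discovery is roughly the following sketch (the `discovery-consumer` config section and service name are assumptions for illustration):

```scala
import akka.actor.ActorSystem
import akka.kafka.ConsumerSettings
import akka.kafka.scaladsl.DiscoverySupport
import org.apache.kafka.common.serialization.StringDeserializer

object DiscoveryExample extends App {
  implicit val system: ActorSystem = ActorSystem("discovery")

  // Assumes application.conf contains a section such as:
  //   discovery-consumer: ${akka.kafka.consumer} {
  //     service-name = "kafka-service"
  //   }
  val config = system.settings.config.getConfig("discovery-consumer")

  // Bootstrap servers are resolved via Akka Discovery just before the
  // consumer is created; akka-discovery must be on the classpath.
  val consumerSettings =
    ConsumerSettings(config, new StringDeserializer, new StringDeserializer)
      .withEnrichAsync(DiscoverySupport.consumerBootstrapServers(config))
}
```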
4 changes: 2 additions & 2 deletions core/src/main/scala/akka/kafka/scaladsl/Producer.scala
@@ -215,7 +215,7 @@ object Producer {
*
* - [[akka.kafka.ProducerMessage.PassThroughMessage PassThroughMessage]] does not publish anything, and continues in the stream as [[akka.kafka.ProducerMessage.PassThroughResult PassThroughResult]]
*
- * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html).
+ * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html).
*
* @tparam C the flow context type
*/
@@ -286,7 +286,7 @@ object Producer {
*
* - [[akka.kafka.ProducerMessage.PassThroughMessage PassThroughMessage]] does not publish anything, and continues in the stream as [[akka.kafka.ProducerMessage.PassThroughResult PassThroughResult]]
*
- * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html).
+ * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html).
*
* Supports sharing a Kafka Producer instance.
*
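For orientation, the three envelope types these docs mention can be constructed as in this sketch (topic, keys, values, and pass-through strings are made-up placeholders):

```scala
import akka.kafka.ProducerMessage
import org.apache.kafka.clients.producer.ProducerRecord

import scala.collection.immutable

object Envelopes {
  // Publishes one record; "ctx-1" reappears in the downstream Result.
  val single = ProducerMessage.single(new ProducerRecord("topic", "key", "v1"), "ctx-1")

  // Publishes several records that share one pass-through value.
  val multi = ProducerMessage.multi(
    immutable.Seq(
      new ProducerRecord("topic", "key", "v2"),
      new ProducerRecord("topic", "key", "v3")
    ),
    "ctx-2"
  )

  // Publishes nothing and continues downstream as PassThroughResult.
  val passThrough = ProducerMessage.passThrough[String, String, String]("ctx-3")
}
```

Envelopes are then pushed through `Producer.flexiFlow` or, with context propagation, `Producer.flowWithContext`, and each pass-through value returns attached to its matching `Results` element.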
4 changes: 2 additions & 2 deletions core/src/main/scala/akka/kafka/scaladsl/Transactional.scala
@@ -42,7 +42,7 @@ object Transactional {
/**
* API MAY CHANGE
*
- * This source is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html)
+ * This source is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html)
* and [[Transactional.flowWithOffsetContext]].
*/
@ApiMayChange
@@ -152,7 +152,7 @@ object Transactional {
* carries [[ConsumerMessage.PartitionOffset]] as context. The flow requires a unique `transactional.id` across all app
* instances. The flow will override producer properties to enable Kafka exactly-once transactional support.
*
- * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/docs/akka/current/stream/operators/Flow/asFlowWithContext.html)
+ * This flow is intended to be used with Akka's [flow with context](https://doc.akka.io/libraries/akka-core/current/stream/operators/Flow/asFlowWithContext.html)
* and [[Transactional.sourceWithOffsetContext]].
*/
@ApiMayChange
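A consume-transform-produce sketch in the shape these docs describe (topic names and the transactional id are placeholders, and the exact `Transactional.sink` signature should be checked against the release in use):

```scala
import akka.actor.ActorSystem
import akka.kafka._
import akka.kafka.scaladsl.Transactional
import akka.kafka.scaladsl.Consumer.DrainingControl
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

object TransactionalExample extends App {
  implicit val system: ActorSystem = ActorSystem("transactional")

  val consumerSettings = ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("transactional-group")
  val producerSettings = ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092")

  // Offsets are committed inside the same Kafka transaction as the produced
  // records, giving exactly-once semantics from source topic to sink topic.
  val control = Transactional
    .source(consumerSettings, Subscriptions.topics("source-topic"))
    .map { msg =>
      ProducerMessage.single(
        new ProducerRecord("sink-topic", msg.record.key, msg.record.value),
        msg.partitionOffset
      )
    }
    .toMat(Transactional.sink(producerSettings, "transactional-id"))(DrainingControl.apply)
    .run()
}
```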
