Commit 76351b3: Fix links to tests

krasserm committed Feb 1, 2015
1 parent 3f92c88
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -21,7 +21,7 @@ Dependencies
Event batch processing
----------------------

- With `akka-analytics-cassandra` you can expose and process events written by **all** persistent actors as a [resilient distributed dataset](http://spark.apache.org/docs/latest/programming-guide.html#resilient-distributed-datasets-rdds) (`RDD`). It uses the [Spark Cassandra Connector](https://github.com/datastax/spark-cassandra-connector) to fetch data from the Cassandra journal. Here's a primitive example (details [here](https://github.com/krasserm/akka-analytics/blob/master/akka-analytics-kafka/src/test/scala/akka/analytics/kafka/IntegrationSpec.scala)):
+ With `akka-analytics-cassandra` you can expose and process events written by **all** persistent actors as a [resilient distributed dataset](http://spark.apache.org/docs/latest/programming-guide.html#resilient-distributed-datasets-rdds) (`RDD`). It uses the [Spark Cassandra Connector](https://github.com/datastax/spark-cassandra-connector) to fetch data from the Cassandra journal. Here's a primitive example (details [here](https://github.com/krasserm/akka-analytics/blob/master/akka-analytics-cassandra/src/test/scala/akka/analytics/cassandra/IntegrationSpec.scala)):

```scala
import org.apache.spark.rdd.RDD
```

@@ -60,7 +60,7 @@ Events for a given `persistenceId` are partitioned across nodes in the Cassandra
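The diff collapses the rest of the README's example, so only the first import is visible. As context for the link being fixed, a fuller sketch of batch-processing the Cassandra journal might look like the following. It assumes the `akka.analytics.cassandra._` import provides an `eventTable()` method on `SparkContext` yielding `(JournalKey, Any)` pairs, as described in the project README; the app name, master, and connection host are hypothetical local-mode settings, not taken from this commit.

```scala
// Sketch only: assumes the akka-analytics-cassandra API (eventTable() added to
// SparkContext by importing akka.analytics.cassandra._). Connection settings
// are hypothetical values for a local experiment.
import org.apache.spark.rdd.RDD
import org.apache.spark.{ SparkConf, SparkContext }
import akka.analytics.cassandra._

object EventBatchExample extends App {
  val conf = new SparkConf()
    .setAppName("event-batch-example")
    .setMaster("local[4]")                                   // local mode (assumed)
    .set("spark.cassandra.connection.host", "127.0.0.1")     // journal node (assumed)

  val sc = new SparkContext(conf)

  // Events of all persistent actors, keyed by (persistenceId, partition, sequenceNr)
  val es: RDD[(JournalKey, Any)] = sc.eventTable().cache()

  // Example batch computation: count events per persistenceId
  es.map { case (key, _) => (key.persistenceId, 1L) }
    .reduceByKey(_ + _)
    .collect()
    .foreach(println)

  sc.stop()
}
```

Running this requires a Cassandra node holding an akka-persistence-cassandra journal, so it is illustrative rather than self-testing.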
Event stream processing
-----------------------

- With `akka-analytics-kafka` you can expose and process events written by **all** persistent actors (more specifically, from any [user-defined topic](https://github.com/krasserm/akka-persistence-kafka#user-defined-topics)) as a [discretized stream](http://spark.apache.org/docs/latest/streaming-programming-guide.html#dstreams) (`DStream`). Here's a primitive example (details [here](https://github.com/krasserm/akka-analytics/blob/master/akka-analytics-cassandra/src/test/scala/akka/analytics/cassandra/IntegrationSpec.scala)):
+ With `akka-analytics-kafka` you can expose and process events written by **all** persistent actors (more specifically, from any [user-defined topic](https://github.com/krasserm/akka-persistence-kafka#user-defined-topics)) as a [discretized stream](http://spark.apache.org/docs/latest/streaming-programming-guide.html#dstreams) (`DStream`). Here's a primitive example (details [here](https://github.com/krasserm/akka-analytics/blob/master/akka-analytics-kafka/src/test/scala/akka/analytics/kafka/IntegrationSpec.scala)):

```scala
import org.apache.spark.SparkConf
```
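This example is likewise collapsed in the diff view. A sketch of the stream-processing counterpart might look like the following. It assumes the `akka.analytics.kafka._` import provides an `eventStream()` method on `StreamingContext` yielding a `DStream[Event]`, per the project README; the consumer parameters, broker address, and topic map are hypothetical, not taken from this commit.

```scala
// Sketch only: assumes the akka-analytics-kafka API (eventStream() added to
// StreamingContext by importing akka.analytics.kafka._). Kafka consumer
// parameters and topic names are hypothetical.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{ Seconds, StreamingContext }
import org.apache.spark.streaming.dstream.DStream
import akka.analytics.kafka._
import akka.persistence.kafka.Event

object EventStreamExample extends App {
  val sparkConf = new SparkConf()
    .setAppName("event-stream-example")
    .setMaster("local[4]")                     // local mode (assumed)

  // 1-second micro-batches
  val ssc = new StreamingContext(sparkConf, Seconds(1))

  // Kafka consumer parameters (assumed values)
  val kafkaParams = Map[String, String](
    "group.id"          -> "example-consumer",
    "auto.offset.reset" -> "smallest")

  // Stream events from a user-defined topic, read with 2 consumer threads
  val es: DStream[Event] = ssc.eventStream(kafkaParams, Map("events" -> 2))

  es.foreachRDD(_.collect().foreach(println))

  ssc.start()
  ssc.awaitTermination()
}
```

As with the batch example, this needs a running Kafka broker fed by akka-persistence-kafka, so it is a sketch rather than a runnable test.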
