package writer
Implicit conversions between
- DStream -> KafkaWriter
- RDD -> KafkaWriter
- Dataset -> KafkaWriter
- DataFrame -> KafkaWriter
Type Members
- class DStreamKafkaWriter[T] extends KafkaWriter[T] with Serializable
Class used for writing DStreams to Kafka
- class DatasetKafkaWriter[T] extends KafkaWriter[T] with Serializable
Class used for writing Datasets to Kafka
- abstract class KafkaWriter[T] extends Serializable
Class used to write DStreams, RDDs and Datasets to Kafka
Example usage:
import com.github.benfradet.spark.kafka.writer.KafkaWriter._
import org.apache.kafka.common.serialization.StringSerializer

val topic = "my-topic"
val producerConfig = Map(
  "bootstrap.servers" -> "127.0.0.1:9092",
  "key.serializer" -> classOf[StringSerializer].getName,
  "value.serializer" -> classOf[StringSerializer].getName
)

val dStream: DStream[String] = ...
dStream.writeToKafka(
  producerConfig,
  s => new ProducerRecord[String, String](topic, s)
)

val rdd: RDD[String] = ...
rdd.writeToKafka(
  producerConfig,
  s => new ProducerRecord[String, String](topic, s)
)

val dataset: Dataset[Foo] = ...
dataset.writeToKafka(
  producerConfig,
  f => new ProducerRecord[String, String](topic, f.toString)
)

val dataFrame: DataFrame = ...
dataFrame.writeToKafka(
  producerConfig,
  r => new ProducerRecord[String, String](topic, r.get(0).toString)
)
It is also possible to provide a callback that is invoked for each write to Kafka.
The callback is optional and defaults to None.
Example usage:
@transient val log = org.apache.log4j.Logger.getLogger("spark-kafka-writer")

val dStream: DStream[String] = ...
dStream.writeToKafka(
  producerConfig,
  s => new ProducerRecord[String, String](topic, s),
  Some(new Callback with Serializable {
    override def onCompletion(metadata: RecordMetadata, e: Exception): Unit = {
      if (Option(e).isDefined) {
        log.warn("error sending message", e)
      } else {
        log.info(s"write succeeded, offset: ${metadata.offset()}")
      }
    }
  })
)
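For orientation, the contract that every concrete writer implements can be sketched as follows. This is inferred from the examples above; the parameter names are assumptions, not the library's exact declaration:

// Sketch of the KafkaWriter contract implied by the examples above;
// parameter names are assumptions, not the library's exact declaration.
abstract class KafkaWriter[T] extends Serializable {
  def writeToKafka[K, V](
    producerConfig: Map[String, Object],
    transformFunc: T => ProducerRecord[K, V],
    callback: Option[Callback] = None  // optional per-record callback, None by default
  ): Unit
}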
- class RDDKafkaWriter[T] extends KafkaWriter[T] with Serializable
Class used for writing RDDs to Kafka
Value Members
- implicit def dStreamToKafkaWriter[T, K, V](dStream: DStream[T])(implicit arg0: ClassTag[T]): KafkaWriter[T]
Convert a DStream to a KafkaWriter implicitly
- dStream
DStream to be converted
- returns
KafkaWriter ready to write messages to Kafka
- implicit def dataFrameToKafkaWriter[K, V](dataFrame: DataFrame): KafkaWriter[Row]
Convert a DataFrame to a KafkaWriter implicitly
- dataFrame
DataFrame to be converted
- returns
KafkaWriter ready to write messages to Kafka
- implicit def datasetToKafkaWriter[T, K, V](dataset: Dataset[T])(implicit arg0: ClassTag[T]): KafkaWriter[T]
Convert a Dataset to a KafkaWriter implicitly
- dataset
Dataset to be converted
- returns
KafkaWriter ready to write messages to Kafka
- implicit def rddToKafkaWriter[T, K, V](rdd: RDD[T])(implicit arg0: ClassTag[T]): KafkaWriter[T]
Convert an RDD to a KafkaWriter implicitly
- rdd
RDD to be converted
- returns
KafkaWriter ready to write messages to Kafka
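Because these are ordinary implicit conversion functions defined in this package, they can also be invoked explicitly when that reads more clearly or when an implicit is ambiguous. A minimal sketch, assuming the conversions are brought into scope with a wildcard import of this package (variable and topic names are illustrative):

import com.github.benfradet.spark.kafka.writer._
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer
import org.apache.spark.rdd.RDD

val producerConfig = Map(
  "bootstrap.servers" -> "127.0.0.1:9092",
  "key.serializer" -> classOf[StringSerializer].getName,
  "value.serializer" -> classOf[StringSerializer].getName
)

val rdd: RDD[String] = ...

// Implicit conversion: rdd is turned into a KafkaWriter[String] behind the scenes.
rdd.writeToKafka(
  producerConfig,
  s => new ProducerRecord[String, String]("my-topic", s)
)

// Explicit conversion: same effect, with the KafkaWriter named.
val writer: KafkaWriter[String] = rddToKafkaWriter(rdd)
writer.writeToKafka(
  producerConfig,
  s => new ProducerRecord[String, String]("my-topic", s)
)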
- object KafkaProducerCache
Cache of KafkaProducers
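The writers reuse Kafka producers through this cache instead of creating a new producer for every partition and batch. As an illustration of the idea only (not the library's actual implementation), a producer cache keyed by its configuration can be as simple as:

import java.util.concurrent.ConcurrentHashMap
import scala.collection.JavaConverters._
import org.apache.kafka.clients.producer.KafkaProducer

// Illustrative sketch of a producer cache keyed by configuration;
// the real KafkaProducerCache may differ.
object SimpleProducerCache {
  private val producers =
    new ConcurrentHashMap[Map[String, Object], KafkaProducer[_, _]]()

  def getProducer[K, V](config: Map[String, Object]): KafkaProducer[K, V] =
    producers.computeIfAbsent(
      config,
      conf => new KafkaProducer[K, V](conf.asJava)
    ).asInstanceOf[KafkaProducer[K, V]]
}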