Packages

package writer

Implicit conversions between DStreams, RDDs, Datasets, DataFrames and their KafkaWriter counterparts.


Type Members

  1. class DStreamKafkaWriter[T] extends KafkaWriter[T] with Serializable

    Class used for writing DStreams to Kafka

  2. class DatasetKafkaWriter[T] extends KafkaWriter[T] with Serializable

    Class used for writing Datasets to Kafka

  3. abstract class KafkaWriter[T] extends Serializable

    Class used to write DStreams, RDDs and Datasets to Kafka (a sketch of the shared writeToKafka signature follows this list).

    Example usage:

    import com.github.benfradet.spark.kafka.writer.KafkaWriter._
    import org.apache.kafka.clients.producer.ProducerRecord
    import org.apache.kafka.common.serialization.StringSerializer
    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.{DataFrame, Dataset}
    import org.apache.spark.streaming.dstream.DStream
    
    val topic = "my-topic"
    val producerConfig = Map(
      "bootstrap.servers" -> "127.0.0.1:9092",
      "key.serializer" -> classOf[StringSerializer].getName,
      "value.serializer" -> classOf[StringSerializer].getName
    )
    
    val dStream: DStream[String] = ...
    dStream.writeToKafka(
      producerConfig,
      s => new ProducerRecord[String, String](topic, s)
    )
    
    val rdd: RDD[String] = ...
    rdd.writeToKafka(
      producerConfig,
      s => new ProducerRecord[String, String](topic, s)
    )
    
    val dataset: Dataset[Foo] = ...
    dataset.writeToKafka(
      producerConfig,
      f => new ProducerRecord[String, String](topic, f.toString)
    )
    
    val dataFrame: DataFrame = ...
    dataFrame.writeToKafka(
      producerConfig,
      r => new ProducerRecord[String, String](topic, r.get(0).toString)
    )

    It is also possible to provide a callback for each write to Kafka; the callback is optional and defaults to None.

    Example usage:

    import org.apache.kafka.clients.producer.{Callback, RecordMetadata}
    
    @transient lazy val log = org.apache.log4j.Logger.getLogger("spark-kafka-writer")
    
    val dStream: DStream[String] = ...
    dStream.writeToKafka(
      producerConfig,
      s => new ProducerRecord[String, String](topic, s),
      Some(new Callback with Serializable {
        override def onCompletion(metadata: RecordMetadata, e: Exception): Unit = {
          if (Option(e).isDefined) {
            log.warn("error sending message", e)
          } else {
            log.info(s"write succeeded, offset: ${metadata.offset()}")
          }
        }
      })
    )
  4. class RDDKafkaWriter[T] extends KafkaWriter[T] with Serializable

    Class used for writing RDDs to Kafka
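
The writer classes above share a single entry point. Pieced together from the examples, its shape is plausibly the sketch below; the class name and "extends Serializable" come from the docs, but the exact writeToKafka signature is an inference rather than a copy of the source.

    import org.apache.kafka.clients.producer.{Callback, ProducerRecord}
    
    // Sketch only: every writer turns each element of type T into a
    // ProducerRecord[K, V] and sends it, optionally reporting the result
    // of each send to the given callback.
    abstract class KafkaWriter[T] extends Serializable {
      def writeToKafka[K, V](
        producerConfig: Map[String, Object],
        transformFunc: T => ProducerRecord[K, V],
        callback: Option[Callback] = None
      ): Unit
    }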

Value Members

  1. implicit def dStreamToKafkaWriter[T, K, V](dStream: DStream[T])(implicit arg0: ClassTag[T]): KafkaWriter[T]

    Convert a DStream to a KafkaWriter implicitly (see the usage sketch after this list).

    dStream
      DStream to be converted
    returns
      KafkaWriter ready to write messages to Kafka

  2. implicit def dataFrameToKafkaWriter[K, V](dataFrame: DataFrame): KafkaWriter[Row]

    Convert a DataFrame to a KafkaWriter implicitly

    dataFrame
      DataFrame to be converted
    returns
      KafkaWriter ready to write messages to Kafka

  3. implicit def datasetToKafkaWriter[T, K, V](dataset: Dataset[T])(implicit arg0: ClassTag[T]): KafkaWriter[T]

    Convert a Dataset to a KafkaWriter implicitly

    dataset
      Dataset to be converted
    returns
      KafkaWriter ready to write messages to Kafka

  4. implicit def rddToKafkaWriter[T, K, V](rdd: RDD[T])(implicit arg0: ClassTag[T]): KafkaWriter[T]

    Convert an RDD to a KafkaWriter implicitly

    rdd
      RDD to be converted
    returns
      KafkaWriter ready to write messages to Kafka

  5. object KafkaProducerCache

    Cache of KafkaProducers (a rough sketch of the caching pattern follows this list).
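
All four conversions follow the same pattern: once the implicits are imported, writeToKafka becomes available directly on the source abstraction. A minimal usage sketch, assuming a running StreamingContext; stream and config are hypothetical values:

    import com.github.benfradet.spark.kafka.writer.KafkaWriter._
    import org.apache.kafka.clients.producer.ProducerRecord
    import org.apache.spark.streaming.dstream.DStream
    
    // dStreamToKafkaWriter resolves implicitly here, turning the DStream
    // into a KafkaWriter so that writeToKafka can be called on it; the
    // same applies to RDD, Dataset and DataFrame.
    def send(stream: DStream[String], config: Map[String, Object]): Unit =
      stream.writeToKafka(
        config,
        s => new ProducerRecord[String, String]("my-topic", s)
      )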
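
The docs do not show KafkaProducerCache's members. The sketch below illustrates the caching pattern the name implies: one producer kept per distinct config and shared across writes instead of being recreated for every partition. The getProducer name and the concurrent-map backing are assumptions, not the library's actual API.

    import java.util.concurrent.ConcurrentHashMap
    import org.apache.kafka.clients.producer.KafkaProducer
    import scala.collection.JavaConverters._ // scala.jdk.CollectionConverters on Scala 2.13
    
    // Hypothetical sketch of a producer cache: creating a KafkaProducer is
    // expensive, so one instance per distinct config is kept and reused.
    // A real cache would also need close/expiry handling, omitted here.
    object ProducerCacheSketch {
      private val cache =
        new ConcurrentHashMap[Map[String, Object], KafkaProducer[_, _]]()
    
      def getProducer[K, V](config: Map[String, Object]): KafkaProducer[K, V] =
        cache
          .computeIfAbsent(config, c => new KafkaProducer[K, V](c.asJava))
          .asInstanceOf[KafkaProducer[K, V]]
    }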
