Provide a java-friendly API

See original GitHub issue
  • work on JavaDStream
  • not (?) rely on implicits
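
In other words, the request is for an entry point that can be called from plain Java: it should work on a JavaDStream and need neither Scala implicits nor Scala collection/function types. A purely hypothetical sketch of what such an API could look like (the names JavaKafkaWriter and SerializableFunction below are invented for illustration and are not part of the library):

import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.spark.streaming.api.java.JavaDStream;

import java.io.Serializable;
import java.util.Map;

// Hypothetical sketch only; not the library's actual API.
public interface JavaKafkaWriter<T> {

    // A plain java.util.Map for the producer config and a serializable Java
    // functional interface instead of scala.Function1 and immutable.Map,
    // so callers need no Scala implicits.
    <K, V> void writeToKafka(JavaDStream<T> dstream,
                             Map<String, Object> producerConfig,
                             SerializableFunction<T, ProducerRecord<K, V>> transformFunc);

    @FunctionalInterface
    interface SerializableFunction<T, R> extends Serializable {
        R apply(T t);
    }
}

Having the functional interface extend Serializable up front is what the workarounds discussed in the comments below achieve manually with AbstractFunction1.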

Issue Analytics

  • State: open
  • Created 6 years ago
  • Comments: 18 (14 by maintainers)

Top GitHub Comments

1 reaction
BenFradet commented, Jun 15, 2017

Have you tried having AbstractFunction1 extend Serializable?

// helper name is illustrative: a Java anonymous class can extend only one type
abstract class SerializableFunction1<T, R> extends AbstractFunction1<T, R> implements Serializable {}

Function1<String, ProducerRecord<String, String>> f =
        new SerializableFunction1<String, ProducerRecord<String, String>>() {
            @Override
            public ProducerRecord<String, String> apply(final String s) {
                return new ProducerRecord<>("my-topic", s);
            }
        };
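
The helper class is needed because a Java anonymous class can extend only one type, and Spark serializes the transform function when it ships tasks to executors, so the Function1 implementation must also be Serializable. The follow-up comment below takes the same approach with its SerializableFunc class.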

0 reactions
rfmvlc commented, Dec 20, 2017

Hello Ben,

Doesn’t it work on Java RDDs? -> @BenFradet work on JavaDStream…

@sunone5 @huylv

Do you have any working sample in Java, please?

I’ve got a mismatch error in the function:

 required: scala.collection.immutable.Map<String,Object>,Function1<String,ProducerRecord<K,V>>,Option<Callback>
  found: scala.collection.mutable.Map<String,Object>,<anonymous SerializableFunc<String,ProducerRecord<String,String>>>,Option<Object>
  reason: no instance(s) of type variable(s) A,B exist so that scala.collection.mutable.Map<A,B> conforms to scala.collection.immutable.Map<String,Object>
  where K,V,T,A,B are type-variables:
    K extends Object declared in method <K,V>writeToKafka(scala.collection.immutable.Map<String,Object>,Function1<T,ProducerRecord<K,V>>,Option<Callback>)
    V extends Object declared in method <K,V>writeToKafka(scala.collection.immutable.Map<String,Object>,Function1<T,ProducerRecord<K,V>>,Option<Callback>)
    T extends Object declared in class KafkaWriter
    A extends Object declared in method <A,B>mapAsScalaMap(java.util.Map<A,B>)
    B extends Object declared in method <A,B>mapAsScalaMap(java.util.Map<A,B>)

with this sample code:

// imports


import com.github.benfradet.spark.kafka.writer.DStreamKafkaWriter;
import com.github.benfradet.spark.kafka.writer.KafkaWriter;
import com.github.benfradet.spark.kafka.writer.RDDKafkaWriter;
import org.apache.commons.lang.math.RandomUtils;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;
import scala.Function1;
import scala.Option;
import scala.collection.JavaConversions;

import java.util.Calendar;
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;


KafkaWriter<String> kafkaWriter =
        new RDDKafkaWriter<>(lines.rdd(), scala.reflect.ClassTag$.MODULE$.apply(String.class));

kafkaWriter.writeToKafka(JavaConversions.mapAsScalaMap(producerConfig),
        new SerializableFunc<String, ProducerRecord<String, String>>() {
            @Override
            public ProducerRecord<String, String> apply(final String s) {
                return new ProducerRecord<>("sometopic", s);
            }
        },
        Option.empty()
);




import scala.Serializable;
import scala.runtime.AbstractFunction1;

// SerializableFunc combines AbstractFunction1 with Serializable so that Spark
// can serialize the transform function when shipping tasks to executors.
abstract class SerializableFunc<T, R> extends AbstractFunction1<T, R> implements Serializable {}


Cheers!
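
For reference, the two mismatches reported in the error above are the mutable Scala map produced by JavaConversions.mapAsScalaMap (where writeToKafka wants scala.collection.immutable.Map) and the Option<Object> inferred for a bare Option.empty() (where Option<Callback> is expected). A rough sketch of a possible workaround, assuming Scala 2.11/2.12 and reusing producerConfig, kafkaWriter and SerializableFunc from the sample above:

import org.apache.kafka.clients.producer.Callback;

// Build an immutable Scala map directly instead of going through
// JavaConversions.mapAsScalaMap, which returns a mutable one.
scala.collection.immutable.Map<String, Object> scalaConfig =
        new scala.collection.immutable.HashMap<String, Object>();
for (Map.Entry<String, Object> e : producerConfig.entrySet()) {
    // updated(...) returns a new immutable map containing the extra entry
    scalaConfig = scalaConfig.updated(e.getKey(), e.getValue());
}

kafkaWriter.writeToKafka(
        scalaConfig,
        new SerializableFunc<String, ProducerRecord<String, String>>() {
            @Override
            public ProducerRecord<String, String> apply(final String s) {
                return new ProducerRecord<>("sometopic", s);
            }
        },
        // a bare Option.empty() is inferred as Option<Object>; pin it to Callback
        Option.<Callback>empty()
);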

Read more comments on GitHub >

Top Results From Across the Web

  • Provide a Java-friendly way to use the Context API #115
    Currently, the context API of the FEEL engine has only a Scala API. It is not so easy to use this API with...
  • How to provide a Java-friendly interface for my Scala code?
    I'd have this Scala code: javaf(b: ArrayList[Pair[String, Int]) = scalaf(b.map(p => (p.getLeft, p.getRight)). Then Java people would call ...
  • Package org.apache.spark.streaming.api.java
    A Java-friendly interface to a DStream of key-value pairs, which provides extra methods like reduceByKey and join. JavaPairInputDStream<K,V>.
  • Visual Recognition for Java - A Java-Friendly ML API at ...
    JSR381 – Visual Recognition for Java – A Java-Friendly ML API at jChampions Conference. Conferences, Java, Machine Learning, Other / By Zoran Sevarac...
  • twsapi@groups.io | Java friendly API brokerages
    2) What brokerages offer truly Java friendly API's (other than IB) as their higher margins and questionable termination policies force me to
