It took me a few hours to connect Zeppelin, Spark, and MongoDB. I couldn't find a solution to this problem online; hence this short entry.
First, I added a dependency to the MongoDB Connector for Spark in my Zeppelin notebook.
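For reference, one way to add a dependency from a Zeppelin notebook is via the `%dep` interpreter, which must run before the Spark interpreter starts (the artifact coordinates and version below are an assumption; pick the connector version matching your Spark and Scala versions):

```
%dep
// Hypothetical example coordinates; adjust the Scala/connector versions to your cluster.
z.load("org.mongodb.spark:mongo-spark-connector_2.11:2.4.1")
```

Alternatively, the same artifact can be listed under the Spark interpreter's Dependencies section in Zeppelin's interpreter settings.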
This gave:
java.lang.IllegalArgumentException: Missing database name. Set via the 'spark.mongodb.input.uri' or 'spark.mongodb.input.database' property
Then, after realizing that you cannot dynamically reconfigure a running SparkContext, I used the interpreter settings GUI to set the property instead.
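Concretely, the property I added in the Spark interpreter settings looked along these lines (the host, database, and collection names here are placeholders, not the ones I actually used):

```
spark.mongodb.input.uri = mongodb://localhost:27017/test.myCollection
```

Restarting the interpreter after saving is required for the new SparkContext to pick the property up.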
It is working well now!
rdd: com.mongodb.spark.rdd.MongoRDD[org.bson.Document] = MongoRDD at RDD at MongoRDD.scala:47
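For completeness, a notebook paragraph producing an RDD like the one above can be sketched as follows, using the connector's `MongoSpark` helper. This assumes the `spark.mongodb.input.uri` property set earlier points at an existing collection; `sc` is the SparkContext that Zeppelin provides.

```scala
import com.mongodb.spark.MongoSpark
import org.bson.Document

// Load the collection configured via spark.mongodb.input.uri
// into an RDD of BSON documents.
val rdd = MongoSpark.load(sc)

// Basic sanity checks on the loaded data.
println(rdd.count())
val first: Document = rdd.first()
println(first.toJson())
```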