Thursday, December 14, 2017

Debugging Spike with CLion/CMake

Another quick entry.  I released a template project on GitHub that lets you build, run, and debug Spike, the RISC-V ISA simulator, using CLion.

https://github.com/edcote/spike-cmake

Clone, import the template, and enjoy!
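For context, Spike itself builds with autoconf (configure/make), so a CMake wrapper mainly has to drive that build so CLion can index and debug the sources. This is a hypothetical sketch of such a wrapper, not the actual contents of the template repository:

```cmake
# Hypothetical sketch: wrap Spike's autoconf build in CMake so CLion
# can index and debug it. Repository URL is real; the wrapper layout
# and target name are assumptions.
cmake_minimum_required(VERSION 3.8)
project(spike-cmake)

include(ExternalProject)
ExternalProject_Add(riscv-isa-sim
  GIT_REPOSITORY https://github.com/riscv/riscv-isa-sim.git
  CONFIGURE_COMMAND <SOURCE_DIR>/configure --prefix=<INSTALL_DIR>
  BUILD_COMMAND make
  INSTALL_COMMAND make install
)
```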


Wednesday, December 13, 2017

CLion Kool-Aid

Doesn't it feel good to purchase a product or service from a company and be genuinely happy with it?
I resisted buying this product for a few years; heck, at one point I even tried to pirate it (for personal use).

CLion!

https://www.jetbrains.com/clion/

That's all.  CLion is an incredible tool for C++ development, and you should buy it too.  Integration with Linux/Unix development tools is seamless: git, gdb, cmake, clang, etc.  Most importantly, using the product doesn't interfere with colleagues who choose more traditional development styles.

Monday, November 27, 2017

Happy Thanksgiving and Gratefuls

Happy belated Thanksgiving!

I am very thankful and happy this year.  I am grateful for the courage to temporarily leave corporate life and work on my own projects, with the aim of launching a business.  I have spent the bulk of my career at Fortune 100 companies and, frankly, do not hate it.  It is incredibly rewarding to work alongside top engineers and management teams.  I arrived at work this morning grateful to have hit "soft reset" on my career.  The next year or so will be about collaboration and my gradual re-introduction into the corporate world.

Wednesday, November 22, 2017

Connecting Zeppelin, Spark, and MongoDB

It took me a few hours to connect Zeppelin, Spark, and MongoDB.  I couldn't find a solution to this problem online; hence this short entry.

First, I added a dependency to the MongoDB Connector for Spark in my Zeppelin notebook.


%dep
z.reset()
z.load("org.mongodb.spark:mongo-spark-connector_2.10:2.2.0")

%spark
import com.mongodb.spark._
import com.mongodb.spark.rdd.MongoRDD
val rdd = MongoSpark.load(sc)

This gave:

java.lang.IllegalArgumentException: Missing database name. Set via the 'spark.mongodb.input.uri' or 'spark.mongodb.input.database' property

Then, after realizing that you cannot dynamically reconfigure a running SparkContext, I used the Zeppelin interpreter settings GUI to set the property.
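Concretely, the property to add in the Spark interpreter configuration is spark.mongodb.input.uri.  The host, database, and collection names below are placeholders:

```
spark.mongodb.input.uri    mongodb://localhost:27017/mydb.mycollection
```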


It is working well now!

rdd: com.mongodb.spark.rdd.MongoRDD[org.bson.Document] = MongoRDD[0] at RDD at MongoRDD.scala:47
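With the property in place, the MongoRDD behaves like any other Spark RDD.  A minimal usage sketch, assuming a running MongoDB instance and a populated collection behind the configured URI:

```scala
%spark
// Assumes spark.mongodb.input.uri points at an existing collection.
import com.mongodb.spark._

val rdd = MongoSpark.load(sc)
println(rdd.count())          // number of documents in the collection
rdd.take(1).foreach(println)  // print the first BSON document, if any
```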

Tuesday, November 7, 2017

Early November update: DARPA, GCE, open source release

It has been a busy few weeks, and it is not going to let up in November.  No big deal, because I am enjoying startup life.  My biggest challenge is not working too much.  I am in the process of applying for a DARPA research grant for the company.

I have been looking to scale my design environment to the cloud.  On a friend's recommendation, I evaluated Google Compute Engine. Wow! I was up and running in seconds.  I had intended to buy an additional workstation to serve as a build server, but the $20,000 in free startup credits has me thinking otherwise.

I open-sourced my design environment to support the effort.  Here is a link to the public repository: https://bitbucket.org/ecote/nade.

Here is a picture of an FPGA that I produced.


I also recently subscribed to Grammarly.  I am incredibly impressed.