
Monday, June 8, 2015

Monday Night Hack #16 - N/A

I reached the anti-climactic end of the Google foobar challenge.  Quite unfortunate.  Last night, I was working on a level 3 problem (spring_cleaning).  I had 4 of 5 test cases passing and, fast-forward a day, a good hunch as to why the last test case was failing.


I found out after the fact that Google was using this challenge as a recruiting tool.  Brilliant!  I honestly had no idea.  I am not sure how it works, but I would assume that breaking level 5 in the predefined period of time might indicate an exceptional computer scientist - at which point they'd engage their army of recruiters :P  I am humbled by those that can pull that off.

Tomorrow, I will be participating in Qualcomm's (my employer) first ever Data Mining Competition.  I'll spend the rest of the evening working on CS100.1x in preparation.

Wednesday, June 3, 2015

Wednesday Night Hack #15 - foobar and CS100.1x

Tonight, I spent time on Google's foobar coding challenge.  It was a good opportunity to hone my Python skills.  I'll update once I crack level 3.


I have also registered for and started another massive open online course, this time CS100.1x, Introduction to Big Data with Apache Spark, offered by BerkeleyX.  The time commitment for this one looks minimal; the syllabus estimates 25 hours over 5 weeks.  I'll post a recap afterwards.

Monday, May 4, 2015

Comments about MITx 6.00.2x

This was my first massive open online course that I carried through to completion.

What were my thoughts?  Mixed.

To be honest, it was hard to stay motivated throughout.  The course also suffered from a difficulty ramp: the first two-thirds were easy, then the last third hits you hard.  The overall difficulty level was pretty low for me.

The finger exercises were great because they gave immediate feedback on whether what you learned in the lecture was sufficient.

The problem sets suffered from one major problem in addition to the difficulty ramp: if, for some reason, you had trouble passing the input vectors for the first part of a problem set, you were toast for the rest.  That was frustrating - a frustration that could have been mitigated if I had spent time connecting with other students in the discussion forums.

Honestly, the greatest thing about this course is that I gave no thought whatsoever to the grade I'd eventually receive.  Instead, I focused on learning what I could.  What a refreshing and liberating feeling.

What I got out of the course was an introduction (or re-introduction) to some CS topics that weren't properly covered during my EE/CE studies.  The subject material was presented using Python, which I thought was a wonderful choice.

The course succeeded in its goal of teaching the basics of data science.  I now have a better understanding of stochastic programs / Monte Carlo simulations (using randomness in computations), data visualization and curve fitting using pylab, knapsack and graph optimization problems, and machine learning - well, sort of for that one.
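As a concrete illustration of "using randomness in computations", here is a minimal Monte Carlo sketch (my own example, not from the course material) that estimates pi by sampling random points in the unit square:

```python
import random

def estimate_pi(samples=100_000, seed=42):
    """Estimate pi by counting how many random points in the unit
    square land inside the quarter circle of radius 1."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / samples

print(estimate_pi())  # close to 3.14159; accuracy improves with more samples
```

The estimate's error shrinks roughly as 1/sqrt(samples), which is the whole point of the technique: trade exactness for a computation that works on problems with no closed form.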

I personally learn best from working on open-ended problems, not contrived examples.  The course could be improved by adding such a component.

As a final note, because of the lack of an individual project component, there's a good chance I'll forget most of the details of what I learned :)  The redeeming factor is, without question, the wonderful textbook that I can go back and refer to.  The process of sitting through all the lectures and trying (but not completing) all the problem sets left my copy of the textbook nicely highlighted throughout.  This will make it easier for me if I ever need to write code to, say, apply machine learning to making sense of functional coverage data.

Wednesday, February 25, 2015

Wednesday Night Hack - Next Two Months

I'll be busy in my spare time for the next 9 weeks.  I decided to sign up for the MITx course "Introduction to Computational Thinking and Data Science".  The cost for the verified certificate track was a $50 donation, which I did not mind paying whatsoever.

Here is the list of lecture topics.  It should be fun!!

  • Plotting
  • Simulations and Random Walks
  • Probability
  • Stochastic Programming and Hashing
  • Monte Carlo Simulations
  • Using Randomness to Solve Non-Random Problems
  • Curve Fitting
  • Knapsack Problem
  • Graphs and Graph Optimization
  • Machine Learning
  • Statistical Fallacies
  • Research videos

Wednesday, February 18, 2015

Wednesday Night Hack #11 - Experimenting with PyEDA

Not many updates lately.  Work and life are good.  I have probably been working on this for about 2 weeks.

Here is a short write up on an experiment I did using PyEDA for the purpose of developing a "poor-man's" random constraint solver using Python.

At its core, PyEDA is a library that provides support for the specification of complex Boolean functions.

It provides a very clean interface to PicoSAT, a popular SAT solver.

The first realization I made was that PyEDA did not support integer values.  With the help of the author of PyEDA, the following solution was devised.

You can specify integers using function arrays :

# PyEDA imports used by the methods below
from pyeda.inter import exprvars, int2exprs, Equal
from pyeda.boolalg.bfarray import farray
from pyeda.logic.addition import ripple_carry_add

class Int:
    def __init__(self, name, width=3, value=None):
        self.name = name
        self.width = width
        if value is not None:
            self.farray = int2exprs(value, self.width)  # a real value
        else:
            self.farray = exprvars(name, width)  # empty variables

A function array is nothing but an array of Boolean variables.
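Conceptually, the value-to-farray conversion is just bit-blasting an integer, least-significant bit first.  A plain-Python sketch of the idea (my own helper, not part of PyEDA):

```python
def int_to_bits(value, width):
    """Return the bits of `value` as a list, LSB first,
    zero-padded/truncated to `width` entries."""
    return [(value >> i) & 1 for i in range(width)]

print(int_to_bits(5, 3))  # [1, 0, 1], i.e. binary 101
```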

The next step is to add operators.

    def __eq__(self, other):
        name = "{}eq{}".format(self.name, other.name)
        f = Int(name, self.width)
        f.farray = farray([Equal(i, j) for i, j in zip(self.farray, other.farray)])
        return f

    def __add__(self, other):
        sum, carry = ripple_carry_add(self.farray, other.farray)
        name = "{}add{}".format(self.name, other.name)
        f = Int(name, self.width)
        f.farray = farray([i & ~(carry[-1] & carry[-2]) for i in sum])  # this is completely incorrect, need to fix overflow issue
        return f

A wrapper to the SAT solve function call and a __str__ method :

    def satisfy_all(self):
        return self.farray.uand().satisfy_all()

    def __str__(self):
        return self.name + " : " + str(self.farray)

Finally, we need a way to convert the result from the SAT solver back to our Int class.


    def from_soln(self, soln):
        value = 0
        for boolvar in sorted(soln.keys(), reverse=True):
            tmp = str(boolvar)
            if tmp.startswith(self.name):
                # variable names look like "A[2]"; tmp[-2] is the bit
                # index (single digit, fine for these small widths)
                value = value | (int(soln[boolvar]) * pow(2, int(tmp[-2])))
        return value
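Going the other way, the reconstruction in from_soln is just value = sum of bit_i * 2^i over the assigned bits.  In isolation (my own helper, not PyEDA code):

```python
def bits_to_int(bits):
    """Rebuild an integer from its bits, LSB first."""
    return sum(bit << i for i, bit in enumerate(bits))

print(bits_to_int([1, 0, 1]))  # 5 (binary 101)
```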

I'll conclude this knowledge share post with 3 simple examples :

1. A == B

This works very well.

>>> a = Int('A', 3)
>>> b = Int('B', 3)
>>> f = (a == b)
>>> for soln in f.satisfy_all():
>>>     print("a="+str(a.from_soln(soln))+" b="+str(b.from_soln(soln)))
a=0 b=0
a=4 b=4
a=2 b=2
a=6 b=6
a=1 b=1
a=5 b=5
a=3 b=3
a=7 b=7

2. F == (A + B)

This works well, but there is an overflow bug that I have chosen not to fix.

>>> a = Int('A', 3)
>>> b = Int('B', 3)
>>> f = Int('F', 3)
>>> g = (f == (a + b))
>>> for soln in g.satisfy_all():
>>>     print("a="+str(a.from_soln(soln))+" b="+str(b.from_soln(soln))+" f="+str(f.from_soln(soln)))

a=0 b=0 f=0
a=0 b=4 f=4
a=0 b=2 f=2
a=0 b=6 f=6
a=0 b=1 f=1
[..]
a=2 b=6 f=0 <-- overflow bug

3. F == (A + B), F == 3

The same overflow bug happens here, but I also identified a more serious issue.  Since F is held constant, the SAT solver does not return its variables.

a = Int('A', 3)
b = Int('B', 3)
f = Int('F', 3, 3)
g = (f == (a + b))
for soln in g.satisfy_all():
    print("a="+str(a.from_soln(soln))+" b="+str(b.from_soln(soln))+" f="+str(f.from_soln(soln)))


a=0 b=3 f=0 <-- f is not returned by SAT solver
a=2 b=1 f=0
a=1 b=2 f=0
a=4 b=7 f=0 <-- overflow bug
[..]

This concludes my experiment with PyEDA.  It was a worthwhile and interesting experiment, but it's time to move on.

As far as "poor-man's" constraint solver goes, this doesn't come close.  I would point interested individuals to the CRAVE project (or, commercial tools!).

Wednesday, January 21, 2015

Wednesday Night Hack #9 - Generation of constrained random stimulus using python-constraint with bit-level approach

One of the issues I faced in my last experiment with constraint programming for random stimulus generation was that the constraint solver library only accepted input variables with finite domains.  As it turns out, all available constraint solvers share this same constraint.  The state space explodes when specifying 16-bit variables, and it became difficult to solve even a simple constraint.  For the purpose of stimulus generation, it is absolutely necessary to randomize large payloads; 128 or 256 bit wide buses are not uncommon.  One approach is to model one binary (two-valued) variable for each bit of the random variable.  As it turns out, this works well.  If I am not mistaken, this is commonly called a bit-level approach.

On some other related notes :

I ordered this book on constraint based verification.  I suspect it will be heavy on the computer science, but for $17.99 I couldn't pass it up.  I will give it a quick read.

Also, I stumbled on this web site: http://www.personalkanban.com/ - be forewarned, it's a bit new-agey, with concepts similar to "GTD".  Well worth a read-through if you are interested in personal growth.  Quoting: "Personal Kanban gives us clarity in our work and our lives by visualizing those tasks, expectations, and commitments we have and helping us prioritize and complete.  With only two simple rules, visualize your work and limit your work in progress."

Without further ado, here is the source code to tonight's experiment.  Enjoy!


#!/usr/bin/env python2.7

from constraint import *
from random import seed


class Solver:
    def __init__(self):
        self.solver = Problem()
        self.solver.setSolver(MinConflictsSolver(100))

    def add_variable(self, name, length):
        # Each bit of input variable is in binary domain, internal format is name[bit position]
        variables = ['{}[{}]'.format(name, i) for i in range(length)]
        for variable in variables:
            self.solver.addVariable(variable, [0, 1])

    def add_constraint(self, *args):
        self.solver.addConstraint(*args)

    def randomize(self):
        self.solution = self.solver.getSolution()

    def get_variable(self, name):
        # generate decimal representation from internal bitwise representation
        remap = 0
        for variable in self.solution:
            if name in variable:
                index = variable.replace('[', '_').replace(']', '').split('_')[1]
                remap += 2**int(index) * int(self.solution[variable])
        return remap

#seed(1)  # fix random seed, if needed

solver = Solver()  # init solver object

solver.add_variable('xx', 32)  # rand bit [31:0] xx
solver.add_variable('yy', 15)  # rand bit [14:0] yy

# x is EVEN
solver.add_constraint(lambda x: x == 0, ['xx[0]'])  # x is EVEN
# y is ODD
solver.add_constraint(lambda x: x == 1, ['yy[0]'])  # y is ODD

# solve the problem
solver.randomize()

# return random solutions for variable xx and yy
print(solver.get_variable('xx'))
print(solver.get_variable('yy'))

Saturday, January 3, 2015

Mako templates for Verilog, now released

I cleaned up and published code I presented earlier this year on the use of Mako templates for Verilog.

Here is a link to the repository: https://bitbucket.org/ecote/vmako/overview

Update to SimpleMIPS to use elftools, improved tracing, and new instructions

In the spirit of the new year, I am publishing some code that I have laying around on my workstation.

SimpleMIPS is an existing Python-based instruction set simulator for MIPS.  I wrote some code to replace its roll-your-own ELF parser with the elftools library, to add simple instruction trace generation, and to add support for a handful of new instructions.

Here is a link to the repository : https://bitbucket.org/ecote/miss/overview

Wednesday, November 12, 2014

Wednesday Night Hack #8 - Mako for Verilog, Moving on ...

Using the Mako template language for hardware design was an interesting experiment.  It was definitely a fun hack.  I am turning my attention away from this pet project to a new tool developed by engineers at Berkeley: Chisel.

Here is a link to find out more: https://chisel.eecs.berkeley.edu/

Wednesday, October 29, 2014

Wednesday Night Hack #7 - More Experiments with Mako for Verilog

This is turning out to be a very fun experiment.  In about 150 lines of Python code, I have put together a very nice tool for Verilog code generation.  If there is interest in the source code, please send me an e-mail.

On an unrelated note, I recently stumbled upon the following website: http://regex101.com/#python.  It is a wonderfully written online regex tester, 5/5 stars and highly recommended.

Here is the template code :

top.v.mako :

<%! from vmako import connect, ev,instance,log2 %>

module top;
bit clk = 0;

## instantiate 2x 'w'-bit wide barrel shifters
% for i in range(0, 2):
## define local variable inst
<% inst = 'u{}_bshf_{}'.format(i,w) %>
${instance('$HOME/raid/vmako/templates/bshift.v.mako',
           inst,
           w=w
          )}
## connect dest <- src
${connect((inst, 'clk'), ('', 'clk'))}
% endfor

## example connect statement between modules
${connect(('u1_bshf_'+str(w), 'in'), ('u0_bshf_'+str(w), 'out'))}

endmodule

bshift.v.mako :

<%! from vmako import connect,ev,instance,log2 %>

module bshift_${w} (
  input clk,
  input [${ev(w-1)}:0] in,
  input [${ev(log2(w)-1)}:0] cnt,
  output reg [${ev(w-1)}:0] out
);

% for i in range(1, w):
wire [${ev(w-1)}:0] out_${i} = {in[${ev(w-i-1)}:0], in[${ev(w-1)}:${ev(w-i)}]};
% endfor

always @(*)
  case (cnt)
% for i in range(1, w):
    ${i}: out = out_${i};
% endfor
    default: out = in;
  endcase

${instance('$HOME/raid/vmako/templates/dff.v.mako',
           "u_dff",
           w=w
          )}
endmodule

dff.v.mako :

<%! from vmako import ev,log2 %>

module dff_${w} (
  input clk,
  input [${ev(w-1)}:0] d,
  output reg [${ev(w-1)}:0] q
);

always @(posedge clk)
  q <= d;

endmodule

Here is the generated code :

top.v :

module top;

bit clk = 0;

// ---- instance:u0_bshf_4 [begin] ----
wire u0_bshf_4_clk;
wire [3:0] u0_bshf_4_in;
wire [1:0] u0_bshf_4_cnt;
wire [3:0] u0_bshf_4_out;

bshift_4 u0_bshf_4 (
 .clk(u0_bshf_4_clk)
,.in(u0_bshf_4_in)
,.cnt(u0_bshf_4_cnt)
,.out(u0_bshf_4_out)
);
// ---- instance:u0_bshf_4 [end] ----

// connect:dest=('u0_bshf_4', 'clk') src=('', 'clk')
assign u0_bshf_4_clk = clk;

// ---- instance:u1_bshf_4 [begin] ----
wire u1_bshf_4_clk;
wire [3:0] u1_bshf_4_in;
wire [1:0] u1_bshf_4_cnt;
wire [3:0] u1_bshf_4_out;

bshift_4 u1_bshf_4 (
 .clk(u1_bshf_4_clk)
,.in(u1_bshf_4_in)
,.cnt(u1_bshf_4_cnt)
,.out(u1_bshf_4_out)
);
// ---- instance:u1_bshf_4 [end] ----

// connect:dest=('u1_bshf_4', 'clk') src=('', 'clk')
assign u1_bshf_4_clk = clk;

// connect:dest=('u1_bshf_4', 'in') src=('u0_bshf_4', 'out')
assign u1_bshf_4_in = u0_bshf_4_out;

endmodule

bshift.v :

module bshift_4 (
  input clk,
  input [3:0] in,
  input [1:0] cnt,
  output reg [3:0] out
);

wire [3:0] out_1 = {in[2:0], in[3:3]};
wire [3:0] out_2 = {in[1:0], in[3:2]};
wire [3:0] out_3 = {in[0:0], in[3:1]};

always @(*)
  case (cnt)
    1: out = out_1;
    2: out = out_2;
    3: out = out_3;
    default: out = in;
  endcase

// ---- instance:u_dff [begin] ----
wire u_dff_clk;
wire [3:0] u_dff_d;
wire [3:0] u_dff_q;

dff_4 u_dff (
 .clk(u_dff_clk)
,.d(u_dff_d)
,.q(u_dff_q)
);
// ---- instance:u_dff [end] ----

endmodule

dff.v :

module dff_4 (
  input clk,
  input [3:0] d,
  output reg [3:0] q
);

always @(posedge clk)
  q <= d;

endmodule

Wednesday, October 15, 2014

Wednesday Night Hack #6 - Verilog Code Generation using Python and Mako

I spent some time this evening playing with Python and Mako for the purposes of Verilog code generation.  The combination of tools is pretty sweet if you ask me.  Here is a worked out example for a parameterizable barrel shifter.

Mako template:

<%!
import math
def log2(x): return int(math.log(x, 2))
def e(x): return eval(str(x))
%>
module bshift_w${w} (
  input [${e(w-1)}:0] in,
  input [${e(log2(w)-1)}:0] shift,
  output reg [${e(w-1)}:0] out
);

% for i in range(1, w):

  wire [${e(w-1)}:0] out_${i} = {in[${e(w-i-1)}:0], in[${e(w-1)}:${e(w-i)}]};
% endfor

  always @* begin

    out = in;
    case (shift)
% for i in range(1, w):
      ${i}: out = out_${i};
% endfor
    endcase
  end

endmodule


Verilog output (w=4):

module bshift_w4 (
  input [3:0] in,
  input [1:0] shift,
  output reg [3:0] out
);

  wire [3:0] out_1 = {in[2:0], in[3:3]};
  wire [3:0] out_2 = {in[1:0], in[3:2]};
  wire [3:0] out_3 = {in[0:0], in[3:1]};

  always @* begin
    out = in;
    case (shift)
      1: out = out_1;
      2: out = out_2;
      3: out = out_3;
    endcase
  end

endmodule

Wednesday, September 17, 2014

Wednesday Night Hack #5 - Update

Life happened.  We bought a house in Silicon Valley.  Work ramped up; code freeze and verification complete of the next generation Adreno GPU are on the horizon (I am being careful to stay intentionally vague here).  I thoroughly enjoyed the end of summer by camping and even attended my first personal development workshop weekend.

Last week, I checked in some code to my Git repository https://bitbucket.org/ecote.  I released an updated version of my build and run flow (barf) and associated workspace generator (wsgen).  An example BARF script has also been released: analyze.barf.  At the moment, I am not motivated to document the operation of the code or describe any use cases.  The main reason is that I do not know the right forum for publishing the work (blog, conference paper/presentation, YouTube, etc.).  In any case, these small pet projects are about having fun, no rush ...

Wednesday, June 18, 2014

BARF on BitBucket

I created a Git repository on bitbucket.org for BARF.  I've committed the initial version of the flow and an example script.  Please contact me if there are questions.  More later.

https://bitbucket.org/ecote/barf

Thursday, June 12, 2014

Wednesday Night Hack #3 - BARF is done

I am done working on my Python-based build and run flow.

While doing this hack, I took a small detour and evaluated SCons for use with EDA tools.  I was successful in understanding the declarative nature of the tool and even wrote some custom builders; however, it quickly became clear that I was trying to fit a square peg into a round hole, and I abandoned the effort.  I was hoping that I could adopt a tried-and-tested solution and would love to hear from others who have been successful, if any.

The feature list for BARF is limited.  It has the ability to collect groups of files into components and provides a wrapper to execute commands.

Each component is specified using a YAML file.  The component contains a list of files, options, and requires.  Requires indicate parent-child relationships between components.  For example, a block may require a ram component, or the chip-level wrapper may require all block components.

name: led
files: [led.v]
options: []
requires: [ram]
I will point out that the use of YAML is a reversal from the previous blog post, where I experimented with making the mechanism for declaring a component a Python script itself.

I kept the API for executing commands extremely straightforward.  Each custom job inherits from a base class.  Since each job is modeled as a Python object, artefacts from each job can be collected by the specialized class, and the collected data can easily be passed from one job to another.

I am pasting the relevant source code below.  Maybe next week, I'll set up a GitHub account.

class Barf(object):
    """ Build and Run Flow """

    def post_order(self, node):
        """ Recursive post-order tree traversal """
        if not node:
            return
        for child_name in node['requires']:
            child_node = self.comp[child_name]
            self.post_order(child_node)
        if node['visited'] == 0:
            self.flist_obj.append(node)
            node['visited'] = 1

    def load_comps(self, top_node):
        """ Load components from yaml files """
        self.comp = {}

        for root,dirs,files in os.walk(os.environ.get('WS')):
            for file in files:
                if file == "comp.yml":
                    full_path = os.path.join(root, file)
                    stream = yaml.safe_load(open(full_path))  # safe_load avoids arbitrary object construction

                    # use list comprehension (!)
                    stream['files'] = [ root+'/'+x for x in stream['files'] ]

                    name = stream['name']
                    self.comp[name] = {}
                    self.comp[name]['files'] = stream['files']
                    self.comp[name]['options'] = stream['options']
                    self.comp[name]['requires'] = stream['requires']
                    self.comp[name]['visited'] = 0

        self.flist_obj = []
        self.post_order(self.comp[top_node])
  
class Job(object):
    """ Base class for job object """

    def exec_cmd(self,cmd,wdir=os.environ.get('WSTMP')):
        """ Execute shell command """
        p = subprocess.Popen('cd {0} && {1}'.format(wdir,cmd),stdout=subprocess.PIPE,shell=True)
        (stdout, stderr) = p.communicate()
        if p.returncode != 0: raise Exception("Command {0} failed ".format(cmd))
        return (stdout, stderr)

    def cyg_to_win_path(self,cyg_path):
        """ Convert cygwin path to windows path """
        p = subprocess.Popen('cygpath -w '+cyg_path,stdout=subprocess.PIPE,shell=True)
        return '"'+p.communicate()[0].rstrip()+'"' #--HACK: cygwin

class CleanTmp(Job):
    def execute(self,lib_name='work'):
        self.exec_cmd('rm -rf {0}/*'.format(os.environ.get('WSTMP')))

class RunVlib(Job):
    def execute(self,lib_name='work'):
        self.exec_cmd('vlib {0}'.format(lib_name))

class RunVlog(Job):
    def execute(self,flist_obj):
        files = []
        for obj in flist_obj:
            files +=  obj['files']
        files = [ self.cyg_to_win_path(x) for x in files ]

        options = []
        for obj in flist_obj:
            options +=  obj['options']

        self.exec_cmd('vlog -sv2k5 {0} {1}'.format(' '.join(files),
                                                   ' '.join(options)))

Wednesday, May 28, 2014

Wednesday Night Hack #1

This week, I began the development of a basic front-end tool to wrap the functionality of the CAD tools bundled with my LatticeECP3 FPGA development board.  The packaged tools are Active-HDL, Synplify Pro, and Lattice Diamond (the FPGA implementation tool).  Active-HDL, by all accounts, has a pretty solid GUI, but my preference is to maintain control from the command line.

Rather than implement a custom Makefile library or a traditional Perl script, I chose to develop an API using Python that a user could use to build up their own flow apps.  The idea is to replace configuration files or application-specific dynamic scripting languages with an actual scripting language.  This approach scales better over time.

The first step to this process is project management. We want the ability to manage groups of files.  I call these groups of files components.  Examples of components include RTL unit (Verilog module and its sub modules), UVC (Universal Verification Component), design IP library (RAM library, FPGA megafunctions), etc.

We also want the ability to establish "depends on" relationships between components.  Here is an example component definition written in Python:

c.set_name('top')
c.add_file('top.v')
c.add_require('led')
c.add_require('ram')

For a simple project consisting of three components (top, RAM, and LED), where top depends on RAM and LED, and LED also depends on RAM, the flist would need to resemble the following.  The script below is able to produce this.  In fact, below is its actual output.

# Component: ram
/cygdrive/d/Projects/system/rtl/ram/ram.v

# Component: led
# Requires: ram
/cygdrive/d/Projects/system/rtl/led/led.v

# Component: top
# Requires: led,ram
/cygdrive/d/Projects/system/rtl/top/top.v

I achieve this functionality by implementing two classes in Python: Component and Go.  Here are some code snippets.  First, I build a tree data structure in the process_requires function.  Then, I traverse the tree in the get_flist function using a simple recursive algorithm.

Here is a snippet of the source code.  There's no error checking, so YMMV.

class Component:
    """ Exposed to user """
    def add_file(self, file):
        self.files.append(self.root_dir+'/'+file)

    """ Exposed to user """
    def add_option(self, option):
        self.options.append(option)

    """ Exposed to user """
    def add_require(self, require):
        self.requires.append(require)

    def get_flist(self):
        flist = '#' * 80 + '\n' # 80 character comment line
        flist +=  '# Component: ' + self.get_name() + '\n'
        if self.requires:
            flist +=  '# Requires: '+','.join(self.requires)+'\n'
        if self.files:
            flist +=  '\n'.join(self.files) + '\n'
        if self.options:
            flist +=  '\n'.join(self.options) + '\n'
        return flist

class Go:
    """ Load all component under root_dir """
    def load_comps(self):
        for root,dirs,files in os.walk(self.root_dir):
            for f in files:
                if f == 'comp.py':
                    c = Component()
                    c.set_root_dir(root)
                    execfile(root+'/'+f)
                    self.component[c.get_name()] = c

    def process_requires(self,node):
        for require in node.requires:
            child_node = self.component[require]
            self.process_requires(child_node)
            node.child_nodes.append(child_node)

    def init(self):
        self.load_comps()
        self.process_requires(self.get_top_node())

    def get_flist(self):
        self.visited = {}
        self.do_get_flist(self.get_top_node())

    """ Traverse tree using recursive post-order search """
    def do_get_flist(self,node):
        for child in node.child_nodes:
            if not child.get_name() in self.visited:
                self.do_get_flist(child)
        if not node.get_name() in self.visited:
            print(node.get_flist())
            self.visited[node.get_name()] = 1  # mark visited
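The snippets above are not runnable on their own.  As a self-contained sketch of the same post-order idea (with hypothetical component names), a requires graph flattens into a dependency-first list like so:

```python
def flatten(requires, top):
    """Post-order DFS: emit each component after its requirements,
    visiting each node at most once."""
    order, visited = [], set()

    def visit(name):
        for child in requires.get(name, []):
            if child not in visited:
                visit(child)
        if name not in visited:
            order.append(name)
            visited.add(name)

    visit(top)
    return order

requires = {'top': ['led', 'ram'], 'led': ['ram']}
print(flatten(requires, 'top'))  # ['ram', 'led', 'top']
```

This reproduces the ram, led, top ordering shown in the flist output above: every component appears after everything it requires.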

Tuesday, December 27, 2011

Vim Setup for Python

I have used a number of modern IDEs for software development.  For example, I think it is neat how one can draw up UML diagrams and generate code from them (or vice versa).  Integration with the more complex revision control systems is also nice.  What I dislike is vendor lock-in and the loss of control over the development flow.  I could say more about this, but it is a different topic than what I'd like to cover today.

I like the Vim editor because of its point-tool nature.  It does its job and it does it well.  It is well supported by the community, it is available on virtually every Linux installation (and Windows), and it is extremely customizable.

A number of blog posts have been written describing settings for configuring Vim to edit Python files.  This will be yet another.  The default configuration for Python in Vim leaves much to be desired.  Fortunately, there are a few easy steps one can take to enable a streamlined experience.

The first step is to make sure to enable the Vim filetype plugin using command filetype plugin indent on in your $HOME/.vimrc.  The following site has more information about why to use this plugin.  To summarize, it allows the user to move filetype-specific commands outside of their main configuration file, reducing clutter in the file.
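For reference, a minimal $HOME/.vimrc sketch covering the commands used in this post (my own layout, adapt to your existing configuration):

```vim
" enable filetype detection, ftplugin files, and indent scripts
filetype plugin indent on
" enable syntax highlighting (used by the plugin described below)
syntax on
```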


For indentation, first create a Python-specific configuration file $HOME/.vim/ftplugin/python.vim containing the following lines:

setlocal tabstop=4
setlocal softtabstop=4
setlocal shiftwidth=4
setlocal textwidth=79 " to comply with PEP-8 style guide
setlocal smarttab
setlocal expandtab

Eric McSween developed a Vim plugin for smart indentation of Python code.  The utility is available here: http://www.vim.org/scripts/script.php?script_id=974.  Installation is straightforward: simply download the latest version of the script and copy the file to $HOME/.vim/indent/python.vim.  Use of the filetype plugin, as described above, will automatically enable the utility.

Last, Dmitry Vasiliev developed a Vim plugin for Python syntax highlighting.  The utility is available here: http://www.vim.org/scripts/script.php?script_id=790.  To install the plugin, download the latest version of the script and copy the file to $HOME/.vim/syntax/python.vim.  Enable the plugin by adding the command syntax on to your $HOME/.vimrc file.

I recommend the use of a static code analyzer for anyone developing new code.  Pylint is my analysis tool of choice.  It is typically available as a binary package in most Linux distros - i.e., you may need to talk to your sysadmin.  There is some overhead to using code analysis in the short term, but in the long term, you will benefit from a code base that looks pretty and smells less.

For integration with my editor, I use the following plugin: http://www.vim.org/scripts/script.php?script_id=891.  Installation requires the user to copy the downloaded script to the $HOME/.vim/compiler directory.

Tuesday, September 13, 2011

From Perl to Python - Preliminaries

My first real job in the tech industry was for a small graphics design firm in Montreal.  I was 18 years old.  They hired me as a part-time web programmer.  The development environment that the company used at the time was a LAMP-like stack using Perl/CGI rather than PHP.  This was before the year 2000.  I will admit that I was not very experienced, so it goes without saying that I wrote a ton of ugly Perl code - but heck - it got the job done!

My experience with Perl grew over the years.  In graduate school, I wrote a handful of Perl scripts to stitch together a number of point tools that, together, made up a complete design/verification/FPGA implementation environment for the processor and SoC design that was the subject of my thesis.

As a hardware design verification engineer and as an EDA application engineer, I continued to write Perl scripts to do everything from log post-processing and checking, to regression mining, to developing complete front-end verification tools.  Perl became my #1 - it got the job done!

Inevitably, I also helped maintain a number of existing scripts that were written by other engineers.  Yes, Perl got the job done, but there were times it was at the expense of my sanity.

My experience so far has shown that while it is easy to write scripts using Perl, the result is often difficult-to-maintain code.  Engineers tend to treat the language as a simple shell script enhanced with regular expression matching.  Most understand basic data structures such as arrays and hashes, but only a select few can grasp more complex data structures (e.g. hashes of arrays), Perl references, one-liners, OOP, exception handling, CPAN modules, etc.
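To illustrate the gap: a Perl "hash of arrays" requires reference syntax, while the Python equivalent is a plain dict of lists.  A small sketch with made-up data:

```python
from collections import defaultdict

# group regression failures by testbench: a "hash of arrays" in
# Perl terms, a dict of lists in Python -- no references needed
failures = defaultdict(list)
for tb, test in [("uart", "tx_overflow"), ("uart", "rx_parity"),
                 ("dma", "burst_len")]:
    failures[tb].append(test)

print(dict(failures))
# {'uart': ['tx_overflow', 'rx_parity'], 'dma': ['burst_len']}
```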

Now, I realize that my opinion above could be considered by some as flamebait (especially because it is not well substantiated).  You can write poor code in any language, not just with Perl.  Being an experienced programmer helps, but I contend that the language itself does not do its part in encouraging the programmer to write good code.  Sure, you can train programmers to write better Perl code, but good luck with enforcing coding guidelines.

The purpose of this post, the first in a series, is to introduce an alternative programming language for tasks that are typically relegated to Perl.

I first used Python in college to write a Blender 3D export script for my undergraduate computer graphics course.  My initial impression of the language was not very favorable: the use of tabs for indenting, no semicolons at the ends of lines, etc.  I did not understand the hype.

I started to take notice of the language lately for a few reasons.  First, there was a colleague of mine at Sun Microsystems who could not stop raving about the language!  There is also the Google factor: if the crack computer scientists at Google have selected Python as the company's scripting language of choice, shouldn't I take a closer look?  Finally, the Python community is very committed to coding guidelines and style.  There are great tools available for static source code linting (pyflakes, pylint) and for enforcing the style guide (pep8).

With the above in mind, I have decided to write a few posts on the language from the point of view of an intermediate-to-advanced Perl user.  Here are some of the topics that I will cover in the future - stay tuned!
  • Vim Setup
  • Parsing Command Line Arguments
  • Program Flow and Exception Handling
  • Logging
  • File I/O
  • Regular Expressions
  • OS
  • Lists and Dictionaries (and Dictionaries of Lists)