Wednesday, June 18, 2014

BARF on BitBucket

I created a Git repository on BitBucket for BARF.  I've committed the initial version of the flow and an example script.  Please contact me if there are questions.  More later.

Wednesday Night Hack #4 - Commentary on SystemVerilog-Design Interface Example

I did not get around to synthesizing the example in Chapter 10 of Sutherland's book (refer to my previous post).  Instead, I have some comments about the published example. Most of my time tonight was spent preparing BARF for release.

To keep this post concise, I won't post snippets.  A copy of the source code can be found here.

Comment #1: Modports TopReceive and CoreReceive are duplicated in interface Utopia.  I don't understand the decision to duplicate the modports.  The SVTB interface should describe the Utopia interface.  The Utopia interface is a set of signals, and those signals can be seen either from the point of view of the receiver or the transmitter.

Comment #2: The author made an interesting decision to model the register file using interface LookupTable.  I understand what he's trying to do -- DRY, don't repeat yourself.  His control logic calls a function to read or write a memory array.  The function and memory array live in the interface rather than the module.  If you expect the synthesis tool to infer memories from RTL code, then it's a viable option.  If you need the ability to manually instantiate library cells for memory, then it's not so viable -- you cannot instantiate modules within an interface.  I am somewhat wary of the approach for the following reason.  Yes, you can start with synthesis-inferred memory arrays, but what happens when/if you need to move to custom cells?  Then the interface may not be able to serve its original purpose.

Comment #3: Why didn't the author have the main FSM use the signals in the interfaces directly, rather than drive temporary signals that are then assigned to the interface nets?

bit [0:NumTx-1] Txvalid; // FSM 'drives' this

for (TxIter=0; TxIter<NumTx; TxIter+=1) begin: GenTx
  assign Tx[TxIter].valid = Txvalid[TxIter]; // assign to ifc
end

Comment #4: Combined blocking and non-blocking assignment in rx_valid_state

This example is all kinds of confusing.  I had to turn to StackOverflow to understand what was going on, and I'm still not entirely clear how exactly that RTL code gets translated into gates.  The logic basically says to rotate the round-robin pointer until one of the receivers has a valid cell that is ready to be processed.  Then the cell is latched and the FSM advances to the next state.  It's an interesting strategy that I assume works because of SystemVerilog's always_ff.

Comment #5: What did I learn about networking?

Not much.  That ATM is a very old protocol.  That ATM has nearly a 10% "cell tax", meaning the header makes up that much of the total size of the cell.  The purpose of the design is to take in incoming ATM cells and arbitrate among receivers.  After a receiver is selected, the register file is accessed to update the to-be-transmitted cell's VPI and compute its new header error control (HEC) value.  Finally, there are two state machines that receive and transmit the cells according to the Utopia protocol.
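The cell-tax figure is easy to sanity-check with a line of arithmetic (the 53-byte cell with a 5-byte header comes from the ATM spec; the snippet is just illustration):

```python
# An ATM cell is 53 bytes, 5 of which are header -- the "cell tax".
header_bytes = 5
cell_bytes = 53
cell_tax = header_bytes / cell_bytes
print('{0:.1%}'.format(cell_tax))  # prints 9.4%
```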

Tuesday, June 17, 2014

Two New Technical Books

I have two new technical books on their way to my bookshelf:
I read through the table of contents of the first book and couldn't pass up its sub-$40 price tag (and 728 pages).  I'm especially interested in the second book.  PCB design is not a sexy field, but I've always been interested in creating something I can touch.  I've researched and found that TechShop in San Jose has some equipment that can help support that experiment.  Likely, it's a far cry from the equipment that I could get access to at work, but alas, I want to keep my day job.  I have no concrete plans for a circuit board project right now, but maybe one day ...

Thursday, June 12, 2014

Planning Ahead - SystemVerilog-Design and Networking Hardware

Chapter 10 of SystemVerilog for Design showcases the design of an Asynchronous Transfer Mode (ATM) user-to-network interface (UNI) and forwarding node.

I have no idea what that means.  Besides configuring home routers and configuring a Windows or Linux PC to access them, I have next-to-no background in low-level computer networking.

The chapter also claims to summarize the SystemVerilog-Design concepts presented in the book.

I think it will be an interesting challenge to review the design presented in the book and attempt to implement it on my FPGA device.  It will be a good refresher for SystemVerilog-Design and I will learn a little something about networking hardware design.

I am hopeful that today's (2014) bundled synthesis tools are up for the challenge.

Wednesday Night Hack #3 - BARF is done

I am done working on my Python-based build and run flow.

While doing this hack, I took a small detour and evaluated SCons for use with EDA tools.  I was successful in understanding the declarative nature of the tool and even wrote some custom builders; however, it quickly became clear that I was trying to fit a square peg into a round hole, and I abandoned the effort.  I was hoping that I could adopt a tried-and-tested solution, and I would love to hear from others who have been successful, if any.

The feature list for BARF is limited.  It has the ability to collect groups of files into components and provides a wrapper to execute commands.

Each component is specified using a YAML file.  The component contains a list of files, options, and requires.  Requires indicate parent-child relationships between components.  For example, a block may require a ram component, or the chip-level wrapper may require all block components.
name: led
files: [led.v]
options: []
requires: [ram]
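The requires lists effectively form a dependency tree.  As a minimal sketch of how such a tree resolves into a bottom-up compile order (the component names and function below are made up for illustration; this is not BARF's actual API):

```python
# Toy component table mirroring the YAML files: each entry lists the
# components it requires.
comps = {
    'ram':  [],
    'led':  ['ram'],
    'chip': ['led'],
}

def compile_order(name, comps, order=None):
    """Post-order traversal: every required child lands before its parent."""
    if order is None:
        order = []
    for child in comps[name]:
        compile_order(child, comps, order)
    if name not in order:
        order.append(name)
    return order

print(compile_order('chip', comps))  # ['ram', 'led', 'chip']
```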
I will point out that the use of YAML is a reversal from the previous blog post, where I experimented with making the mechanism to declare a component be a Python script itself.

I kept the API for executing commands extremely straightforward.  Each custom job inherits from a base class.  Since each job is modeled as a Python object, artefacts from each job can be collected by the specialized class, and the collected Python objects can easily be passed from one job to another.

I am pasting the relevant source code below.  Maybe next week, I'll set up a GitHub account.
class Barf(object):
    """ Build and Run Flow """

    def post_order(self, node):
        """ Recursive post-order tree traversal """
        if not node:
            return
        for child_name in node['requires']:
            child_node = self.comp[child_name]
            self.post_order(child_node)
        if node['visited'] == 0:
            node['visited'] = 1
            self.flist_obj.append(node)  # emit node after all of its children

    def load_comps(self, top_node):
        """ Load components from yaml files """
        self.comp = {}

        for root,dirs,files in os.walk(os.environ.get('WS')):
            for file in files:
                if file == "comp.yml":
                    full_path = os.path.join(root, file)
                    with open(full_path) as f:
                        stream = yaml.safe_load(f)

                    # use list comprehension (!)
                    stream['files'] = [ os.path.join(root, x) for x in stream['files'] ]

                    name = stream['name']
                    self.comp[name] = {}
                    self.comp[name]['files'] = stream['files']
                    self.comp[name]['options'] = stream['options']
                    self.comp[name]['requires'] = stream['requires']
                    self.comp[name]['visited'] = 0

        self.flist_obj = []

class Job(object):
    """ Base class for job object """

    def exec_cmd(self,cmd,wdir=os.environ.get('WSTMP')):
        """ Execute shell command """
        p = subprocess.Popen('cd {0} && {1}'.format(wdir,cmd),stdout=subprocess.PIPE,stderr=subprocess.PIPE,shell=True)
        (stdout, stderr) = p.communicate()
        if p.returncode != 0: raise Exception("Command {0} failed ".format(cmd))
        return (stdout, stderr)

    def cyg_to_win_path(self,cyg_path):
        """ Convert cygwin path to windows path """
        p = subprocess.Popen('cygpath -w '+cyg_path,stdout=subprocess.PIPE,shell=True)
        return '"'+p.communicate()[0].rstrip()+'"' #--HACK: cygwin

class CleanTmp(Job):
    def execute(self):
        self.exec_cmd('rm -rf {0}/*'.format(os.environ.get('WSTMP')))

class RunVlib(Job):
    def execute(self,lib_name='work'):
        self.exec_cmd('vlib {0}'.format(lib_name))

class RunVlog(Job):
    def execute(self,flist_obj):
        files = []
        for obj in flist_obj:
            files +=  obj['files']
        files = [ self.cyg_to_win_path(x) for x in files ]

        options = []
        for obj in flist_obj:
            options +=  obj['options']

        self.exec_cmd('vlog -sv2k5 {0} {1}'.format(' '.join(files),
                                                   ' '.join(options)))
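To show the artifact-passing idea without shelling out to the simulator, here is a hypothetical driver (the flist_obj contents below are made up) that assembles the vlog command line the same way RunVlog.execute does:

```python
# Made-up file-list objects of the kind load_comps collects.
flist_obj = [
    {'files': ['ram.v'], 'options': []},
    {'files': ['led.v'], 'options': ['+define+FPGA']},
]

def build_vlog_cmd(flist_obj):
    """Flatten files and options across components into one command line."""
    files, options = [], []
    for obj in flist_obj:
        files += obj['files']
        options += obj['options']
    return 'vlog -sv2k5 {0} {1}'.format(' '.join(files), ' '.join(options))

print(build_vlog_cmd(flist_obj))  # vlog -sv2k5 ram.v led.v +define+FPGA
```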

Wednesday, June 4, 2014

Wednesday Night Hack #2 - BARF

Tonight, I continued to work on the front-end tool described in last week's blog post.  I found a clever name for the script: barf (build and run flow).  It's too late in the evening to post details, so I will be brief.  Besides the name change, I implemented an object-oriented build pipeline.  The user can describe the stages of their build and run flow using Python objects.  The execution of the objects is controlled by a centralized class.  While it may be outside the scope of this (yet-to-be-determined) effort, this would allow the jobs to be executed in parallel.  I also added a mechanism for job stages to pass information to one another.  An example use case is for the synthesis step to pass an object to the place-and-route step.  Finally, I created two build stages: the first creates an Active-HDL work library (vlib) and the second compiles Verilog files (vlog).
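The centralized-controller idea can be sketched in a few lines (the Pipeline class and the toy stages below are illustrations I made up, not the actual implementation):

```python
class Pipeline(object):
    """Runs job stages in order, feeding each stage the previous result."""
    def __init__(self, stages):
        self.stages = stages

    def run(self):
        artifact = None
        for stage in self.stages:
            artifact = stage(artifact)  # each stage receives the prior stage's output
        return artifact

# Toy stages: 'synthesis' hands a netlist name to 'place and route'.
synth = lambda _: 'design.netlist'
pnr = lambda netlist: netlist + '.routed'
print(Pipeline([synth, pnr]).run())  # design.netlist.routed
```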