[Python-au] my resume

Jack Andrews effbiae at gmail.com
Tue Aug 7 06:09:04 UTC 2007


Dear python-au,

My resume is appended.  I'm available for Python work immediately, or
in 3 weeks' time (I have the option of a 2-week contract starting at
the end of the week).


Jack.




Name:   Jack F Andrews
Email:  jack at ivorykite.com
Date:   2007 07 24
DOB:    1974 03 21
Street: 4/1 Ashmore Ave
City:   Mordialloc 3195
Mobile: 0408 300 591
Phone:  03 9588 1624



                      Jack Andrews
                      ------------


Computing has been my hobby for most of my life, and I have
made it my profession for a decade.

After graduating, I started out debugging a C++/MFC ETL tool
and, after a year, went on to lead a team of four in new
product development.  The new product was an ETL tool for
STR to load data into their next-generation DBMS.  The
application was based around JDBC: we developed a JDBC
driver for many text file formats as well as a driver for
the new DBMS.  Jython (Python in Java) played a key role in
the prototyping and in the final application.  This new ETL
tool used MFC/C++ for the front end and Java for the back
end.  I used C/C++ and Java to join STR's new DBMS to a JDBC
interface.

Later, I travelled the world in pre- and post-sales roles,
performing ETL for potential customers in the government
sector, such as national statistics bureaus, from sources as
diverse as Oracle, SAS, Sybase, SQL Server and CSV and
fixed-length text files.  On top of complex ETL tasks, I
implemented OLAP reports and the ad-hoc queries that the
analytical DBMS made possible.  One interesting client was
the FBI: I sanitized their data and showed them some
business intelligence queries that could improve their
efficiency.  I used SAS, Perl, C/C++, Python and our newly
built ETL tool for the majority of my work.

For a change, I worked briefly at Ericsson until the dot-com
crash.  Here I optimized an HTTP routing system by moving
the routing from user space into the kernel with a kernel
module written in C.

Then it was back to STR for more OLAP support work.  Later,
I researched alternative technologies for STR's next steps
using CORBA, C++ and Python.

In a more independent role, I wrote a system for Bosch for
their fuel injection testing rigs and, most recently, I
contracted to Intrepid Travel to audit their software
development -- checking the health of a development project
that had gone bad.

I then went to Boeing as a data migration developer, where
I have spent the past year implementing Maximo.

I enjoy leading small teams and I use objective measures to
set goals and predict outcomes.


Skills:      Developer(9y),Team Lead(2y),
             ETL-OLAP-BI-Reports(6y),Pre-/Post-sales(2y),
             Auditor(2m)

Languages:   Python(6y),C/C++(5y),Java(3y),XML(1y),
             PL/SQL(3y),HTML-Javascript-CSS(2y),
             sh:bash-ksh(5y),Perl(1y),SAS(1y),Erlang(6m),
             Lisp(home),K(6m),J(home)

OS:          *n?x:Linux(7y)-Solaris(3y)-AIX(3y),
             Windows(9y)

Libs:        sockets(2y),CORBA(3y),regex(5y),MFC(2y),
             COM(6m),JDBC(4y),ODBC(4y),CGI(3y),
             SQLObjects(1m),WebKit(1m)

Tools:       RDBMS[Oracle,Sybase,SQL Server](5y),
             Visual C++(4y),Access(2y),Excel(5y),
             [vi,gcc,make,gdb,CVS,ddd,emacs](5y),
             [LAMP,MySQL,PHP,Python](1y),TOAD(1y)

Techniques:  ETL-OLAP-BI-Reports(6y),
             Multithreading/Distributed/Concurrent(2y),
             Rapid/Agile Application Development(6y)


Period:   Aug 2006 -
Summary:  Maximo Data Conversion and Integration Developer

Boeing, Port Melbourne

I am contracted to Boeing to migrate data from their existing
asset management systems to Maximo, an IBM product.
The role is customer focused: identifying all their data
and processes, and the cleansing and transforming actions
they consider most appropriate.  The role is made more
challenging by there being two sites using six systems; we
have had to guide all the users towards a common view of
their different systems.

I have developed reports in SQL and Actuate and integrated
systems using MEA and XML over HTTP.

The main tools have been Oracle PL/SQL (with TOAD), shell
scripts, Excel, Access, Actuate and Python.
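
As an illustration of the XML-over-HTTP integration pattern,
here is a minimal sketch; the endpoint URL, element names and
asset fields are invented for the example and are not Maximo's
actual MEA schema:

    # Sketch only: build a small XML message and POST it over HTTP,
    # in the style of the MEA integrations described above.  The URL,
    # element names and field values are all hypothetical.
    import xml.etree.ElementTree as ET

    try:                                         # Python 3
        from urllib.request import Request, urlopen
    except ImportError:                          # Python 2
        from urllib2 import Request, urlopen

    def build_asset_xml(asset_num, description, site):
        root = ET.Element('SyncAsset')           # hypothetical root element
        asset = ET.SubElement(root, 'Asset')
        ET.SubElement(asset, 'AssetNum').text = asset_num
        ET.SubElement(asset, 'Description').text = description
        ET.SubElement(asset, 'SiteId').text = site
        return ET.tostring(root)

    def post_xml(url, payload):
        req = Request(url, data=payload,
                      headers={'Content-Type': 'text/xml'})
        return urlopen(req).read()

    if __name__ == '__main__':
        doc = build_asset_xml('A-1001', 'Hydraulic pump', 'PORTMELB')
        print(doc.decode('utf-8'))
        # post_xml('http://example.com/mea/sync', doc)  # endpoint is invented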

Keypoints

  - Maximo migration using Oracle PL/SQL and Python

  - System Integration using Maximo's MEA and XML over HTTP

  - Extracting reporting requirements from the customers
    and developing reports using SQL and Actuate


Period:   Jul 2006 - Aug 2006
Summary:  Consultant Programmer and Auditor

Intrepid Travel, Fitzroy

Originally, Intrepid wanted a LAMP+Python programmer to
help bring a project to conclusion.  On the same day I was
interviewed, the project was suspended after missing
milestone after milestone.  I proposed that I be contracted
to assess the health of the development and the team and to
report my findings.  The report was not favourable,
primarily because the project had accumulated 17MB of source
code (that's the size of the text of four Bibles) for a
straightforward forms-based web application.

Keypoints

  - Worked with LAMP + Python

  - Assessed how well the intranet and internet developments
    met the business's needs.

  - Acquired case studies of how the development team
    interfaced with the business

  - Collected metrics: LOC over time and its relationship to
    milestones; bugs found and fixed over time; enhancements;
    and code profiles to identify efficiency problems (a
    small sketch of the LOC count follows this list)

  - Reported weekly to upper management and spoke daily with
    the project manager to form a picture of the problems
    and their solutions.

  - Maintained a working relationship with team members
    while giving the project a negative assessment.
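
A minimal sketch of the LOC metric mentioned above; the source
tree path and the file extensions counted are assumptions:

    # Sketch only: count non-blank lines of code under a source tree,
    # broken down by file extension.  Path and extensions are assumptions.
    import os

    def count_loc(root, extensions=('.py', '.php', '.js', '.html')):
        totals = {}
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                ext = os.path.splitext(name)[1].lower()
                if ext not in extensions:
                    continue
                with open(os.path.join(dirpath, name)) as f:
                    loc = sum(1 for line in f if line.strip())
                totals[ext] = totals.get(ext, 0) + loc
        return totals

    if __name__ == '__main__':
        for ext, loc in sorted(count_loc('.').items()):
            print('%-6s %8d' % (ext, loc))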



Period:   Jan 2005 - Apr 2006
Summary:  Consultant Developer (as IvoryKite)

My work as IvoryKite predominantly involved working by
myself with a single contact point into the client's
business.

Bosch.  Stuttgart, Germany

My brother was working at Bosch in Stuttgart, and he told me
about the work his laboratory was doing, which sounded
repetitive.  I was able to automate his work by interfacing
with their oscilloscopes, analysing the data to isolate
'points of interest' and charting the results.

 See http://ivorykite.com/ for an example chart.

Keypoints

  - I developed the solutions in Python on Linux and
    deployed them on Windows.  The publication-quality
    charts and graphs were made with matplotlib,
    optimisation of the analysis process was written in
    C/C++, and communication with the 'scope was done via
    RS232 and pySerial.  The GUI was developed in wxWindows.
    (A small sketch of the 'points of interest' step follows
    this list.)

  - Most communication was through email which provided the
    development documentation for the project.  Most
    features were implemented within a few days of them
    being requested.
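
A minimal sketch of the 'points of interest' step mentioned in
the first keypoint: find where a captured trace crosses a
threshold and mark those points on a chart.  The trace here is
synthetic and the threshold is an assumption; matplotlib is
required.

    # Sketch only: isolate threshold crossings in a sampled trace and
    # chart them with matplotlib.  The trace below is synthetic.
    import math
    import matplotlib.pyplot as plt

    def points_of_interest(samples, threshold):
        # Indices where the trace crosses the threshold upwards.
        return [i for i in range(1, len(samples))
                if samples[i - 1] < threshold <= samples[i]]

    # A made-up trace standing in for oscilloscope data.
    trace = [math.sin(i / 20.0) * math.exp(-i / 400.0) for i in range(1000)]
    crossings = points_of_interest(trace, 0.5)

    plt.plot(trace, label='trace')
    for i in crossings:
        plt.axvline(i, color='r', linestyle='--')
    plt.legend()
    plt.savefig('trace.png')   # publication-style output, as in the real tool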


Space-Time Research, Camberwell.

I have worked at STR in different roles, and I continued my
good relationship with the company in a consulting research
capacity.

Keypoints

  - I worked mostly at STR's Melbourne office and reported
    to the General Manager, Development.

  - The major goal was to adapt STR's OLAP CORBA interface
    to ODBC (talking to Oracle, SQL Server, MS Access,
    Sybase) to prove the concept.  Prior to this work, STR's
    analysis UIs only talked to STR's proprietary OLAP
    engine.

  - I undertook the work on Linux using omniORB, C++ and
    Python with omniORBPy and mxODBC.  The prototype
    operated as a CORBA server with the same interface as
    STR's OLAP server.

  - The majority of the work was translating the UI requests
    into SQL.

  - The prototype succeeded in proving the concept.  Many
    functions of STR's software were supported, with the
    remaining functions left for future development.


Period:   Oct 2002 - Dec 2004
Summary:  Timeout

Having worked hard, I decided it was time to take a break.
I also had some health problems, and being relaxed and away
from work helped my recovery.  The time off gave me more
time to tinker with computing and to renovate our house.


Period:   Jun 2001 - Oct 2002
Position: ETL & OLAP Consultant, Space Time Research (STR)
Summary:

STR is a company consisting mostly of programmers.  Its
product suite is built around a bespoke analytical database
system which is sold around the world.  The main customer
base is statistics agencies in Europe, North America and
A/NZ.

I travelled to the US frequently to support STR's product,
performing ETL and OLAP at the Department of Commerce's
Bureau of the Census.  I also did pre-sales work in Canada
and Poland, and post-sales work in Canberra for government
clients such as the Australian Taxation Office.  Here, I
taught myself Perl.

I optimised STR's analytical database server for the RS/6000
on AIX.

I investigated new directions for the product suite,
promoting a replacement for STR's proprietary database
system.



Period:   Feb 2001 - May 2001
Position: Analyst/Programmer: Ericsson (Lodbroker)
Summary:

Lodbroker was a spin-off company from Ericsson.  The product
was a proxy-like load-balancing web server.  I explored a
way to make it more efficient by means of a kernel module.

I experienced the dot-com crash here.



Period:   Jan 1999 - Nov 2000
Position: Team Lead and Senior Engineer, OLAP and ETL: STR
Summary:

Led a team of four to develop a new ETL and cleansing tool.
This was part of the next-generation product suite.

Supported the product in Europe and US by loading customer
data into STR's database using Python, Java and our new ETL
tool.

I helped the developers of the database server to get their
code compiling with AIX's xlC compiler.  Eventually, this
required close cooperation with the IBM compiler development
group to debug their C++ compiler and idebug debugger.




Period:   Sep 1997 - Jan 1999
Position: Engineer: Space Time Research
Summary:

Supported and maintained an ETL tool for STR's database
software (Win32 C++).




Period:  - Sep 1997
Degree:  BSc: Melbourne University
Summary:

Graduated from the University of Melbourne in Computer
Science and Maths with an 80% average.

Period:  1996
Summary: Moved to Melbourne to study

Period:  1995
Summary: Studied science at the University of Tasmania.

Period:   Mar 1994 - Dec 1994
Summary:

Working holiday in Canada.  Started the year temping,
finished the year developing a forms-based app for an
importer/shipping company.


Web presence:

   http://www.google.com.au/search?q=effbiae
                   (my login name is effbiae)

   http://ivorykite.com/



           "Only short programs have any
                 hope of being correct."
                        (Arthur Whitney)


Appendix - Python Skills

I was first exposed to Python in January 1999 and since then
I have used it whenever I have the freedom to do so.  It is
a joy to work in this language.

Here are the details of my 6 years with Python in the
commercial world:


Jan 1999 - Nov 2000

I led a team of four to build a tool designed to import data
into a proprietary database engine from various data
sources.  One of the requirements was that it run on AIX,
Solaris and Windows.  The importer had to extract data from
the mainstream DBMSes and pull in data from peculiar text
formats.

To solve this problem, JDBC was the obvious choice, as all
the DBMSes provided a Java driver and ODBC was not suitable
for Unix.  So we wrote a JDBC driver for the text formats
and for our proprietary DBMS.  This left Python to glue
everything together or, more precisely, Jython was used as
the glue.

A GUI was written in Jython for the user to choose the
tables and columns to be imported and to specify any data
cleansing actions.  I wrote the engine that did all the
work.  This engine had to run as efficiently as possible
because customers routinely had gigabytes of data.  How to
do this in Jython?  I wrote Python code that generated a
Java program, which was then compiled and executed.  This
gave us the fastest implementation we could manage for
moving data from one JDBC source to another JDBC target.
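
A cut-down illustration of that code-generation step: given a
table name and its columns, emit the Java copy loop as text.
The class layout and names are invented for the sketch; the
real generated code also handled types, batching and the
cleansing rules.

    # Sketch only: generate Java source for a JDBC copy loop from
    # simple table metadata.  Class and method names are invented.
    def generate_copy_class(table, columns):
        cols = ', '.join(columns)
        marks = ', '.join(['?'] * len(columns))
        lines = [
            'import java.sql.*;',
            '',
            'public class Copy%s {' % table.capitalize(),
            '  public static void copy(Connection src, Connection dst)',
            '      throws SQLException {',
            '    Statement st = src.createStatement();',
            '    ResultSet rs = st.executeQuery("SELECT %s FROM %s");'
                % (cols, table),
            '    PreparedStatement ins = dst.prepareStatement(',
            '        "INSERT INTO %s (%s) VALUES (%s)");'
                % (table, cols, marks),
            '    while (rs.next()) {',
        ]
        for i in range(1, len(columns) + 1):
            lines.append('      ins.setObject(%d, rs.getObject(%d));' % (i, i))
        lines += ['      ins.executeUpdate();', '    }', '  }', '}']
        return '\n'.join(lines)

    if __name__ == '__main__':
        print(generate_copy_class('customers', ['id', 'name', 'joined']))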


Jun 2001 - Oct 2002

I travelled extensively supporting the application described
above.  Most of the work involved using Python to extract
data from disparate sources and to describe them so that the
JDBC driver we had written for text could provide the data
to the application.
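
The 'describing them' step looked roughly like this: a small
descriptor telling the text driver where each field sits in a
fixed-width file.  The descriptor syntax and field names shown
are hypothetical, not the real driver's format.

    # Sketch only: emit a field descriptor for a fixed-width text file
    # so a (hypothetical) text JDBC driver can expose it as a table.
    FIELDS = [
        # name         start  width  type
        ('person_id',      0,     9, 'INTEGER'),
        ('postcode',       9,     4, 'CHAR'),
        ('income',        13,    10, 'DECIMAL'),
    ]

    def describe(table, fields):
        lines = ['TABLE %s' % table]
        for name, start, width, ftype in fields:
            lines.append('  FIELD %-12s START %3d WIDTH %3d TYPE %s'
                         % (name, start, width, ftype))
        return '\n'.join(lines)

    if __name__ == '__main__':
        print(describe('census_person', FIELDS))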


Jan 2005 - Jan 2006

In the first half of 2005, I wrote a CORBA server in Python
using omniORBpy.  This server generated SQL as directed by
the client, sent it over an ODBC connection and collated
the results.
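
In outline, the SQL-generation side of that server turned a
client request (dimensions to group by, a measure to
aggregate) into a query string along these lines; the table
and column names are invented, and the omniORBpy and mxODBC
plumbing is omitted.

    # Sketch only: translate a simple OLAP-style request into SQL text.
    # Table and column names are invented; CORBA/ODBC plumbing omitted.
    def build_query(table, dimensions, measure, agg='SUM'):
        dims = ', '.join(dimensions)
        return ('SELECT %s, %s(%s) AS value FROM %s GROUP BY %s'
                % (dims, agg, measure, table, dims))

    if __name__ == '__main__':
        print(build_query('person', ['state', 'age_group'], 'income'))
        # -> SELECT state, age_group, SUM(income) AS value FROM person GROUP BY state, age_group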

In the second half of 2005, my brother was working at Bosch
in Stuttgart, and he told me about the work his laboratory
was doing, which sounded repetitive.  I was able to automate
their work by interfacing with their oscilloscopes,
analysing the data to isolate 'points of interest' and
charting the results.

I developed the solutions in Python on Linux and deployed
them on Windows.  The publication-quality charts and graphs
were made with matplotlib.  Optimisation of the analysis
process was written in C/C++, and communication with the
'scope was done via RS232 and pySerial.  The GUI was
developed in wxWindows.
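
For the capture side, the pySerial part was along these
lines; the serial port, baud rate and the command sent to the
'scope are assumptions (the real protocol depended on the
instrument).

    # Sketch only: read a block of samples from an oscilloscope over
    # RS232 using pySerial.  Port, baud rate and command are assumptions.
    import serial

    def capture(port='/dev/ttyS0', baud=9600, command=b'CURVE?\n',
                n_bytes=4096):
        ser = serial.Serial(port, baud, timeout=2)
        try:
            ser.write(command)        # ask the instrument for a trace
            return ser.read(n_bytes)  # raw bytes; decoding depends on the 'scope
        finally:
            ser.close()

    if __name__ == '__main__':
        data = capture()
        print('%d bytes captured' % len(data))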

Most communication was through email, which provided the
development documentation for the project.  Most features
were implemented within a few days of being requested.


Jul 2006 - Aug 2006

Originally, Intrepid wanted a LAMP+Python programmer to
help bring a project to conclusion.  On the same day I was
interviewed, the project was suspended after missing
milestone after milestone.  I proposed that I be contracted
to assess the health of the development and the team and to
report my findings.  The report was not favourable,
primarily because the project had accumulated 17MB of source
code (that's the size of the text of four Bibles) for a
straightforward forms-based web application.


Aug 2006 -

I am contracted to Boeing to migrate data from their existing
asset management systems to Maximo, an IBM product.
My primary tool is Python.


Appendix - C/C++ Skills

I studied Computer Science and Maths at the University of
Melbourne.  I completed a third-year subject in Software
Engineering in which I worked in a small team whose task
was to add HTML TABLE support to the Lynx web browser.  We
were one of only two groups who actually made it work, and
we got 24/25 for the project.

Then, my first job, at STR (a BI/OLAP tool maker), was
maintaining an ETL tool in MFC and developing a converter
from their old database format to the new format.  I wrote
the converter in C++ and designed it using the techniques
used in the STL's containers.  I developed data access
routines that treated a table as a container of rows over
which an iterator could be obtained and incremented with
'++'.
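
The same container-of-rows idea, sketched here in Python
rather than C++ for brevity; the class, file name and format
are hypothetical, but the shape of the design is the one
described above: the table hands out an iterator over rows
and calling code simply walks it.

    # Sketch only: a table as a container of rows, the Python analogue
    # of the C++ iterator design described above.  Names are made up.
    import csv

    class Table(object):
        # A table backed by a CSV file, iterable row by row.
        def __init__(self, path):
            self.path = path

        def __iter__(self):
            with open(self.path) as f:
                for row in csv.DictReader(f):
                    yield row

    if __name__ == '__main__':
        with open('assets.csv', 'w') as f:       # tiny demo file
            f.write('asset,site\nA-1001,PORTMELB\nA-1002,FISHERMANS\n')
        for row in Table('assets.csv'):
            print(row)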

I also maintained the ETL tool, which required a good deal
of refactoring.  I described working on this software as
hacking through dense scrub, going in circles and finding
that the path that had been cut had already grown back.  It
wasn't pleasant, but I learnt, through other people's
mistakes, how software can go bad even before it is
released.  This led on to my role as Team Leader for the
redevelopment of this tool.

As lead on the new ETL tool, I made the decision to choose
JDBC for the extraction and load processes.  This decision
fell out of the requirement for a cross-platform product: it
had to run on Windows, Solaris and AIX.  ODBC was too
Windows-centric, and Java promised the same (or
substantially the same) behaviour on all the targeted OSes.
Java's performance at the time was considered acceptable,
and it only got faster by the time the tool was released.
Most of the CPU time was spent in the database core, which
was written in C++.

My team's task was to build a basic SQL layer on top of the
C++ core (which had no query language, as such).  I directed
and directly contributed to this work in C++.  This
component was the most time consuming and complex task on
our path to release.

The GUI was written in MFC in Visual Studio to look best on
Windows, and CORBA was chosen to mediate between the GUI and
the engine.  I developed this interface, along with the code
to glue the GUI onto the CORBA interface, all in C++.
Meanwhile, the 'Server Team' (those developing the OLAP
engine) were running into trouble...

The Server Team were developing using Visual Studio, and
porting to Solaris and AIX was a task assigned to one
person.  It was decided to use gcc on all Unix platforms to
minimise the trouble of multiple compiler targets.  While
this worked well on Solaris, at the time the port of gcc to
AIX was unreliable.  Each week, I would see the 'porting
guy' flounder around trying to partition source files into
chunks that the compiler could manage without running out
of memory, and then, when it did compile, it would dump
core.  To add to the problem, gdb would fail while trying
to isolate the bug.

I told the development manager that this needed urgent
attention, and that I had some spare time as the ETL tool
was running ahead of schedule.  I took hold of this problem
and, after a number of conference calls with the gcc guy on
AIX (who was in IBM's employ) and with the xlC (Visual Age)
compiler support team, it became clear that gcc was not
going to do the job, and IBM pledged resources to get the
code compiling and running on AIX with xlC.  We received
the latest xlC by CD, and I rapidly started raising PMRs
(IBM problem management requests) against xlC.  Initially,
I had to port STLport to xlC, and a lot of the code I
ported found bugs in the compiler, which were promptly
fixed by the team in Toronto.  Once I had STLport working,
I had to address the myriad ways that the server used
templates (and the ways that the compiler didn't like).
Eventually, a compiler support person came to Melbourne and
helped me isolate compiler bugs and get speedy responses
from his colleagues.  Then AIX's new debugger (idebug)
wouldn't work...  To cut a long story short, I ported the
server to AIX, and I learnt that C++ is a complicated
language that needs to be treated with some respect,
particularly in relation to templates.

In 2001, I left STR for Ericsson and the Lodbroker project.
Lodbroker was a product that provided load balancing for
web servers: a mediator that decided which backend server
should receive an HTTP request.

At the time I joined, they were worried about the product's
performance.  I attacked this issue by looking at writing a
kernel module that did the routing work in-kernel.  I based
my work on IPfilter and initial results showed a 2x speedup.
Unfortunately, Lodbroker was wound up in May 2001.

After taking time out to renovate our house, I went back to
STR to help them on a research project to evaluate the
potential to use their products as an OLAP UI with Oracle
or another RDBMS as the back end.  I wrote this in Python,
optimizing in C++ and using CORBA to glue the UI onto ODBC
on Windows.

More recently, I developed a tool to help Bosch analyse
results from oscilloscopes.  Again, I used Python, with C
code interfacing to the hardware over RS232.

Now, in my spare time, I am writing an interpreter and compiler
for Scheme (a Lisp) in C.


