This section contains an informal essay on my career. The essay attempts to answer questions that are often asked in professional interviews; it covers 18 years of my professional experience and is therefore quite lengthy. If this is more information than you are looking for, please go to the Resume section.

I come from a family of engineers, which is perhaps the primary reason I decided to study Systems Engineering, graduating in 1994 from the National University of Ukraine in Kiev, Ukraine. My major was CAD in Electronics, with a focus on ways to automate the design and testing of various digital components. It is a five-and-a-half-year program which, among other things, requires students to complete a few formal electronics designs and defend a thesis in order to graduate.

My career began in 1991 with a part-time Developer position (I was still attending the university) at Ukrenergoprom, a power plant design company. This company of roughly 500 employees was looking for someone to automate the preparation of bills of materials, which was a manual task at that time. This involved designing and implementing a way to support a flexible general classification, maintain equipment and materials records under this classification, and compose, store and process multiple specifications for the structured facilities designed by the company. I used Clipper/dBase to develop the solution under MS DOS and stayed with the company for 4 years, expanding the system to suit the needs of various departments. Clipper was a great tool, but it lacked an IDE, and it was impossible to buy a third-party one in Ukraine. I was studying OOP at the university around the same time and decided to undertake a project to develop such an IDE in Turbo C++. It was a functional equivalent of MS Edit with the ability to launch an external batch file, which would run the Clipper compiler and then, provided there were no errors, launch the executable you were working on (a rough sketch of that loop follows this paragraph). This project allowed me to learn a great deal about the practical application of object orientation and polymorphism. Although there were times in the early 90s when it seemed the country just might pull it off, by 1994 the economic and political situation had deteriorated so much that I decided to immigrate to Canada under the government program for skilled professionals - it was time to look for work overseas.
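
The sketch below is mine, written in present-day C++ rather than the original Turbo C++, and the file names build.bat and app.exe are placeholders rather than the actual project files; it only illustrates the compile-then-run idea the IDE automated.

    #include <cstdio>
    #include <cstdlib>

    // Shell out to a batch file that runs the Clipper compiler and linker,
    // then launch the freshly built program only if the build succeeded.
    int main() {
        // system() typically reports the command's exit status (0 means success).
        int buildStatus = std::system("build.bat");   // placeholder build script
        if (buildStatus != 0) {
            std::printf("Build failed (status %d); staying in the editor.\n", buildStatus);
            return buildStatus;
        }
        return std::system("app.exe");                // placeholder for the program being developed
    }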

Arriving in Canada in August 1995 in the midst of a recession, I got a job by November developing off-the-shelf financial planning software for RAM Technologies, a small software vendor with only five employees. The package was called Wealth Creator and is still available, albeit in repackaged form, from Investors Source, the company that eventually acquired the rights to it. It allowed a personal financial planner to perform over a hundred financial calculations, generate statements of Income and Net Worth, Retirement Plans, etc. I used Visual Basic 4 to develop a modern UI for the package, while continuing to store customer data in flat files to minimize our dependency on third-party software (the product was shipping on 3 diskettes). Another challenging initiative was developing financial calculators using the Component Object Model (COM), which had just been released by Microsoft and for which very few tools were available on the market. The company was using Borland products at the time, which prompted me to use the Borland OCF library to develop the components. By 1997, having completed a few releases of Wealth Creator, I felt that I had outgrown the company and was ready for another challenge.

The challenge came in November 1997 under the name of Benton Associates Inc., a company of about 30 software consultants headquartered in Toronto with offices in New York, Boston and London. Benton Associates specialized in consulting on FAME, a combination of a time series database and a runtime 4GL environment with features designed specifically for the financial industry. Our typical customer would purchase a FAME license, acquire data feeds from a data vendor, and hire Benton to develop the data loaders and the specific analytical tools required to support their business. Projects ranged from two weeks to three months and consisted of the design and development of small software packages in FAME 4GL, MS Excel, C++ and Perl to manipulate and analyze equities and derivatives data. The packages were developed to run on the Sun Solaris or Windows NT operating systems and at times involved cross-platform development. This experience of working directly with demanding broker/trader customers and delivering high quality under time pressure proved invaluable for my development as a consultant. In addition, I was assigned to work on Benton's Historic Market Data Risk (HMD Risk) product, which handled the acquisition, validation, maintenance and distribution of historic market data. The main purpose of the product was to cleanse market data of the irregularities caused by invalid data points and sudden market spikes before feeding it into a VaR calculation engine. Benton Associates had about 10 customers using HMD Risk at the time, and its development was an ongoing multi-year effort, which prompted me to start exploring UML, RUP, design patterns and other ways to approach large-scale software development in a "civilized" manner. Unfortunately, my view on organizing the development of the product did not align with that of Benton's management, which treated product development in the same manner as a two-week consulting project.

This time I was looking for a position at a company with ample opportunities to learn the handling of large development projects involving specialized teams, and, of course, a chance to work on such projects. My agent referred me to SRG Software, a consulting company that was the first Microsoft Solution Provider registered in Canada (and would receive the Microsoft Canada Partner of the Year Award in 1999). I went for an interview, immediately "clicked" with their Chief Architect Andrew Trossman and was hired on the spot. The company specialized in converting mainframe terminal and DOS-based clients to Windows for banks and insurance companies. I worked on a few such projects, some of which were "paid-for quotes". SRG was an amazing place to work, mainly because almost all of the "between projects" time was allocated to personal development. A few senior types staged internal seminars on advanced topics, and the company also offered public courses on COM, MSF and UML. These courses could use some spare hands, and I was invited to lead a few sessions on COM. One of my first projects was to develop an IDispatch Adapter providing a stub to access any COM component with a type library (but no support for IDispatch) from a scripting language (a rough sketch of the technique follows this paragraph). The component was used in course work (as one large example of COM's inner workings) and in an internal project to develop a generic invocation mechanism over HTTP (this was pre-SOAP times). One of our clients at the time was Financial Models, a company that soon decided it was cheaper to acquire us than to continue paying our bills. While the acquisition was under way, we were contacted by the Canadian Imperial Bank of Commerce to bid on a project to convert their COINS branch server system from OS/2 to Windows 2000 Server. The COINS branch server ran a set of processes that essentially supported all retail branch banking functionality by handling requests from teller workstations over a NetBIOS interface. There were numerous message types to be handled, including SNA mainframe transactions. The system ran on a branch server in each of CIBC's 1,300 branches across Canada. The challenge was to replace the server system without touching the teller software, which would continue to happily run its MS DOS application on the new Windows 2000 teller workstations (which were replacing the PS/2 workstations). We ran a quick three-week project which involved a few interviews with CIBC staff, a lot of source code browsing, outlining the proposed solution, estimation and bid preparation. Our bid was accepted and I was appointed technical lead on the project. As such, I developed a functioning prototype and the architecture, and went on to lead a team of 15 on this nine-month, $3 million project. The client later admitted that, for them, this was the first software project of such magnitude to complete on time and within budget. This was truly a great experience, perhaps the most rewarding project I have ever worked on! However, by the time we went into production, our consulting business had dried up and I was moved into product development by our new owner, Financial Models.
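
For readers curious about the mechanics, the sketch below is my own reconstruction from memory of the standard OLE Automation technique, not SRG's actual code, and the class and member names are mine. The idea is to implement IDispatch on a thin wrapper and delegate name lookup and invocation to the wrapped component's ITypeInfo (obtained from its type library) through DispGetIDsOfNames and DispInvoke.

    // Link with oleaut32.lib; error handling trimmed for brevity.
    #include <windows.h>
    #include <oleauto.h>

    // Wraps a COM object that has a type library but no IDispatch, exposing it
    // to scripting clients by answering IDispatch calls from its ITypeInfo.
    class DispatchAdapter : public IDispatch {
        IUnknown*  m_inner;     // the wrapped, non-dispatch COM object
        ITypeInfo* m_typeInfo;  // type description of its custom interface
        void*      m_itf;       // pointer to that custom interface on m_inner
        LONG       m_refs;
    public:
        DispatchAdapter(IUnknown* inner, ITypeInfo* typeInfo, void* itf)
            : m_inner(inner), m_typeInfo(typeInfo), m_itf(itf), m_refs(1) {
            m_inner->AddRef();
            m_typeInfo->AddRef();
        }
        // IUnknown
        STDMETHODIMP QueryInterface(REFIID riid, void** ppv) {
            if (IsEqualIID(riid, IID_IUnknown) || IsEqualIID(riid, IID_IDispatch)) {
                *ppv = this; AddRef(); return S_OK;
            }
            return m_inner->QueryInterface(riid, ppv);
        }
        STDMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&m_refs); }
        STDMETHODIMP_(ULONG) Release() {
            ULONG count = InterlockedDecrement(&m_refs);
            if (count == 0) { m_typeInfo->Release(); m_inner->Release(); delete this; }
            return count;
        }
        // IDispatch: everything is answered from the type library.
        STDMETHODIMP GetTypeInfoCount(UINT* pctinfo) { *pctinfo = 1; return S_OK; }
        STDMETHODIMP GetTypeInfo(UINT, LCID, ITypeInfo** ppTInfo) {
            *ppTInfo = m_typeInfo; m_typeInfo->AddRef(); return S_OK;
        }
        STDMETHODIMP GetIDsOfNames(REFIID, LPOLESTR* names, UINT cNames, LCID, DISPID* dispIds) {
            // Map method/property names to DISPIDs using the type information.
            return DispGetIDsOfNames(m_typeInfo, names, cNames, dispIds);
        }
        STDMETHODIMP Invoke(DISPID dispId, REFIID, LCID, WORD flags, DISPPARAMS* params,
                            VARIANT* result, EXCEPINFO* excepInfo, UINT* argErr) {
            // DispInvoke unpacks the VARIANT arguments and calls the matching
            // vtable method, as described by the type info, on the wrapped interface.
            return DispInvoke(m_itf, m_typeInfo, dispId, flags, params, result, excepInfo, argErr);
        }
    };

The real adapter naturally involved more than this (loading the type library, locating the right interface, proper error reporting), but delegating to the type information is the core of the trick.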

I started at Financial Models (presently SS&C Technologies Inc.) in July 2001 as the Application Architect, Front Office Products. Financial Models specialized in developing hosted solutions for investment managers in Canada and the US. The Front Office line included Portfolio Modeling (FMC Model™) and Trade Order Management (FMC Trade™). Both products were enterprise-level systems built on a proprietary message-based platform utilizing IIS, MSMQ, ASP, COM+, MS SQL Server and Active Directory. I was directly responsible for establishing a formal development process in a group of six business analysts and about 15 developers. Prior to my arrival, requirements were managed as loose bulleted lists in disparate documents; estimates were prepared by the Development Manager without consulting the developers; and most of the developer and QA time was spent going back to the business analysts about things that did not fit the bulleted lists (about 90% of the functionality). As you can imagine, the latter was a tremendous waste of time, and my mandate from the CTO was to resolve this inefficiency. I started by conducting a few sessions with the Product Management Team, explaining the benefits of a formal development process based on a set of agreed-upon artifacts. I then proceeded to introduce the team to use cases as the main vehicle for communicating requirements. I had some templates I had developed during my previous consulting assignments; we modified them slightly to better suit the domain and began mapping requirements into structured lists of features and, further, into well over a hundred use cases. My approach was (and still is) to define a use case as a larger and somewhat abstract block of functionality and then partition it into concrete scenarios to be detailed. A scenario is thus treated as an instance of a use case, and the relationship between the two is not unlike that of an object and its class. To give you an idea of the scope, we were looking at detailing between four and five hundred scenarios, some of them over fifty steps long. At the same time, sessions were conducted with the Development Manager and the Development and QA Teams to establish a new estimation and milestone review routine. Assisting product managers with technical writing and overseeing the implementation of the new process over two consecutive releases took up most of my time at the company. The new requirements format resulted in considerable time savings in inter-department communications, while the new estimation process allowed developers to participate directly in establishing the coding schedule and eventually brought the actuals to within 10% of the estimates. Additional benefits included the ability to support change management and the emergence of a controlled environment with a great level of vertical transparency. My engagement continued until November 2002, when most of the planned development was complete and the company could no longer afford to keep me on a permanent basis.

Armed with the experience gained at Financial Models, I had an easy time persuading my former employer RAM Technologies (which had grown and changed its name to Investors Source) to take me on for a six-month contract to improve their development practices and help the team move their client-server and desktop products online. I introduced the Development Team to XML/XSLT, which improved code maintainability and substantially reduced report development time. I also designed and implemented (in Visual C++ and Visual Basic) a framework for loading data into Investors Source's desktop products. I liked consulting on my own, and the relative freedom and financial benefit it offered. My next contract, with CIBC World Markets, came through an ex-colleague from Benton Associates. I was hired as a Senior Developer to improve internal market data processing utilities and aid in the development of a new, integrated system to provide clean data for internal risk engines. I used FAME 4GL, Perl, Unix shell scripting, Java and MS SQL Server. After a year with the bank, when most of the projects were completed and my contract term was winding down, I received an offer for a full-time position of Senior Application Designer with the San Francisco Bay Area EAM vendor Spear Technologies. I did have my reservations about returning to full-time employment; however, the job presented me with the opportunity to return to enterprise-level systems, the compensation was suitable and, having weighed all the pros and cons, I decided to accept.

Headquartered in Oakland and employing about 70 people, Spear Technologies was probably the smallest EAM vendor on the market in 2004. It had, however, previously managed to win some large accounts: NY MTA, LA MTA and Amtrak, to name just a few. Spear's system, Spear 3i, is capable of covering all aspects of the operations of the above enterprises. Its breadth and depth make it hard to describe in terms of "from" and "to". It can best be described as a layered stack or a pyramid. The base of the stack is record maintenance for rolling assets, spare parts, facilities and employees. Moving further up, you would find things like Work Order Management, Inventory Accounting, Warranty, Timesheets, etc. The next layer comprises modules like Program Maintenance (rotating schedules) and Campaigns (large internal initiatives). It all culminates in Facility Control and the Executive Dashboard, which were under development when I started with the company. The purpose of such a system is to maximize the return on assets through increased asset utilization and lifespan, minimized inventory levels and enhanced workforce productivity. As a member of the Design Team of six, I was responsible for the requirements specifications of the Work Order Management, Inventory Accounting and Warranty modules. On a cyclical release basis (typically 4-6 months), my job involved conducting a series of interviews with members of the Product Management Team to define a set of new features and modifications to be introduced into the product. I would then elaborate on the features, break them into use cases, write up step-by-step scenarios, prepare UI prototypes and propose adjustments to the core database model. At this point there was a milestone review with representatives of the Product Management and Development Teams and the entire Design Team (to ensure consistency). With the designs reviewed and approved, the remaining gaps were quickly filled in and the requirements specification was passed to the developers for coding. All in all, the routine was quite efficient, although I did suggest a few improvements to the requirements templates, which were gladly accepted. Technologically, the company offered a client-server system for employees with workstations, stationary kiosks for workshop and "punch-card" employees, Windows CE handhelds with barcode readers for roaming warehouse staff, and web-based products for remote employees and executives. The modules I was responsible for spanned all of the above technologies, making it quite a challenge to address all of the aspects in the same release. One of the exciting projects was the blitz at one of our largest customers to document the inner workings of the Spear 3i-to-ASRS interface implemented by an external contractor. The customer refused to sign off on the entire project without documentation on this part, while the complexity of the interface was such that the contractor could only bluntly state that it "can never be documented, period". It took me three weeks of interviews and code browsing to identify, specify and detail about 30 use cases triggered by various events in the Spear 3i system, each resulting in transactions sent to the ASRS. The level of detail and the strict format of the documentation prompted the customer to accept the very first edition of the document and allowed us to obtain the signoff on the entire $21 million project shortly thereafter. In 2006, ongoing financial difficulties at Spear caused our VC to put the company up for a fire sale. The new owner, Hansen Inc., stalled further development, laid off about half of the staff, closed the office and moved the remaining employees, including myself, into "work from home" maintenance mode.

I started looking for opportunities through friends and ex-colleagues and soon connected with Alex Nizhniy, the CEO and founder of Houston-based Federal Schedulers LTD. Alex was looking for someone to head the development of his new venture, a nationwide real estate scheduling and feedback system. After a few meetings, Alex was only too happy to meet my targets, and in July 2006 I started as the telecommuting Director of Development for Federal Schedulers LTD. AccuShow.com went live in November 2006 and served realtors nationwide for five years before being acquired by the largest provider of such services in the United States, Centralized Showing Service.