Breaking camp early in the morning on the 4th of July, we made the 8+ hour drive from Lake Shasta to Hood River, OR. Driving along the banks of the Columbia River provided a nice view of two waterfalls, and as we drove I thought about Lewis and Clark, and their most excellent adventure which took them from the interior of the then-new country all the way out West.
We spent two nights at my sister-in-law's house, and had a nice time watching the fireworks display on the 4th. The following day, we drove around the countryside, bought locally made fresh fruit products (jams, pies, etc.), and saw Mt. Hood. At one of our stops, the kids fed a few goats, which they enjoyed.
After two nights, it was time to head further North once again...
Monday, August 28, 2006
Wednesday, August 16, 2006
Porting a Win32 Delphi App to .net - Part 2
When we last left our hero, he was struggling to figure out why the .net compiled engine was so damn slow. When it comes to diagnosing performance issues with an application, the smart programmer reaches for a profiler.
From the wikipedia article:
A profiler is a performance analysis tool that measures the behavior of a program as it runs, particularly the frequency and duration of function calls.
Knowing what I needed to do, I sought out the tools which would allow me to profile the library. I downloaded Microsoft's CLR Profiler (there are separate versions for .net 1.1 and 2.0). However, after installing it, I could not get it to work.
Next, I searched Google for .net profilers that support Delphi. One of the first ones I spotted was called ANTS, by Red Gate Software. It only supports .net languages, so the profiler wouldn't be able to help me with the Win32 side of our product line, but I gave it a go anyhow - and it worked beautifully and intuitively.
Finally, I investigated a profiler that would work with both our Win32 library and our future .net library. The only one that met these criteria was AQTime, by AutomatedQA. Unfortunately, to use AQTime, you must have Administrator privileges. I only use my administrator account for system maintenance, so this was a deal breaker for me. It will also be the subject of an upcoming post on dumb-ass software development for the Windows platform.
So ANTS it was! Using the profiler results from the 15 day free demo, I was quickly able to see where a large chunk of time was being spent - and it wasn't in any code I wrote! One of the data structures in the library is a very large array, which is exposed as a property of one of our objects. It seemed as if a call to initialize this large array was occurring way too many times, since only one instance of it should have been created.
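To show the general workflow rather than the specific ANTS session (which isn't reproducible here), the sketch below uses Python's built-in cProfile as a stand-in for a .net profiler. The `rebuild_table` function deliberately mimics the bug described above: a large array being re-initialized far more often than it should be, which a profiler surfaces immediately by call count and cumulative time.

```python
import cProfile
import io
import pstats

def rebuild_table():
    # Deliberately wasteful: builds a large list on every call, mimicking
    # the repeated array initialization the profiler uncovered in the post.
    return [0.0] * 100_000

def calculate():
    total = 0.0
    for _ in range(50):
        table = rebuild_table()   # hot spot: 50 needless rebuilds
        total += table[0]
    return total

profiler = cProfile.Profile()
profiler.enable()
calculate()
profiler.disable()

# Render the top entries sorted by cumulative time, like a profiler's
# hot-spot view: rebuild_table shows up with a call count of 50.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
print(report)
```

The point is the same one the post makes: the profiler doesn't tell you *why* the array is rebuilt, but it tells you exactly *where* the time goes, which narrows the search enormously.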
Stumped, I turned to ildasm - Microsoft's Intermediate Language DisASseMbler. It allows you to look at the byte code generated by the .net compiler, and see what the heck is going on at a much lower level. Sure enough, in one routine I saw that it was requesting seven local copies of the large array I mentioned above - each of which needed to be initialized, and copied!
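Python ships an analogous tool in its standard library, the dis module, which disassembles a function into the byte code the interpreter will actually execute - the same drop-down-a-level move the post makes with ildasm. A minimal sketch:

```python
import dis

def read_first(table):
    # A trivial function; dis shows the byte code the compiler emitted
    # for it, which is where hidden temporaries and copies become visible.
    return table[0]

instructions = [ins.opname for ins in dis.get_instructions(read_first)]
print(instructions)
```

Looking at the emitted instructions (rather than the source) is exactly how one spots operations the compiler inserted that never appear in the code you wrote - like the seven local array copies described above.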
Investigating further, I found the cause of the problem. Instead of using the private field name for the large array (e.g. FArray), a property (e.g. Array) with get and set methods was being called, and for some reason this resulted in local copies being created. By replacing Array with FArray, I saw the calculation time plummet from the 2+ seconds observed before, down to around 1/3 of a second.
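The mechanism can be sketched in Python terms, with the caveat that the original is Delphi for .net, where a static array is a value type and the property getter itself produced the copy. Here the getter copies explicitly to model the same hidden cost: every access through the property pays for a full copy of the array, while reading the private field does not. The class and names below are illustrative, not the author's actual code.

```python
class Engine:
    """Toy model of the post's object: a large array behind a property."""

    def __init__(self, size=100_000):
        self._table = [0.0] * size   # private field (FArray in the post)

    @property
    def table(self):
        # Returns a fresh copy on every access - the hidden per-access
        # cost that the profiler exposed (Array in the post).
        return list(self._table)

engine = Engine()

# Every property access hands back a brand-new copy of the whole array...
assert engine.table is not engine.table

# ...while the field is always the same underlying object, copy-free.
assert engine._table is engine._table
```

In a tight numerical loop, swapping the property access for the field access removes one full-array copy per read, which is exactly why the post's change produced such a dramatic speedup.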
I cut the calculation time by another 10% by making targeted fixes and then measuring the results. Since then, I've further profiled the library, tuning the code most likely to benefit, and not worrying about code which has little impact on the overall time. Just this morning, I profiled the calculations again and shaved off a bit more. Now, I believe I am at a point of diminishing returns, and will stop profiling and optimizing until a later time.
Next step: see if I can use the library with Mono on Linux!
Friday, August 11, 2006
Porting a Win32 Delphi App to .Net
I have been using Delphi since version 1.0. When Delphi first appeared on the market, it blew the doors off Visual Basic 3.0. It was a rapid application development platform which compiled code down to native machine code, and allowed the creation of all-new visual components using the same Delphi environment. Delphi used a beautiful object-oriented language descended from Turbo Pascal, which had proven itself in past Borland products. Since everything was compiled into a single EXE or DLL, you avoided the Visual Basic DLL/VBX version hell. Delphi version 2.0 brought 32 bit support, and continued dominance over VB 4.0.
And now, to the present! The company I work for provides a fairly complex loan calculation engine, which is written using Borland Delphi for the 32 bit Windows platform. Lots of numerical calculations wrapped in iterative loops, etc. Since it is a standard, 32 bit Windows DLL, we can provide many different ways for our customers to access it:
- The standard LoadLibrary() / GetProcAddress() Win32 routines.
- We provide a wrapper class for those developing on the .Net platform in VB or C#.
- If using Java on Windows, we even provide a Java wrapper class which accesses the DLL via JNI.
- If using Delphi, you can link to it in code easily.
- We even provide a Win32 service which listens on a specified TCP/IP port to service requests from non-Win32 platforms.
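Run-time dynamic loading, the first of those access paths, has close analogues outside Win32. As an illustrative sketch (not the engine's real API - its DLL name and exports aren't given in the post), here is the same pattern with Python's ctypes, using the C runtime's abs() as a stand-in for an exported calculation routine; CDLL plays the role of LoadLibrary() and attribute lookup plays the role of GetProcAddress():

```python
import ctypes
import ctypes.util

# Load a shared library at run time (LoadLibrary's moral equivalent).
# The C runtime stands in for the loan-calculation DLL; fall back to the
# process's own global symbols if find_library can't locate libc.
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)

# Declare the signature of the routine we're about to call
# (GetProcAddress's moral equivalent is the attribute lookup itself).
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

result = libc.abs(-42)
print(result)  # 42
```

This is the appeal of shipping a plain native DLL: nearly every platform and language has some way to dlopen/LoadLibrary a shared library and call its exports.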
However, a situation has come up wherein a potential client who uses C# to develop their application has requested that we port the DLL over to the .Net platform. Why? Because one of their huge selling points is that, "Our software is pure .Net".
I won't go into my initial "WTF?" reaction, and rant against rewriting a perfectly good and accessible calculation engine for the sake of what amounts to code-related religious fanaticism. When it comes down to it, the customer asks, and we do what we can to accommodate.
With this in mind, we looked at a few possible options: rewrite the 70K+ lines of code in C#, license our source code to the prospective customer and have them rewrite it, or investigate Delphi for .Net.
We dismissed option #1 as being too time consuming for a small company, considered option #2 sub-optimal, and have for now gone with option #3. In about a week, I was able to compile a portion of our calculation engine DLL using Delphi for .Net.
And then I ran a test calculation... and waited... and waited. It took 2 seconds to complete a simple calculation which would have been done in the merest fraction of a second using the native Win32 DLL. We are talking about the .Net build taking roughly 100 times longer. Le ouch!
So now I need to look at profiling the .Net assembly, and try and figure out what the hell is causing the problem. The .Net platform can't be this horribly slow with numeric calculations, can it? It must be some sort of Borland.Vcl.dll .net implementation issue, right? Right!?!?!?
Gahhhhhhhhhh! If anyone has any ideas and/or suggestions, I'm ready and willing!