Thursday, December 26, 2013

How Do You Debug Your Application?

I've been discussing my foray into programming again. There were times when I questioned this decision...mostly as I rediscovered why people write about best practices in programming and application design, and as I got a better grasp of the thinking behind some of the questions asked on the Programmers Stack Exchange site.

I'm nearing my self-imposed deadline for a beta (well, an alpha release is probably more accurate) before I cut this project loose and start moving into something new. And of course, I ran into a glitch during what was supposed to be a simple feature addition.

What was supposed to be a simple sample program had become a bit of a mishmash as its goals shifted, so the simple thing I had originally mapped out in my head was now a little more complicated. How was I going to troubleshoot this problem?

I did it in what is probably the hard way. I placed messages at various spots along what I had traced to be the application's flow of logic. If the code was about to call a subroutine, I placed a message box with a diagnostic message so I'd know the program had hit that particular area of code and, when appropriate, what the state of a particular variable was.
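To make that concrete, here is a minimal sketch of the technique. The post never names the language or the routines involved, so this assumes a Win32/C++ program; ProcessOrder, UpdateTotals, and orderCount are hypothetical stand-ins:

    #include <windows.h>
    #include <string>

    // Hypothetical subroutine standing in for whatever the real program calls next.
    void UpdateTotals(int orderCount) { /* ... the actual work ... */ }

    void ProcessOrder(int orderCount)
    {
        // Diagnostic checkpoint: prove the program reached this spot and
        // report the state of a variable of interest before the call.
        std::string msg = "About to call UpdateTotals; orderCount = "
                        + std::to_string(orderCount);
        MessageBoxA(nullptr, msg.c_str(), "Debug trace", MB_OK);

        UpdateTotals(orderCount);
    }

    int main()
    {
        ProcessOrder(3);
        return 0;
    }

The obvious downside is that every checkpoint halts the program until someone clicks OK, which is part of why this counts as "the hard way."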

This way of thinking probably goes back to my first programming experiences with environments like the built-in BASIC interpreters on the Commodore and the Apple II. Even when I graduated to QBASIC-esque variants, there weren't many great options for debugging. Later, in college, I was working in C++ IDEs where the debugging facilities had improved, but we rarely used them. The assignments tended to be small, and inserting hand-made breakpoints into the application was enough to get the assignment completed on time and in usable shape.

There wasn't really any emphasis on using the debugging tools.

I suppose at some point the idea of inserting code to tell me what was going on at certain points in an application became a "cross-platform" way to debug a problem. I didn't have assignments or projects so large that specialized tools were necessary, and even when I did need a debugging tool, I was wary of investing a lot of time in learning one that would only work with one environment on one platform. It reminds me of support issues past, when people flipped out because a menu changed in a new release of Office; they had become dependent on a sequence of clicks to accomplish a goal rather than the skills to navigate to the feature they were seeking.

At the same time...like when I was creating this application...there were points when I wished I could easily have the IDE step through each line of execution so I could see exactly what happened at each point in the flow. Part of me suspects this has to be built into the Visual Studio IDE and is blatantly obvious to seasoned programmers...there are just so many buttons that by the time I figured out how to do what I wanted, I'd probably have solved my problem three times over using my rudimentary system of debugging.
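(For the record, that feature does exist: in Visual Studio's default key bindings, F9 toggles a breakpoint, F5 runs until it is hit, F10 steps over a line, and F11 steps into a call. There is even a programmatic way to stop the debugger at a spot in the code. A minimal sketch, assuming the Visual C++ compiler; ComputeBalance and its parameters are made up for illustration:

    #include <windows.h>   // IsDebuggerPresent
    #include <intrin.h>    // __debugbreak, an MSVC intrinsic

    int ComputeBalance(int deposits, int withdrawals)
    {
        int balance = deposits - withdrawals;

        // Programmatic breakpoint: under the Visual Studio debugger, execution
        // halts here exactly as if a breakpoint had been clicked in the margin,
        // and F10/F11 can then step through the lines that follow.
        if (balance < 0 && IsDebuggerPresent())
            __debugbreak();

        return balance;
    }

    int main()
    {
        return ComputeBalance(5, 8) < 0 ? 1 : 0;
    }

The IsDebuggerPresent guard keeps the program from crashing when it runs outside the debugger.)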

It occurs to me that I never ran into a case where debugging tools were taught, required, or demonstrated. "Insert code here to tell the programmer what is happening at this point" was portable...it worked in compiled and scripted applications and in most languages...and it seemed rather intuitive.
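One common refinement of that portable approach, sketched below in C++, is to make the trace messages a compile-time switch so they cost nothing in a release build. The TRACE macro and the DEBUG_TRACE flag are my own names, not anything from the project described above:

    #include <iostream>

    // Build with -DDEBUG_TRACE (or /DDEBUG_TRACE under Visual C++) to turn the
    // messages on; without the flag, each TRACE line compiles away to nothing.
    #ifdef DEBUG_TRACE
    #define TRACE(msg) (std::cerr << "[trace] " << msg << '\n')
    #else
    #define TRACE(msg) ((void)0)
    #endif

    int main()
    {
        int total = 0;
        for (int i = 1; i <= 3; ++i)
        {
            total += i;
            TRACE("after adding " << i << ", total = " << total);
        }
        return 0;
    }

Writing to the console instead of popping a message box also means the program doesn't pause at every checkpoint.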

So how much am I missing in approaching debugging this way? Is this even a legitimate method used by professional programmers?

Is there a decent tutorial for a preferred method of debugging, or how to approach debugging? Or is debugging another skill acquired with experience?

I suppose I should fret over this more, but my next project will involve playing with some Ruby on Rails, where Visual Studio's built-in debugging tools won't be of much assistance. That also means figuring out a different way to debug applications with the change in platform...
