Wednesday, July 25, 2018

Our First Official Trike Ride

I've been focusing recently on weight and health. Haven't really been talking about it much for a few reasons (except on the Geeking After Dark podcast...) but the biggest recent development was the acquisition of recumbent trikes.

It had been quite an adventure in itself (dear Customs and Border Protection: eff you) but my son and I finally got our modified First Avenue trikes!

At first we primarily rode in our driveway; we made 6 circuits the first day, then 5 the next. It was our basic shakedown, learning the handling on the trikes and getting a feel for the brakes and seating. We also needed to build familiarity with the shifting, because our trikes are equipped with Nuvinci hubs instead of relying on derailleurs for changing distinct gears.

(By the way, the Nuvinci hubs are utterly amazing...expensive, but amazing.)

There was a lull in riding because Little Dude had a friend over for a few days, and after that, weather decided it didn't want to cooperate. But the weekend rolled up, the rain broke, and we were determined to try riding on the road!

We broke down the trikes; in order to rack them, the seats, trunk bags and accessories have to be removed. Once the trikes were secured to the car rack, we got the equipment fit into the back seat of the car and filled our insulated water bottles (flavored with tablets that add caffeine, vitamins, and some refreshing...somethings...to keep you from feeling like you want to pass out while exercising) before trekking to a valley area about half an hour from our home.

The area we rode in is near my childhood home; I was relatively familiar with it from working at a historic site and attending the church in that valley. Other than those two connections, the valley is populated mostly by farmers, and the route itself is a closed, paved loop.

The closed circuit meant mostly local traffic, and the road itself was decently paved, two lanes wide (without a center line, though, and kind of narrow in places; still, I figured the sparse traffic would make this a nice introduction to road riding).

In my head I picture the path as an elongated 2-dimensional Pokeball; the top half is a higher elevation, and the middle of the circuit is bisected by a packed dirt/gravel road, which would have cut the travel in half and avoided having to climb the hill to the upper part of the road circuit.

I had forgotten about that bisecting road; my father reminded us about it when asking about our route. I have been exercising a little using a pedaling machine under my desk (not a perfect simulation of a recumbent bike, but better than nothing) for several months as well as some basic workout routines from a fitness specialist. My son hasn't been working out; aside from our rounds on the driveway (which, to be fair, is about 600 feet long and rises a little under 20 feet from entrance to parking flat) riding the recumbent on the road was kind of cold turkey exercise for him.

We de-racked the trikes and re-equipped them, mounting the trunk bags, water bottles and seats. We sat down, adjusted mirrors, and I explained the proposed route along with reiterating my warnings about watching for traffic and staying to the right (I was perpetually anxious about our road riding since he is not experienced with driving, let alone traveling on the road.)

"Little Dude, we have two options. The first is the one I was first thinking of...it's paved, but it's longer, and there's a big hill climb to deal with. The other is the one Grandpa mentioned; it's a packed gravel road that cuts travel in half and avoids the hill, but it's going to be bumpy. Which one should we take?"

He thought about it for a moment and said, "We'll take the long way."

I was so proud of him!

"Okay. Ready?...let's go!"

I launched Strava, an app on my iPhone for tracking our exercise stats, and we started our ride. The first part was relatively flat; some small inclines, but nothing we couldn't really handle. Little Dude was slower due to not having worked out and developed the leg muscles needed for the leg presses that pedaling recumbent trikes model. I did get ahead of him at times, but I kept an eye on the mirror and if he started to fall pretty far behind I'd pull off and wait for him to catch up.

We did pass some family friends who were out for a walk with their dog. I didn't know they even had this dog...I told my parents that I didn't realize they had a pet bear, because this thing is the size of a polar bear cub. I mean, it's HUGE. And fluffy. It was a giant white furball the size of a dwarf horse, and it was at least as tall as my head was positioned on the recumbent trike.

"Oh, that's Rufus," my parents said.

"Rufus. They named this bear-sized dog Rufus." I had trouble wrapping my head around the juxtaposition of a pet named Rufus that looked big enough to pull a sled of kids.

We said hello as we rolled by them and the dog just sort of gazed quizzically at the two overweight riders on the weird tadpole machines; I was thankful it didn't decide we were invading its space and attack or bark, as I was certain the force of the bark might blow us into the corn field.

At one of the pause points, I pointed out that we were approaching the hill.

"Last chance. We can turn here and take the halfway road, or we continue up that," I said as I gestured towards the visible escalation in pavement.

Little Dude rested a few minutes, took a swig of water and said he was ready to go up the hill.

Oh gawd...the hill was tougher than I thought. I had to stop a few times on the incline, as my muscles would hit the point of failure. If I was hitting that, I knew Little Dude was having it harder than I was. I could see in my mirror that he was stopping along the road, but after a few minutes, I'd again see his feet pumping the pedals as he made progress forward again.

We paused several times. I didn't mind; I was amazed Little Dude, who was not accustomed to this kind of physical work, was still soldiering on. Forward progress was forward progress!

He caught up to me. I pointed at the house in front of us where the road sharply curved into the grove of trees; "We're not far now. Once we hit that curve, we not only have shade, but the road doesn't keep climbing like this."

He didn't really seem to believe me, but at this point we didn't have much choice but to continue on. "Ready?"

We kicked forward again. Eventually we took the curve and stopped in a driveway where we didn't have to worry about traffic and could enjoy the slight breeze.

Little Dude was red in the face and had rivulets of sweat dropping from his head and darkening his shirt. "I can't feel my legs, Dad," he said.

"You mean they feel like something is wrong, or they're tired?"

"I think they're just tired."

Dude is sensitive to dehydration, and it was hot that day. He was drinking from his water bottle, but I knew it would be running low by then. We rested a bit in the shade, hands laced behind our heads to allow our lungs to expand wider and take in more oxygen, before I asked him if he was feeling better.

"I just need a minute or two," he said.

"Do you want to call it quits," I said. I figured I could run ahead and get the car, pack up my trike and return for him. He was looking really tired and I was a little concerned about how red his skin had become.

Again he thought a moment before replying, "I'm not going to quit, Dad!"

I can't really describe the pride I felt, seeing him push through his aches and sore legs to keep moving forward on his first real ride on the trike. "I'm not going to quit." He was not taking the easy way out!

"Okay. We'll keep going. Tell me when you're ready!"

We had made it up the steepest, longest part of the ride. We had a relatively flat ride before hitting the downward portion of the trip; our bike computers registered a top speed of a little over 28 miles per hour (or, as he put it, "THAT WAS AWESOME!") At peak speed, we zipped by an older couple sitting on their porch. If Rufus thought we were strange, I couldn't imagine what this couple thought the two overweight guys on these weirdly configured wheeled lawn chairs were doing as the tires hummed along the pavement and the pilots whooped with glee at the air whipping through their hair.

We pulled off the road and stopped next to the car. Despite feeling pretty good, I found standing up to be a challenge; my blood pressure felt like it dropped dramatically when I stood upright for the first time in over an hour. Little Dude asked for a few minutes before having to peel himself out of his seat and disassemble the trike for racking.

I slowly released the pins and quick releases on my trike that held the seat to the frame, disconnected accessories and bags, then steeled myself to get the trike lifted onto the rack. "Take your time, Dude," I said. "You have a lot to be proud of."

I had packed protein bars for us with the intention of stopping at a picnic area on the path to rest, but as our water ran low and there was a threat of impending rain, I nixed the idea. Little Dude had scooped up an empty Red Bull can with the intention of giving it to his Grandfather for recycling and deposit redemption; their house was on the way home, so I figured we'd stop, get more water and have our protein bars while visiting.

Strava said we spent 42 minutes of actual travel time (it pauses automatically if GPS doesn't show us moving) and had climbed 263 feet over a trip of 4.28 miles. I don't think that was bad at all for a first trip out!

We learned a few things from the trip. We remembered to mount the blinking red lights to increase our visibility...but forgot to turn them on (halfway through the trip I activated them.) We also brought helmets, but forgot to actually wear them, which didn't matter quite as much since trikes are a little harder to tip over and state law didn't require helmets for people our age. If there had been more traffic, I would have turned around to grab them. As the situation was...I let it go, figuring we might feel a little cooler with a breeze as the sun was beating down rather hard for the first legs of the journey.

I think we also need to have more water-carrying capability. Next time we're in the shop, I think I'll ask them to install an additional water bottle cage on each trike and we'll go shopping for another insulated bottle. In the meantime I ordered a set of pannier bags for my trike so I'll have increased cargo capability, then I'll see if I can find something that can hold water without fear of spillage in the panniers.

The weather has once again decided to work against us...we've had days of rain, complete with a constant flash flood watch culminating now in a flash flood warning...so we haven't been out riding again. But we do plan to head out again this weekend. The weather is predicted to break and I've been scoping out a possible bike path to try an hour away. Little Dude is looking forward to the next ride, and I have to admit that I'm more than a little anxious to hit the pavement again as well.

I recounted our trip to my wife, and we were both extremely proud of Little Dude and how hard he worked to keep going. He felt bad about the slower pace of the trip, feeling that he was holding us back...but I told him, truthfully, that there was no reason to feel bad. He was working on developing the leg muscles, he wasn't used to riding, and we both had work to do to get more proficient in riding. I had no problem with our pace...the important thing was we did it, and we pushed on. He didn't take the easy road or cut corners.

And more than that, he was still looking forward to the next ride! For now, we're keeping an eye on the forecasts and will have the trikes ready to rack. Allons-y!

Sunday, June 17, 2018

Why the Blog Hiatus?

In the past few years (wow, has it really been a few years already?) things have really changed for me. We (meaning my family) have had some upheavals and trials. For a while it seemed like there was no hint of good news without a wave of misfortune following close behind.

I'm the type of person with a brain wired for routine. I managed to keep some constants, such as recording the Geeking After Dark podcast with regularity, which helped maintain the illusion of control over my own life. I had a friend with contacts that managed to help me find employment after going back to rural PA. And of course there is the constant barrage of terrible that is the state of our country under the current administration, and worse, the support the people of this area show for the ideology of said administration, which does little to quell fear of what other people will do if given the opportunity to show their "true colors" without being held accountable for their actions.

I didn't realize it at the time, but the combination of stresses was taking a toll. Both my physical and mental health were in decline until things reached a point where I couldn't keep turning a blind eye to the situation.

Back in the beginning of January I had a "coronary incident"; my wife took me to the ER in the wee hours of the morning and I spent the next day having tests run (ever have a catheter run through your arm to see if you have a blockage to the heart? Highly not recommended.) It turned out to not be a heart attack, but pericarditis, in which the sac of tissue around the heart becomes inflamed and imitates the first symptoms of a heart attack.

Ironically, I had already agreed to take the preliminary steps for a more thorough admission to a weight clinic at the hospital, and my first appointment was three days after my discharge from the ER.

I've had appointments to address a wide range of issues from the stresses and...while I hesitate to call it this...the psychological trauma that has been building up over the previous two years. I have doctors coordinating across fields from endocrinology to bariatrics to try to make progress on my health. Some of this I've already brought up in the podcasts; some of it I never really talked about because I didn't think it was worth mentioning.

I've been working a lot on my programming in Golang at my new job position, and thoroughly enjoying it. Usually I'd blog some thoughts or tidbits about what I learned. Then one day I looked at this blog and realized I hadn't written anything in months...I couldn't believe how much time had passed while I was in some kind of mental fog.

That isn't to say I haven't made progress. Since January 10th, I've lost 106 lbs. I've gone off several medications while cutting back on others.

Doctors have been forcing me not only to work on eating "healthy" stuff like...vegetables (yuck)...but to exercise more. My current job is heavy on sitting-at-the-desk duties, so I've been using an under-desk pedaling machine to work my legs. As summer approached I started looking for an alternative exercise that was tolerable for my taste and lifestyle; I started investigating recumbent biking. As I type this, there are two recumbent trikes on their way to an almost-local recumbent dealer, earmarked with my (and my son's) names.

While treatment of my various diagnoses has made progress, one side effect has been a diminished passion for side projects. My work on side programming projects has slowed down, but I still have an idea bouncing in my head that I'm thinking of trying to explore.

So, why the blog hiatus? It boils down to the stresses of the past couple of years being addressed, and not realizing how much time had passed. I haven't disappeared. Nothing tragic happened. I've simply changed my focus for a bit. As a result, my health, so far, has been slowly improving, and I've been nursing a new obsession with recumbent triking.

If things continue to go as I hope, I'll get back to blogging more. I'll continue programming and making progress on the learning front. And maybe, just maybe, I'll continue to improve my diet and exercise lifestyle changes with the help of a new recumbent trike!

Monday, March 12, 2018

Golang: Is The Mutex Locked, And Finding The Line Number That Did It

Quick summary of the situation, giving enough details to highlight the problem but not giving proprietary information away...

I have a program that queries a service which in turn talks to a database. The database holds records identified by unique rowkeys. I want to read all of the records the database knows about; I can get the list of rowkeys through an API call, iterating in discrete steps.

The utility I created pulls a batch of these keys, then iterates over them one by one to determine if I want to make a call to the service to pull the whole record (I don't need to if the key was already pulled or was analyzed on a previous run of the program.)

Seems relatively simple, but this is a big database and I'm going to be running this for a long time. Also, these servers are in the same network, so the connections are pretty fast and furious...if I overestimate some capacity, I'm going to really hammer the servers and the last thing I want to do is create an internal DDoS.

To that end, this utility keeps a number of running stats using structs that are updated by the various goroutines. To keep things synced up, I use an embedded lock on the structs.

(Yeah, that's a neat feature...works like this:)

type stctMyStruct struct {
	sync.Mutex // embedding the mutex gives the struct Lock() and Unlock() methods
	intCounter int
}

After that, it's a simple matter of instantiating a struct and using it.

var strctMyStruct stctMyStruct

strctMyStruct.Lock()
strctMyStruct.intCounter = strctMyStruct.intCounter + 1
strctMyStruct.Unlock()

Because the utility is long-running and I wanted to keep tabs on different aspects of performance, I had several structs with embedded mutexes being updated by various goroutines. Using timers in separate routines, the stats were aggregated and turned into a string of text that could be printed to the console, redirected to a file or sent to a web page (I wanted a lot of options for monitoring, obviously.)

At some point I introduced a new bug in the program. My local system was relatively slow when it came to processing the keys (it's not just iterating over them...it evaluates them, sorts some things, picks through information in the full record...) and when I transferred it to the internal network, the jump in speed accelerated exposure of a timing issue. The program output...and processing...and web page displaying the status of the utility...all froze. But the program was still running, according to process monitoring.

I first thought it was a deadlock...something was grabbing a lock and never releasing it. But how can I tell if a routine is blocked by a locked struct? Golang does not have a call that will tell you if a mutex is locked, because that would lead to a race condition: in the time it takes to make the call and get the reply, that lock could have changed status.

Okay...polling the state of mutexes is out of the question. But what isn't out of the question is tracking when a request for a lock is granted.

First, I changed the struct so the mutex is a named field instead of an embedded one, giving me a member I could wrap with my own locking methods.

type stctMyStruct struct {
	lock       sync.Mutex
	intCounter int
}

Next I created some methods for the struct to handle the locking and unlocking.

func (strctMyStruct *stctMyStruct) LockIt(intLine int) {
	// Announce that this line wants the lock; chnLocksTracking is a string
	// channel drained by a dedicated logging goroutine.
	chnLocksTracking <- "Requesting lock to strctMyStruct by line " + strconv.Itoa(intLine)

	tmElapsed := time.Now()
	strctMyStruct.lock.Lock()

	// The deferred function runs as LockIt returns, i.e. once the lock has
	// actually been granted, and reports how long the caller waited.
	defer func() {
		chnLocksTracking <- "Lock granted to strctMyStruct by line " + strconv.Itoa(intLine) + " in " + time.Since(tmElapsed).String()
	}()

	return
}

func (strctMyStruct *stctMyStruct) UnlockIt(intLine int) {
	chnLocksTracking <- "Requesting unlock to strctMyStruct by line " + strconv.Itoa(intLine)

	tmElapsed := time.Now()
	strctMyStruct.lock.Unlock()

	defer func() {
		chnLocksTracking <- "Unlock granted to strctMyStruct by line " + strconv.Itoa(intLine) + " in " + time.Since(tmElapsed).String()
	}()

	return
}

LockIt() and UnlockIt() are now methods available on instances of stctMyStruct. When called, each function first sends a string into a channel with a dedicated goroutine on the other end digesting and logging messages; that first message acts as a notification that the caller is about to ask for a change in the mutex.

If the struct is locked, the operation will block. Once it is available, the function returns, and in the process runs the defer function which sends the granted message down the channel along with the elapsed time to get the request granted.
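For reference, here is a minimal sketch of what the other end of that channel could look like; chnLocksTracking and runLockLogger are illustrative names, and the usual fmt, io, os, and time imports are assumed. In my case the goroutine was pointed at a file when I needed to trace the hang.

var chnLocksTracking = make(chan string, 1024)

func runLockLogger(w io.Writer) {
	// Drain the channel and timestamp each event so "requesting" and
	// "granted" messages can be paired up later when tracing a hang.
	for strMessage := range chnLocksTracking {
		fmt.Fprintln(w, time.Now().Format(time.RFC3339Nano), strMessage)
	}
}

Starting it is a one-liner like go runLockLogger(os.Stdout), or hand it an open log file instead.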

How does it know about the line number?

There's actually a library function that can help with that; my problem is that it returns more information than I need, which makes it a little unwieldy on its own. To get around that, I created a small wrapper function.

// GetLine returns the line number of whatever called it by asking
// runtime.Caller to look one frame up the stack.
func GetLine() int {
	_, _, intLine, _ := runtime.Caller(1)
	return intLine
}

If you look at the documentation you can get the specifics of what is returned, but Caller() can unwind the stack from a call by the number of frames you pass as an argument and return the program counter, file name, and line number...in my particular case I'm using one source file, so I only needed the line number.

Using this, you can insert function calls to lock and unlock the structs as needed. I added the methods to each struct that had a mutex or rwmutex. Using them is as simple as:

strctMyStruct.LockIt(GetLine())

This solution provided a way to trace what was happening, but there is a performance cost. Each defer adds a little overhead every time it's called, and I used a lot of locks throughout the program, which added up to a significant performance hit. This technique is good for debugging, but you have to decide if you want to incur the overhead or find a way to compensate for it.

So what was my lock issue?

I set the goroutine monitoring the locks to dump its information to a file and traced the requests against the granted mutex changes. There was a deadlock in a function that summarized aggregated information: a lock near the beginning of the summary was granted, and while pulling other information, the function requested a second lock. That second lock was held by another goroutine, which was itself waiting for the lock held since the beginning of the summarize function.

It was circular resource contention. Function A held a resource that Function B wanted, and Function B held a resource Function A wanted. The solution was to add more granular locking, which added more calls but in the end meant (hopefully) there would be only one struct locked at a time within a given function.
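To make that concrete, here's a hedged sketch of the shape the summarize function ended up taking; strctTotals and strctTimings are illustrative stand-ins for my real stats structs. Each lock is taken, copied from, and released before the next one is requested, so the function never holds two at once.

func summarize() string {
	// Grab the first lock, copy out what we need, and release it immediately.
	strctTotals.LockIt(GetLine())
	intCount := strctTotals.intCounter
	strctTotals.UnlockIt(GetLine())

	// Only now ask for the second lock; nothing else is being held.
	strctTimings.LockIt(GetLine())
	durLongest := strctTimings.durLongest
	strctTimings.UnlockIt(GetLine())

	return "processed " + strconv.Itoa(intCount) + " records, longest call " + durLongest.String()
}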

Lesson learned: when using locks, keep the critical sections as tight and granular as possible, and avoid overlapping locks, or you may end up with a deadlock that Go's runtime won't detect!

Friday, March 2, 2018

On The Importance of Planning A Program

I'm not a professional programmer.

I'm not sure I could even qualify as a junior programmer.

What I have been doing is programming at a level that is above basic scripting, but below creating full applications. I've been churning out command line utilities for system activities (status checking and manipulating my employer's proprietary system, mostly, along with a bevy of Nagios plugins) with the occasional dabbling into more advanced capabilities to slowly stretch what I can accomplish with my utilities.

That said, I've been trying to reflect on my applications after they've been deemed "good enough" to be useful. In a way, I try running a self-post-mortem in hopes of figuring out what I think works well and what can be improved.

I was recently in a position where I had to create a utility and then, months later, got permission to rewrite it. That gave me a unique opportunity to take an application with a specific set of expectations for its output and refactor its workflow in hopes of improving both performance and the information gathered in the process.

For reference, the 10,000 foot view is that I have a large set of data from a large database, and we wanted to dump the contents of that database, using an intermediate service providing REST endpoint API calls, to save each record as a text file capable of being stored and uploaded in another database. A vendor-neutral backup, if you will...all you need is an interpreter that is familiar with the text file format and you could feed the contents back into another service or archive the files offsite.

It seems like this would be a small order. You have a database. You have an API. The utility would get a set of records, then iterate over them and pull records to save to disk.

Only...things are never that simple.

First, there's a lot of records. I realize "a lot" is relative, so I'll just say it's in the 9 digits range. If that's not a lot of records to you, then...good on you. But when you reach that many files, most filesystems will begin to choke, so I think that qualifies as "a lot."

That means I have to break up the files into subdirectories, especially if the utility gets interrupted and needs to restart. Otherwise filesystem lookups would kill performance. Fortunately there's a kind of built-in encoding to the record name that can be translated so I can break it down into a sane system of self-organizing subdirectories.
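Since the real record-name encoding is proprietary, here's a generic stand-in for that decoding step, assuming the crypto/sha1, encoding/hex, and path/filepath imports: hash the rowkey and use the leading characters as two levels of subdirectory so no single directory collects millions of files.

func subdirForKey(strRowKey string) string {
	// Two hex characters per level gives 256 x 256 buckets to spread files over.
	bytSum := sha1.Sum([]byte(strRowKey))
	strHex := hex.EncodeToString(bytSum[:])
	return filepath.Join(strHex[0:2], strHex[2:4])
}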

Great! Straightforward workflow. Get the record names. Iterate to get the record contents. Decode the record name to get a proper subdirectory. Check if it exists. If not, save it.

Oh, there are some records that are a kind of cache...they are referred to for a few days, then drop out of the database. No need to save them.

Not a problem, just add a small step. Get the record names. Iterate to get the record contents. Check whether it's a record we're supposed to archive. If it is, decode the record name to get a proper subdirectory. Check if it exists. If not, save it.

During testing, I discover there are records whose contents cannot be pulled. The database will give me a record name, but when I try to pull it, nothing comes back. That's odd, so I add a tally of these odd names and insert a check for non-200 responses from the API calls.

Then there are records that I can't readily decode. They're too short and end up missing parts needed for the decoding process. At first I write them off as something to tally as an odd record in the logs, but I discover that when I try pulling them, the API call returns an actual record. I take this to the person with institutional knowledge of the database contents, and after examining a sample of the records, they say it looks like the records are from an early time in the company history.

Basically, there's a set of specs that current records should follow, but there are records from days of yore that are valid but don't follow the current specs.

So there are records that should be backed up...but don't follow the workflow, which checks record validity with a few tests before going through the steps of making network calls and adding load to the servers acting as intermediaries for the transfer. To fix this, I insert a new pathway for processing those "odd" records when they're encountered, so they end up being queried, translated and, if they turn out to be full records, saved to an alternative location. The backups are now separated into the set of "spec" records and another "alternative" path.

The problem is that this organic change cascades into a number of other parts of the utility. My tally counts for statistics are thrown off. The running list of queued records has to take into account records flowing into this alternative path. Error logging, which also handled some tallying duties since it was the end of the line for some records, wasn't always recording actual errors; some entries were really notifications that something had happened, which was helpful during tracing and debugging but a problem when it marked certain stats off before the alternative record was processed.

That one organic change in the database contents during the history of the company had implications that totally derailed some of the design of my utility that took into account only the current expected behavior.

In the end, I lost several days of debugging and testing when I introduced fixes that took into account these one-offs and variations. What were my takeaways?

It would be simple to say that I should have spent some days just sketching out workflows and creating a full spec before trying to write the software. The trouble is that I didn't know the full extent of the hidden variations in the database; the institutional knowledge wasn't readily available for perusing when it resides in other people's heads, and those people are often too busy to come up with a list of gotchas I could watch out for in making this utility.

What I really needed to do was create a workflow that anticipated nothing going quite right, and made it easy to break down the steps for processing in a way that could elegantly handle unexpected changes in that workflow.

After thinking about this some more, I realized that it was just experience applied to actively trying to modularize the application. The new version did have some noticeable improvements; the biggest involved changing how channels and goroutines were used to process records in a way that cut the number of open network sockets dramatically and thus reduced the load on the load balancers and servers. Another was changing the way the queue of tasks was handled; as far as the program was concerned, it was far simpler to add or subtract worker routines in this version than in the previous iteration.
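A rough sketch of that worker-pool shape is below, with illustrative names; fetchRecord and saveRecord stand in for the real API call and the save-to-subdirectory step, and the sync import is assumed. The number of workers is what caps the number of simultaneous connections to the intermediary service.

func processKeys(chnKeys <-chan string, intWorkers int) {
	var wg sync.WaitGroup
	for i := 0; i < intWorkers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			// Each worker pulls a key, fetches the full record, and saves it;
			// the real utility also tallies errors and skipped records.
			for strKey := range chnKeys {
				if strRecord, err := fetchRecord(strKey); err == nil {
					saveRecord(strKey, strRecord)
				}
			}
		}()
	}
	wg.Wait()
}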

I'd also learned more about how to break down tasks into functions and disentangle what each did, which simplified tracing and debugging. Granted, there are places where this could still have been improved. But the curveballs introduced as I found exceptions to the expected output from the system, for the most part, just ate time as I reworked the workflow and weren't showstoppers.

I think I could have definitely benefited from creating a spec that broke tasks down and figured out the workflow a bit better, along with considering "what-ifs" when things would go off-spec. But the experience I've been growing in my time making other utilities and mini-applications still imparted improvements. Maybe they're small steps forward, but steps forward are steps forward.

Saturday, January 13, 2018

Regulations and Dieting (and Surgery)

This is a few thoughts that involve something common in the new year; dieting. Well, tangentially diet related.

Part of the cascade of issues I've had in the past few months...thanks, life!...has led to appointments with the rather new bariatric unit at the local hospital. They take an all-in approach, using a team of nutritionists, fitness experts, gastric surgeons, psychologists...the whole nine yards...to create a program with a support system for patients.

Part of the intake process meant reviewing your history. This is where I learned something nifty (beyond the machine that weighs you while zapping you with a current to measure the different kinds of body fat and tissue density, coming up with a profile of the good and bad stuff in your body).

They asked about my past history and I told them about the gastric bypass procedure I underwent many years ago...I believe it was around 2009. April. Somewhere in there. My memory is fuzzy.

At the time, the local hospital system didn't really have a bariatric unit. While they very much seemed to support the idea that if you're fat, most of your illnesses and afflictions are weight-based and you need to lose weight to deserve to get better, they were not well known for the "let's cut parts of the digestive system apart to help lose weight" approach.

There was another hospital, about an hour away from us, that did have a small bariatric surgery unit. They took me into the program, agreed to do the surgery if I lost X amount of weight first, and after reaching that milestone I had the surgery.

Not long after, during the latter phases of physical recovery, I unceremoniously discovered that not only did my surgeon retire, but the hospital killed their bariatric surgery program. There was no notice. There was no letter, no email, no announcement ever reached us. Just...nothing. No more appointments kept.

I soured on the medical system a little more at that point. There was emphasis on how important a support system was...and there is certainly no shortage of the continued feeling that when a doctor looks at you, your weight is first and foremost on their mind when figuring out how much a person is worth.

One day I had a consult about something at the local hospital and they mentioned the bariatric surgery, and how I could get followup at the other hospital.

"We can't," I said. "They shut down their bariatric unit."

"They restarted it a little while ago," they said.

Turns out, with little (read: no) fanfare or notification, they revived their bariatric unit. I have no doubt the doctors I worked with are gone; my surgeon had retired, and I can't imagine the younger doctors stuck around once their specialty had been shut down.

This came at a time when fat people were becoming (medically) profitable. Oh, sure, we're still a huge expense in cardiac care (and in this time the local hospital became a leader in cardiac care), but now some of those costs are being recouped from insurance companies through growing sleep apnea care, diabetes drugs and bariatric surgery. What was once justification for treating people as sub-human was becoming a PR race to open the best fat-care centers, a market that previously belonged to hucksters and easy-diet schemers on television ads.

In other words, upon hearing that the other hospital had re-opened their bariatric unit without any announcement to former patients, I figured it was because it was becoming fashionable and probably profitable to do so. I certainly didn't trust them to give a damn, though. They didn't notify their old patients about it. They expressed no damns about my status. So...screw them.

The annoying thing is that the local hospital decided to focus more and more money into developing a local bariatric/weight loss program. As time went on they moved more staff into specializing on weight care. They repurposed a building just for weight loss. They focused resources on their weight loss center.

But when the topic of weight loss came up with my appointments, the moment my surgery history came up it was suggested I drive another half hour to the other hospital and continue care there.

It was during intake that I finally found out why. During the consult they mentioned something about checking the size of the stomach pouch, as it was obvious I could eat more than I was supposed to be able to. My history came up, and she said something about going to the other hospital.

I recounted my history and my distaste for dealing with a hospital that made it so blatant they didn't give a damn about their patients. She said that she could talk to the surgeon in the local hospital's weight clinic, but she knew what he'd say...no, he wouldn't work with me on it. That was when I learned why.

The government made rules.

See, to make hospitals "accountable" (that's a big buzzword for hospitals now, not just schools!) they were getting evaluated based on patient followup. In this example, I was operated on by hospital A. They had a program they wanted to end, and they did...essentially dumping their patients.

I ended up going to hospital B, my preferred hospital for most medical issues since I only went to A for a procedure B refused to do at the time. But this means that if anything was bariatric-related, B was getting (federally) evaluated for my poor outcome. At some point it seems A was pressured to re-open their bariatric program and make available their resources to old and new patients (although they didn't advertise it...take that as you will.)

That was why I was repeatedly "encouraged" to go to another hospital for some weight treatment followups. It's also why I'm not able to access certain resources at a hospital that in the years following my surgery dumped not insignificant resources into developing a "cutting edge" bariatric unit.

Once again the government is interfering in efforts they don't understand. Or at a minimum lots of hands in the pot have created a system that benefits not the patient, but some other interests, with the net effect of screwing the patient.

In the end I still have to go through their weight clinic, just with some options limited. I get to begin the new year miserably tracking calorie counts and using words like "carbs" and "abs" and "veggies," and dealing with the neuroses that I know will flare up while pursuing the accurate tracking of goals.

Will I be successful? Will I find more reason to distrust and/or outrightly dislike the hospital? Or will I fail miserably? Time will tell. But if you'll excuse me, I have to go prepare a big old egg patty with...egg. Lots of protein. Minimal carbs. Low calorie!

I really miss food.

Friday, December 22, 2017

Golang Web Server: Don't Do This

I still consider myself new to programming. The new job has me creating a lot of small system tools in Go, mostly for augmenting monitoring and for replacing manual API calls made with JQ and CURL with single executables. It's been a wonderful learning experience.

Sometimes I try to add some new features to utilities that are snazzy but also a bit of an experiment.

This is a bit of reflection on the design I originally used and I am not in a mood to pull out layers of source code to show what I had done, especially if no one is asking for it. But I will describe the basic design in an effort to not only avoid implementing it that way again but to warn others not to make the same design pattern mistake.

The utility is mainly a long-running process that is interrogating one of our services for database information. It gets raw data from the database, pulls some stats like record size and type, and tallies the information. Millions and millions of records.

What if, I thought, I provided a peek into what the state of the tallying is beyond what I already had showing? It would output a count of some basic information as a one-liner every thirty seconds to the console, but that wasn't good enough. I thought, why not create a web interface that would output a simple text page of information?

Go loves channels. And I had several "worker goroutines" that handled specific tasks in the tally program, passing messages to a coordination process that serialized scheduling record analysis, directing results, and monitoring the state of various workers. Breaking them up made things pretty fast once I stuck in a few tweaks here and there.

Adding a web server routine wasn't hard. Then I thought, I could just add a couple of channels to plug them into routines that held statistics.

Here's where I made what later turned into a mistake.

Instead of individual handlers, I created a single handler that took message strings via channels. The messages consisted of a random ID and a type, where the type was the page request.

The reader on the other side of the channel split the message, used a select{} to determine which page it should construct, and returned the page through another channel with that ID string prepended. The receiver on the other side would look at each message and see if the ID belonged to its request. If it wasn't the proper ID, it re-fed the message into the channel, hoping that the right recipient would pick it up later and that the next message in the channel would be meant for this particular reader. Line by line the page was fed back down the channel, with the ID attached to each message, until the ID arrived attached to an "END OF PAGE" message, at which point the page was done and the connection closed.

Don't do that.

The thing is, this seemed to work. I opened a web browser, opened the page, and it worked. I could request the different pages and it worked just fine.

It worked until one page got kind of big and I opened two web pages to the server. Something seemed to get "stuck." One of my statuses gave a snapshot of the fill state of some channels and I noticed some of the web-related channels were...throbbing? Growing huge and slipping down, as if revving up with more lines of messages than should possibly be needed. Something was getting misdirected and the lightweight speed of goroutines meant it was flooding channels with useless information.

No problem, I thought. I'll add a third field, a counter, which once it reached a certain level would simply discard the message. The web page was meant to be read by a person who was trying to get some stats on the status of this utility while it was running, not the general public...refresh the page, hopefully you'll get a working reply that time. Sloppy, but might work.

Tested again. It seemed to keep the channels from getting as clogged up, but I still had some kind of crosstalk when pages grew larger, and it wasn't hard to create a kind of denial of service on the web server when two different pages were opened. It almost seemed as if the two tabs sometimes got completely confused about which one was supposed to get which page.

Maybe it was too easy to get messages mixed up because pages were feeding line by line. I went through the page composition and instead of feeding each line through, I had the process create one big string and feed the result.

This cut down on responsiveness but increased reliability. Kind of. It was significant, but not enough to be proud of. If anyone tried pulling a web page from the utility while someone else used it there was a non-zero chance it would get a weirdly formatted page, if not a timeout.

After finishing some work on other utilities, I decided to refactor the 4 web pages into their own handlers with separate functions and move some of the information being read into global structs with mutexes for protection. Before making the change I ran a test with Bombardier, a handy web server throughput tester. The channel-handler architecture totally choked under the test.

I refactored, separated out the page composition into individual handlers, and eliminated channels for web page feeding. No more IDs. No more parsing out replies. No more tracking how many times this particular message is making rounds before "expiring" it.

Bombardier hammered away on the server with no issues. Multiple tabs reading different web pages? No problem. The biggest trigger for problems, clicking back or a link to one of the other pages while a large page hadn't finished rendering, was no longer a problem.

What I wanted to do was find a way to read a URL request and use one handler to interpret what the client wanted, so I didn't need a number of individual handlers defined. I'm pretty sure I still could do that, but I think the weakness was in using channels with an associated ID to parse replies back to the client from a dedicated goroutine holding stats.

The solution I ended up using was individual functions that read from a global struct holding the current state of statistics, and this was protected with a lot of locking.
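Here's a minimal sketch of that shape, with illustrative names and the fmt, net/http, and sync imports assumed: each page gets its own handler, and each handler copies a snapshot out of a shared stats struct guarded by a sync.RWMutex instead of round-tripping messages through channels.

type stctStats struct {
	sync.RWMutex
	intRecordsSeen int
	intBytesTotal  int64
}

var strctStats stctStats

func handleSummary(w http.ResponseWriter, r *http.Request) {
	// Hold the read lock only long enough to copy the values out.
	strctStats.RLock()
	intSeen := strctStats.intRecordsSeen
	intBytes := strctStats.intBytesTotal
	strctStats.RUnlock()

	fmt.Fprintf(w, "records seen: %d\nbytes tallied: %d\n", intSeen, intBytes)
}

Each page then just gets registered in main() with http.HandleFunc("/summary", handleSummary) and so on for the other pages.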

I suppose another way to do it, with channels, would be finding a way to spawn dedicated channels with each request so the replies didn't need parsing or redirecting; a channel with multiple readers has no guarantee of who is going to get the message at what point. This kind of fix seemed needlessly complicated, though.

I suppose I could also have enhanced the global statistics struct to have functions associated with it, so calls could be made that would automatically lock and reply with the information requested by callers. The utility is relatively small, though, and I thought that implementing that would have been more complicated than necessary. I'm not sure it would improve the speed of the program, but it may be worth trying for the learning benefit.
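For what it's worth, that accessor-method idea would only be a few lines on top of the struct sketched above; Snapshot is a hypothetical name.

func (strctS *stctStats) Snapshot() (int, int64) {
	// Callers never touch the mutex directly; they just ask for a copy.
	strctS.RLock()
	defer strctS.RUnlock()
	return strctS.intRecordsSeen, strctS.intBytesTotal
}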

But what I definitely now know is not to pass web pages as composed lines, each tagged with an ID, down a shared channel for a reader to parse and decide, "Is this line meant for me? No? Here, back into the channel you go, floating rubber ducky of information, while I read the next ducky...float away!"

Don't do that.

Sunday, November 26, 2017

StackOverflow and Newcomers

Stackoverflow (SO) is the premier question and answer site for programmers. It's a joke now that when SO goes down, programmers go home because no work can get done. It is their mission to make life better for programmers, and the men and women working behind the scenes at SO have poured much sweat and tears into growing a useful community where programmers share solutions to the various problems encountered in their algorithm-laden lives.

That is not to say there aren't issues, though. As the site has grown (and it is a bit on the huge side now) SO has had to make decisions that define (and refine) the site's character, and not all of these decisions have passed without detractors. They have also had to try addressing criticism of the site, and one of the most common criticisms seems to be related to how (un)welcoming the site can be for newcomers.

I think I can relate to this. I am not a programmer by trade, but I do try to create useful utilities for use in my day job and enjoy programming in at least a hobbyist capacity. I am not very confident in my abilities, though, and definitely do not need someone to remind me of an obvious skill gap (why do you think I'm asking the question in the first place?)

I do not have the answers regarding how to make SO more welcoming to beginners. Perhaps once a community grows to a certain point it naturally fractures into strata of people who are skilled to a point where they aren't aware of their own bias against less-experienced individuals. Or maybe there are rules in the system that encourage what one person interprets as a "man up, you snowflake!" mentality while an insecure individual interprets the same feedback system as validation that they don't have what it takes to join their programming peers.

I suppose that when so much of the technology culture centers on a "brogrammer" mentality rife with competition, using knowledge and perceived cleverness as a ranking system, it's natural for some snark to become ingrained in interactions among programmer peers. It's not hard, when reading some comments and answers to an SO question, to sense a tone of judgement: that the questioner must pass some bar of having earned an answer before they may have one, something beyond the basic expectation of searching the site for the same problem before posting a duplicate.

There have been cases where people will take more time to criticize the questioner than it would have taken to edit or refine the question into something useful and post an answer.

Sometimes it seems you can do everything right but still fall short in someone's judgement; the ability to down vote a question while leaving no constructive feedback and incurring no penalty in the process (except to the question-asker) seems like a pretty obvious way to discourage interacting with the community for help.

Note that I'm not saying down votes are necessarily bad, although I do wonder if alternative feedback methods could be useful. I'm saying that one of the more frustrating interactions on the site, in my experience, stems from being penalized and not knowing why; if you down vote, maybe you should have to leave some constructive feedback or enhancement to fix the problem or take some penalty to your own Internet-points reputation score.

For example, I recently had trouble with an intermittent panic when exiting a Go utility and posted to StackOverflow for help. I posted a title that succinctly summarized the issue. I posted the panic message. I posted the function definition. The panic had a line number from the definition that seemed to trigger the intermittent error; I posted the specific "line X is..." followed by the line of code so there was no question what snippet triggered the panic. I tagged it with appropriate tags. There were a couple of comments, and I posted a link to another question citing some code to explain (justify?) why I implemented the function call the way I did. What happened?
I took two down votes of penalty to my reputation.

In the comments I asked if the down voters could explain what I could do to improve the question for future reference. After all, SO may be for answering questions related to your immediate problems, but it's also supposed to be of use to future questioners looking to solve similar problems. Last time I checked no one explained why they did it.

The nearest I got to helpful feedback on the down votes was from one of the helpful people who submitted an answer to my question; that person speculated that it was because I had not RTFM'd to the satisfaction of some of the other users, since the problematic line was right there in the panic and the source code for a function call used in my definition showed it probably didn't like a nil context parameter.

So as a relatively insecure beginner, I crafted a question with lots of context, source code, and clarification, only to get dinged with damage (negative reputation) by anonymous clicks from people who couldn't leave a reason why or offer feedback on improving the reference value of the question.

It shouldn't be difficult to understand why this would be discouraging to some people, especially when the goal (I thought) was to build a useful reference for many people, not (possibly) to penalize someone for failing to clear some arbitrary RTFM bar before being blessed with enough community membership to be assisted without a passive-aggressive backhand.

I don't count myself as a detractor of StackOverflow. I have found help from members of their community to be invaluable. I do wonder if some of the feedback mechanisms sometimes encourage certain behaviors that deter less experienced and less thick-skinned programmers from interacting, while enabling programmers with the "rock star" or "ninja brogrammer" mindset to set a less friendly tone. There comes a point where it's less commiserating and sharing with a community and more a necessary chore to solve a problem, and I suspect the gray area of that transition is where new users begin complaining about the tone of the site.

Friday, November 3, 2017

Turning 40

I turned 40 this week.

Four decades. I remember there was a time I thought I'd grow up to "die alone as a hermit in the woods." I remember thinking maybe working as a programmer for Microsoft would be interesting. There was a time I thought I might become a marine biologist, specifically an ichthyologist, and study sharks. Later on I even flirted with the notion of working to become a successful author.

Today I'm not working for Microsoft. I don't live in the woods, although the town I reside in is rapidly withering economically and some might argue our tiny dot on the map is not far removed from being woodland. I don't even own diving equipment and am nowhere near the ocean (although we do live on a river that ends in the ocean, if you want to travel a few hundred miles.) The closest I've come to becoming an author was finishing and editing exactly one manuscript.

I'm pretty sure, at this point, that I have depression issues. I know it's more common today for people to talk about depression. For some people it is dismissed as an excuse of the week, or they brush it off as a "feeling blue" thing that you can exercise away or "just cheer up" to move past; "Just cheer up!" they say, totally ignoring that clinical depression is a thing.

While this little shadow has always been lingering to some degree in the back of my mind, some things have really raised it to prominence in the past few years. It would take chapters of a book to cover the details, but the highlight reel would include attempts by my wife's employer to eliminate her from her job using what could (in my view, as this is my opinion) charitably be labeled slanderous accusations. That was a year-long ordeal that took a huge emotional and financial toll on the family.

After that drawn out mess, things finally felt like they were turning around. There was a light at the end of the tunnel! Unfortunately, it was a train's headlamp.

The employer I had come to rely on for emotional and financial support decided to terminate my contract, which is a nice way of saying I was sent home with a box of my belongings. Now it was my turn to plunge into a world of uncertainty, doubt, and the five stages of grief. I was blindsided and even the act of getting out of bed felt like fighting a dark shroud squeezing the life out of me.

Worse yet, if you feel like taking a moral stance and voicing support for teachers in the never-ending fight over contracts, even if your family has been working in public education for decades, even if you do this by pointing out actual evidence straight from the faces of the people you feel are in the wrong, you might want to think twice if this takes place in a town that is turning into the economic equivalent of a mummy and you might have to return and look for a job. I made some statements that gained some traction among certain circles here; at the time I felt safe in the idea that my employment was secure in the land of gummy bears and unicorns. The reversal of fortune played right into the hands of depression's self-doubt and uncertainty, whispering that "they" were laughing at my incompetence as I searched for job openings in a town propped up by Wal-Mart, McDonalds, a hospital system and a public education system whose administration and board were not pleased with me for writing something that was popular for a couple of days among their staff.

I also experienced firsthand the silence from most of the people I had taken for granted as friends and associates from what I eventually came to regard as my "previous life."

These were two major events. I was already dealing with issues and stresses that many others have to deal with in life. These two major events just fanned the depression flames.

Now we have a national problem; we became a Trumpster fire nation. Every day brought a new display of ignorance and people taking pride in how terrible they can be. I don't feel that there's much to act as a counterbalance against the papercuts of negativity he and his followers display.

It's been a long, stressful, painful period of time.

It's also been nearly a year since I started my new job, which gave me some sense of self worth again. Slowly it helped build up some sense of validation that I'm not worthless. I'm not sure if that makes sense or if I'm laying another misplaced sense of power into the hands of something in which I shouldn't emotionally invest. But for now it's there and helping me.

My family has been supportive during this emotional roller coaster, or tried to be. I don't think I quite acknowledge the good they do as much as I focus on negative things that families deal with. That's a side effect of both depression and Aspergian brain wiring, I think. Given the reflection hitting four decades of sentience has triggered, I think I need to continue trying to improve on that behavior.

All of these things have combined into a hazy mire that congealed into a cloud around me, affecting my worldview and keeping me in a perpetual weariness. I thought my birthday, despite being a magic number (I love the number 4, and 10 is a binary number as well as the number of digits on my hands and the number of digits on my feet, and is even, and possesses several other attributes that lend an irrational appreciation in my mind), would be yet another quiet passage marked by some cards and well wishes and soon forgotten. It was even on a Wednesday, my least favorite day for events to occur.

Usually the big booster in looking forward to my birthday is that it is preceded by Halloween. I love the idea of Halloween; the images of trick-or-treating, costume parties, awesome DIY costumes, parades, and horror movies are so much fun for me. But this year was different; the Friday before my birthday brought an announcement that indictments were coming against Trumpster acquaintances! After an anticipation-filled weekend, Monday had people brought in to testify, and we discovered one of his campaign associates had already pled guilty to lying to police and was cooperating with investigators!

We went out for dinner on my birthday with my in-laws and parents. One of the TVs played MSNBC's coverage of Trump's Russian connections and the mounting investigations. I was giddy.

My birthday was also marked by The Daily Show airing an interview with Hillary Clinton. I don't know why that made me happy...I guess because she's the symbol of everything "I told you so" during the Presidential election.

These things fought back against the shroud of depression whispering in my ear, and they ran totally counter to the idea that my 40th birthday would be quiet. They were happy events for me.

There were other, not-so-happy events that marked the birthday-time. Unexpected shocks like the guy who rented a truck and ran over bike riders in downtown Manhattan. Because he wasn't white, it was labeled an act of terrorism, unlike the recent Vegas shooting, in which a white guy killed or wounded around 600 people and the fallout basically amounts to several people going bankrupt from medical bills, the modifications the shooter made to his guns staying perfectly legal, and Congress clutching pearls at the idea that nothing can prevent these things from happening.

Yet another shocking event involved layoffs at a previous employer. I discovered it as oddly worded and vague tweets began floating along my Twitter timeline; today a TechCrunch article gave conflicting details of what had happened. In the end I could only confirm that a relatively large number of people were let go, some of whom I knew and had worked with, so it wasn't just a trimming of the newest hires. In keeping with the "Me me me!" theme, this news caused me to revisit all the despair and hopelessness I felt as my wife drove me home from the apartment after I was told my time there had ended. I empathized with the swirl of confusion and fear these people must now be feeling. I also watched as people who escaped the chopping block echoed support for one another and words of sadness for their departed colleagues. Selfishly, I felt like the bandage had been ripped off an old wound.

I turned 40 this week.

Nothing I thought was going to happen as a teen happened. Getting older settled into a pattern where almost every day blended into the next; mostly unremarkable, smeared with a veneer of depression and frustration, life became a comfortable routine. I expected this to be yet another average day, but this birthday was marked with some surprises. Some good. Some bad. But one thing this birthday wasn't was uneventful.

Wednesday, October 18, 2017

Reflection on Coding

There's a subject I've been thinking about lately. I suppose it's more of a feeling than a topic; I'm not even sure how to put it into words.

I have a vague feeling that I've discussed it before, too, in some form. Either way, maybe writing about it will help get it out of my head.

The best I've managed to do to express this feeling is to frame it as "elegant beauty," or a kind of beauty that comes from expression through the logic of programming.

It's not that this is an entirely new concept. I've often read descriptions of Ruby as poetic, and there are other works that examine questions like whether programming is more art than science, or whether programming is poetry.

Perhaps part of this is my own brain's weird wiring. I sometimes have trouble understanding poetry; good poetry can "work" on so many levels. Clever word use, double entendre, linguistic beats that emphasize points, references to other events and works, parallels to other art forms...I'm sure my wife, an English major, could expound on (and expand) the topic far better than I can.

Programming adds yet another dimension: it is functional. It takes a language, with its own unique grammar and syntax, and processes input into something else. It's an expression of formulas through rules. If you get the syntax wrong, your work won't compile into a finished product. Programming is notoriously unforgiving when straying from the language rules.

And yet programs that take a set of input and produce the same output can still have so much variety!

I suppose a simple example can use the infamous FizzBuzz program. It's a staple of many a coding interview; relatively simple, it has, over time, become almost cliche (and in some circles, despised, depending on the blogs you read and the type of programmer bemoaning how demeaning it is to be asked to demonstrate it...)

The rules are simple; usually some variant of: "Count from 1 to 100. If a number is divisible by 3, print 'Fizz.' If it is divisible by 5, print 'Buzz.' If it is divisible by both 3 and 5, print 'FizzBuzz.' Otherwise, print the number."

The simplest and crudest way to program this is to literally lay out a program that counts from 1 to 100 and uses if statements to output Fizz, Buzz, and FizzBuzz in the appropriate places. It would achieve the goal of the rules, but it would be highly inefficient and inflexible.
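Just as an illustration (this is my own throwaway sketch, not code I'd hold up as an example of anything), that "lay it all out" approach might look something like this, cut off at 15 so it doesn't sprawl across a hundred explicit checks:

// fizzbuzz-literal.go (hypothetical sketch; the real thing would spell out every case up to 100)
package main

import "fmt"

func main() {

 // Check every value explicitly instead of doing any math
 for i := 1; i <= 15; i++ {
  if i == 15 {
   fmt.Println("FizzBuzz")
  } else if i == 3 || i == 6 || i == 9 || i == 12 {
   fmt.Println("Fizz")
  } else if i == 5 || i == 10 {
   fmt.Println("Buzz")
  } else {
   fmt.Println(i)
  }
 }

}

Every new trigger number means rewriting the conditions by hand, which is exactly the inflexibility I mean.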

The next step up might be something like this:

// FizzBuzz
package main

import (
 "fmt"
 "strconv"
)

func main() {

 // Create a loop to count 1 to 100
 for i := 1; i <= 100; i++ {

  // Create a string variable that gets reinitialized each iteration
  var strOutput string
  strOutput = ""

  // Fizz on 3
  if i%3 == 0 {
   strOutput = strOutput + "Fizz"
  }
  // Buzz on 5
  if i%5 == 0 {
   strOutput = strOutput + "Buzz"
  }
  // Otherwise, output the number
  if strOutput == "" {
   strOutput = strconv.Itoa(i)
  }
  // Print the result
  fmt.Println(strOutput)
 }

}

If you know modulo, FizzBuzz is a pretty straightforward logic problem. But what if you don't know about that piece of math?

// fizzbuzz-simple.go
package main

import (
 "fmt"
 "strconv"
)

func main() {

 for a := 1; a <= 100; a++ {

  var strOutput string = ""

  intTmp := a / 3
  if intTmp*3 == a {
   strOutput = "Fizz"
  }

  intTmp = a / 5
  if intTmp*5 == a {
   strOutput = strOutput + "Buzz"
  }

  if strOutput == "" {
   strOutput = strconv.Itoa(a)
  }

  fmt.Println(strOutput)
 }

}

This is probably a little slower...to be honest, I'm not sure whether the compiler would optimize both approaches into similar machine code. But the end result is still the same.
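If I really wanted an answer rather than a guess, Go's testing package has benchmark support. A minimal sketch (the file name and helper names here are mine, purely for illustration) could compare the two checks, run with go test -bench=.:

// fizzbuzz_bench_test.go (hypothetical; assumes it sits in its own package directory)
package main

import "testing"

// sink keeps the results "used" so the calls aren't discarded as dead code
var sink bool

// the modulo check from the first version
func isMultipleMod(a, n int) bool {
 return a%n == 0
}

// the divide-then-multiply check from the second version
func isMultipleDiv(a, n int) bool {
 return (a/n)*n == a
}

func BenchmarkMod(b *testing.B) {
 for i := 0; i < b.N; i++ {
  sink = isMultipleMod(i, 3)
 }
}

func BenchmarkDiv(b *testing.B) {
 for i := 0; i < b.N; i++ {
  sink = isMultipleDiv(i, 3)
 }
}

I haven't actually run the numbers, so I won't pretend to know which one wins.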

The first issue I'd have with the basic implementation is that it's not very modular. It might be better to use a function to determine the fizzing and the buzzing.


// fizzbuzz-func.go
package main

import (
 "fmt"
 "strconv"
)

func main() {

 // Create a loop to count 1 to 100
 for i := 1; i <= 100; i++ {

  // Fizz on 3
  strOutput := CheckMod(i, 3, "Fizz")

  // Buzz on 5
  strOutput = strOutput + CheckMod(i, 5, "Buzz")

  // Otherwise, output the number
  if strOutput == "" {
   strOutput = strconv.Itoa(i)
  }

  // Print the result
  fmt.Println(strOutput)
 }

}

func CheckMod(intCount int, intCheck int, strLabel string) string {

 if intCount%intCheck == 0 {
  return strLabel
 } else {
  return ""
 }

}

This version includes a simple CheckMod() function that returns the supplied label when the count divides evenly by the supplied integer, and an empty string otherwise; now it takes minimal editing to change the numbers that trigger Fizz, Buzz, or FizzBuzz in the output!

And, of course, this still has the same output as the previous versions.

But what if we don't want to keep modifying the source code to alter the Fizz and Buzz triggers? That's simple too.

// fizzbuzz-func-flags.go
package main

import (
 "flag"
 "fmt"
 "strconv"
)

func main() {

 intCountTo := flag.Int("countto", 100, "Count from 1 to this number")
 intFirstNum := flag.Int("firstnum", 3, "First number to label")
 strFirstLabel := flag.String("firstlabel", "Fizz", "First label to substitute")
 intSecondNum := flag.Int("secondnum", 5, "Second number to label")
 strSecondLabel := flag.String("secondlabel", "Buzz", "Second label to substitute")
 flag.Parse()

 // Create a loop to count 1 to x
 for i := 1; i <= *intCountTo; i++ {

  // Fizz on y
  strOutput := CheckMod(i, *intFirstNum, *strFirstLabel)

  // Buzz on z
  strOutput = strOutput + CheckMod(i, *intSecondNum, *strSecondLabel)

  // Otherwise, output the number
  if strOutput == "" {
   strOutput = strconv.Itoa(i)
  }

  // Print the result
  fmt.Println(strOutput)
 }

}

func CheckMod(intCount int, intCheck int, strLabel string) string {

 if intCount%intCheck == 0 {
  return strLabel
 } else {
  return ""
 }

}

Now there are command-line flags that designate the Fizz and Buzz numbers (as well as optional new labels to use in place of Fizz and Buzz) and the number to count to!

Because defaults are baked into the flag variables, running this with no flags set at the command line produces output identical to the previous applications.
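For example (assuming it's invoked with go run), something like

go run fizzbuzz-func-flags.go -countto 15 -firstnum 2 -firstlabel "Even"

should label every second number "Even" and every fifth "Buzz" (the untouched default) on its way to 15, while omitting the flags entirely reproduces the classic 1-to-100 output.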

This version added quite a bit of flexibility to the program, and that flexibility is accessible from the command line by the end user. There is another problem, though; if you intend for an end user to use this application, there should be some sanity checking for the things they can change.

// fizzbuzz-func-flags-errcheck.go
package main

import (
 "flag"
 "fmt"
 "os"
 "strconv"
)

// A struct of flags
type stctFlags struct {
 intCountTo     *int
 intFirstNum    *int
 strFirstLabel  *string
 intSecondNum   *int
 strSecondLabel *string
}

func main() {

 var strctFlags stctFlags

 strctFlags.intCountTo = flag.Int("countto", 100, "Count from 1 to this number")
 strctFlags.intFirstNum = flag.Int("firstnum", 3, "First number to label")
 strctFlags.strFirstLabel = flag.String("firstlabel", "Fizz", "First label to substitute")
 strctFlags.intSecondNum = flag.Int("secondnum", 5, "Second number to label")
 strctFlags.strSecondLabel = flag.String("secondlabel", "Buzz", "Second label to substitute")
 flag.Parse()

 EvalFlags(&strctFlags)

 // Create a loop to count 1 to the flag-supplied maximum
 for i := 1; i <= *strctFlags.intCountTo; i++ {

  // First label (Fizz by default) on its flag-supplied number
  strOutput := CheckMod(i, *strctFlags.intFirstNum, *strctFlags.strFirstLabel)

  // Second label (Buzz by default) on its flag-supplied number
  strOutput = strOutput + CheckMod(i, *strctFlags.intSecondNum, *strctFlags.strSecondLabel)

  // Otherwise, output the number
  if strOutput == "" {
   strOutput = strconv.Itoa(i)
  }

  // Print the result
  fmt.Println(strOutput)
 }

}

func EvalFlags(strctFlags *stctFlags) {

 if *strctFlags.intCountTo <= 0 {

  fmt.Println("-countto must be greater than 0")
  os.Exit(1)
 }

 if *strctFlags.intFirstNum <= 0 {

  fmt.Println("-firstnum must be greater than 0")
  os.Exit(1)
 }

 if *strctFlags.strFirstLabel == "" {

  fmt.Println("-firstlabel must have a text label")
  os.Exit(1)
 }

 if *strctFlags.intSecondNum <= 0 {

  fmt.Println("-secondnum must be greater than 0")
  os.Exit(1)
 }

 if *strctFlags.strSecondLabel == "" {

  fmt.Println("-secondlabel must have a text label")
  os.Exit(1)
 }

 // Done
 return
}

func CheckMod(intCount int, intCheck int, strLabel string) string {

 if intCount%intCheck == 0 {
  return strLabel
 } else {
  return ""
 }

}

Now the application checks that both labels are set to non-empty strings and that all of the numbers are greater than 0. Basic error checking.
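For instance (again assuming go run), passing something nonsensical like -countto 0 should just print "-countto must be greater than 0" and exit with a failure status rather than quietly printing nothing.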

And once again...the output, by default, will match the output of the previous programs!

These are all rather straightforward; none of them really take advantage of features specific to Go, like channels. (The Go Playground implementation from Russ Cox uses them; it's reproduced here:)


package main

import "fmt"

func main() {
 c := generate()
 c = filter(c, 3, "Fizz")
 c = filter(c, 5, "Buzz")
 for i := 1; i <= 100; i++ {
  if s := <-c; s != "" {
   fmt.Println(s)
  } else {
   fmt.Println(i)
  }
 }
}

func generate() <-chan string {
 c := make(chan string)
 go func() {
  for {
   c <- ""
  }
 }()
 return c
}

func filter(c <-chan string, n int, label string) <-chan string {
 out := make(chan string)
 go func() {
  for {
   for i := 0; i < n-1; i++ {
    out <- <-c
   }
   out <- <-c + label
  }
 }()
 return out
}

I should note that I wrote a past blog post that explored the channels implementation above...

The simple FizzBuzz test, in all the forms above, produces the same output, but that output is reached in many different ways. I'm sure there are people who could send in variations that achieve the same end result with entirely different algorithmic logic; each one logical, each one conforming to the strict rules the compiler expects, but each arriving at the same destination by different means.

To understand source code means twisting your brain into understanding how the programmer behind it thinks, and how he or she expresses that thinking within the rules of the programming language's grammar and syntax.

The examples above are a peek into the evolution of my own thinking about how to program a task, and how my thinking in Go has gradually focused on maintainability and flexibility while still accomplishing the goal. I wonder if this is the kind of evolution interviewers look for when hiring programmers...although that's a dangerous thought, considering that the rungs on that ladder of skill can be defined in dangerously arbitrary ways.

I'm still refining my methods of modeling tasks when programming. I'm changing workflows, how I comment, and what I comment. I still occasionally reel back, perplexed, when I see samples of other people's code and have no idea why...or how...they thought the problem through the way they did.

Each sample I write or read is a reflection of the person who wrote it.

Sometimes I wonder what my own reflects about me.