Monday, August 03, 2009

An interesting task

Over on the OzDotNet mailing list, one of the posts was asking if it’s possible to detect if there is an active exception. I believe the purpose is to change the behaviour of a method that is being called from a catch block.

While I suspect the “best” solution is simply to pass a parameter into the function, the idea really hit a “geek spot” somewhere deep inside me. So, without further ado:

// Requires System.Diagnostics (StackTrace, StackFrame) and System.Reflection (MethodBody, ExceptionHandlingClause).
public static bool InCatchBlock()
{
    StackTrace stackTrace = new StackTrace();

    bool inCatchBlock = false;

    foreach (StackFrame stackFrame in stackTrace.GetFrames())
    {
        // Native and dynamic frames have no managed method body to inspect.
        MethodBody body = stackFrame.GetMethod().GetMethodBody();

        if (body != null)
        {
            foreach (ExceptionHandlingClause clause in body.ExceptionHandlingClauses)
            {
                bool isFinally = clause.Flags == ExceptionHandlingClauseOptions.Finally;

                // Is this frame currently executing inside the handler block of a non-finally clause?
                if (!isFinally && stackFrame.GetILOffset() >= clause.HandlerOffset &&
                    stackFrame.GetILOffset() < clause.HandlerOffset + clause.HandlerLength)
                {
                    inCatchBlock = true;
                    break;
                }
            }

            if (inCatchBlock)
            {
                break;
            }
        }
    }

    return inCatchBlock;
}

The function above simply walks the current call stack, checking each frame to see whether its current IL offset sits inside a declared catch handler. It seems to work fine for the limited testing I’ve done.
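
To show how it might be used, here is a minimal usage sketch. It assumes the InCatchBlock method above is accessible (e.g. defined in, or copied into, the same class); LogError and its messages are made up purely for illustration:

using System;

static class Demo
{
    static void LogError(string message)
    {
        // Behave differently when we're being called while an exception is being handled.
        if (InCatchBlock())
            Console.WriteLine("[handling exception] " + message);
        else
            Console.WriteLine(message);
    }

    static void Main()
    {
        LogError("normal call");            // InCatchBlock() returns false here

        try
        {
            throw new InvalidOperationException("boom");
        }
        catch (InvalidOperationException)
        {
            LogError("from a catch block"); // InCatchBlock() returns true here
        }
    }
}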



On a related note, I personally believe that code should not care about where it’s called from; it creates intimate coupling with upstream code, which is more likely than not to create issues in your code.

Tuesday, April 07, 2009

National Broadband Network

Wow, it’s been months.. no, years since this process started. Finally today we got the announcement we were all waiting for. Who is going to build the NBN?? Who won, who can provide the best service??

Well, it turns out that nobody was a winner. The government has cancelled the request for tender process and has decided to go it alone.

So, the plan?? A new Government-owned company that will roll out a new network over the next 8 years: $4.7 billion of initial capital, with a total of $43 billion planned over the full 8 years..

Personally, I think this is a very interesting result. The NBN, along with Voice over IP, the social Internet and mobile communications, will effectively make Telstra’s existing infrastructure obsolete… I guess we will soon see what Telstra plans to do. Will they build a competing network?? Lower prices so they can actually compete?? I do hope this move brings competition to the market, and that the Government’s moves produce a workable, usable network.

Monday, April 06, 2009

A growing shrinking problem

A growing trend around the net lately has been shrinking URLs. This isn’t a new thing; it’s been around for several years thanks to tinyurl and a few other sites.

The purpose of shrinking URLs is to make re-typing addresses easier, to make the links neater and to cut down on space.

Twitter has benefitted massively from shrinking URLs. With such a small limit on message length, it means users can have a URL AND a little bit of info in their message. It’s a win-win.

Unfortunately, it seems that more places are also adopting URL shrinking, in some cases with very little benefit and some big downsides for me.

So, what’s the problem?? I regularly use a little feature that exists in nearly every browser: I like to look at where a link points before deciding if I’ll click on it. See, it’s very easy to have a link labelled “A Nice Site with Puppy Dogs” that really points to www.somebadurl.com. Personally, I’d not click on that link despite the promise of puppy dogs.

Shrinking URLs unfortunately hides the true destination of a hyperlink, and as such means that I am running blind. I have to use my best judgement based on trust: do I trust the person or site that posted the link? In general, this isn’t too bad.

But this is where it’s getting more difficult. Several social media sites are now actively shrinking all URLs posted on their site. These links can be posted by anybody: people I don’t know, people I don’t trust. The result: those sites no longer have my patronage. Sure, I’m only one person, but I’d rather be safe than run the risk of something far nastier.

Tuesday, March 24, 2009

News Flash: Wally has been found

It’s been years, many books, and finally the day has come. Here is Wally

Friday, March 20, 2009

A Quick history of the internet

Anybody who thinks Microsoft haven’t gotten their mojo back only has to stop, grab IE8 and take a look at this video (IE8 not required)..

Thursday, March 19, 2009

IE8 News

Straight from the horse’s mouth via Twitter: “@NickHodge IE8 Final: you'll be able to download it from 3:00am AEST tomorrow. http://www.microsoft.com/ie8”

So, I guess tomorrow morning I’ll be updating my IE8 RC installs.. w00t!!

Wednesday, March 18, 2009

The Joy of Development

One of the most rewarding parts of my job is delivering new software that makes a difference for my clients. It’s part of the reason I’m still working in the industry; it makes me feel good.

The other thing I really enjoy, and something that happened today, is the ability to make a program (or a small part of one) run faster. It was just a simple query, with a few tables, a view and a sub query or two. In one of our environments the query ran blazingly fast, taking only 2-3 seconds. In all of our other environments (one of which is an exact replica from 2 days ago), it took a long time to complete. By a long time, I mean it took more than 15 minutes before I lost patience and stopped the process.

So, what changes did I make? Simple: it was just a restructure of the query. As I mentioned before, there was a sub query, and this sub query was used within the WHERE clause. An example* is:

WHERE Table1.Field1 IN (SELECT Field2 FROM Table2 WHERE Enabled = 'Y')

became:

FROM Table1 INNER JOIN (SELECT Field2 FROM Table2 WHERE Enabled = 'Y') AS Table3 ON Table1.Field1 = Table3.Field2

By moving this into the FROM clause and making it an inner join, I was able to help the optimiser decide to apply the filter earlier in the execution.

The result: every environment now runs the query in under a second.

You may ask yourself, how did I know where to look? The answer is all in the tools you use. Today, I was using Toad, and a simple “Explain” on the query quickly shows you where the execution cost is. SQL Management Studio and many other tools can all provide an execution plan that you can use. There are a few things that you should focus on when looking over an execution plan. The two I focus on the most are Cost and Full Table Scans.

Cost provides you with a figure relative to the whole query about how expensive that operation is. If an operation is excessively expensive then you should try and simplify it.

Full Table Scans generally occur when there are no suitable indexes in place. This means that a filter cannot use an index, so the database instead scans the whole table. As you can imagine, on a large table this can be a very time consuming process.

There is plenty more information available over the web on this topic. This is just one of my favourite (and easy) fixes for a very common database performance issue.

* This is a very simplified example of the actual query

Thursday, March 12, 2009

Pick a Search Engine

A few weeks ago, I woke up early in the morning and made a decision to do something different. For the life of me, I didn’t know what I wanted to do, but I had a need to change something. That day happened to be the same day that Google decided to join the EU bandwagon and complain about Microsoft bundling IE with Windows. This news rubbed me the wrong way; it’s not like Google has been massively affected by this, as they’ve only recently brought out Chrome.

Anyway, after reading that, I made my decision: I was going to try to live a life without Google.

So far, it’s proven a little more difficult than I’d like, not because I rely on them, but more because of habit.

The only habit I’ve successfully changed is my web searching (because I just changed the default search engine in IE8). I’ve tried to break other habits, such as switching to Live Maps instead of Google Maps, but alas, the coverage in Australia is just not as good. Just today, as an example, I did a search for Moore Street in the ACT. For anybody who knows Canberra, you may know that this street is right in the centre of the city. Unfortunately Live Maps still can’t find it.. Luckily, Where Is came to the rescue.

I’m still trying to find my feet in this Google-Free world, and while I’m sure I may never be completely free, I am pleasantly surprised that the world hasn’t come tumbling down yet.

*Yes, I’m well aware my blog is hosted by Google; I’m still not sure if I’m going to relocate it or not.

Google and the Linux desktop… Oh please…

People have been jumping up and down in the vain hope that Google will move into the desktop space. I’ve heard this discussion before, yet here it is again.

It’s an interesting idea. Start by slowly working around vendor lock-in to position yourself with an end-to-end replacement for existing infrastructure, then BAM, put out an OS to free the people.

Unfortunately, I think a lot of people seem to be forgetting a few simple facts. The most important of these is the business that Google is in. Google runs a search engine, but Google the company is in the business of advertising. Yep, that’s right: providing the ability to search is just a way to bring people in to view advertisements.

Google Docs, Gmail and Google Calendar are all the same. Provide a service to bring people in, then serve up some ads. That’s the business Google is in.

All of the articles and discussions I’ve read about Google on the desktop have talked about some sort of Linux distribution. Great, it could be cool. But Google will still want their advertising cake. Are they going to modify their own Linux distribution to include built-in advertising? Are they going to remove the popular email clients, calendaring applications and office applications in order to force users to continue using their online, advertising-supported applications?

The answer to this is clearly no. Google will not give up their revenue stream.

I think a more likely scenario is a web-based desktop: something that keeps users away from the PC-based desktop and all PC-based desktop apps, and keeps them working in the advertising-based world of which Google is clearly the king.

Wednesday, February 11, 2009

There is no Right Join

From very early on in my career, I’ve coded SQL by hand. This is nothing particularly special, but it always astounds me how many people still get confused over the various types of joins.

Over at Code Project, I just stumbled across an article that does a pretty decent job of explaining the various join options, but it did remind me of something very interesting.. See, in my 11 years in the industry, I have never had a need for a right join. Why?? Simply put, Right Joins are just backwards left joins.. If you get your table ordering correct, then right joins don’t exist.

So, I hereby call for an end to useless right joins…

Monday, February 09, 2009

I used to just melt them..

Customised My Little Ponies..

A new phone, and an interesting system

Well, over the weekend I finally took the plunge. I said goodbye to my trusty old JasJam and got myself a HTC Touch Pro.

Along with this new phone, I also changed my telco. I had been with my previous telco for over 11 years, and in general hadn’t had any major issues. That was until I started looking at the Touch Pro. See, the Touch Pro was released exclusively to a single provider here in Oz for 3 months. That 3 month period started back in October last year. Fast forward 4 months, and the phone is still only available at one telco.

I contacted my previous telco through several different methods: via phone, email and in person at their stores. Unfortunately, this is where things started to turn for me. I couldn’t get an answer about this phone. I got everything from “Would you like to just get another phone” to “What’s a Touch Pro”. This really struck me as strange. Nobody seemed to know anything: not their sales people, not their customer support, nobody. Any company that can’t answer a simple question about product availability has some serious problems.

Anyway, back to the new telco. So far I’m reasonably happy, but one thing did strike me as odd. See, as usual, the telco had to do a “credit check”. But the format of the credit check was by far the most interesting I’ve seen. It involved a few quick questions:

  • Are you employed?
  • How long have you been employed?
  • What type of employment?

From this, they managed to approve me for 3 services, so I could pick up 3 phones on plans. Wow, that’s great.. The problem is, the system in no way took into account my income, the plans I was going to get, the monthly repayments or even my other commitments. I even confirmed this with the customer representative: I could get 3 brand new phones, all on $200 plans..

Tuesday, February 03, 2009

CodeCampOz 2009

That’s right folks, CodeCamp is back for another year, and anybody who’s been before knows it’s worth making the trip.

Mitch has announced the details and registration is now open.

Unfortunately, this year I won’t be there. See, it happens to be on my birthday..

Monday, February 02, 2009

Interesting Problem

I came into work this morning to find my inbox full of errors from one of our production applications. This application has been running full-time for a long time. The exception that our app was throwing was:

The security timestamp is invalid because its creation time ('xx/xx/xxxx xx:xx:xx PM') is in the future. Current time is 'xx/xx/xxxx xx:xx:xx PM' and allowed clock skew is '00:05:00'

After a little searching (ok, it took 30 seconds), I stumbled across a great post about this.

A little further digging and my assumption was confirmed: this is actually tied to Kerberos.

Anyway, in this case I didn’t follow the easy solution of changing the binding behaviour; instead I got our systems guys to ensure that all the machines in question have their clocks correctly synchronised.. It seemed like a better long-term solution.
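
For the record, the binding-side workaround I decided against amounts to widening the allowed clock skew on the security settings. The sketch below is only illustrative and rests on my assumptions (a WCF client using message security over a WSHttpBinding; the RelaxClockSkew helper is made up), not a change we actually made:

using System;
using System.ServiceModel;
using System.ServiceModel.Channels;

static class ClockSkewWorkaround
{
    // Returns a copy of the binding with a wider tolerance for clock differences.
    static Binding RelaxClockSkew(WSHttpBinding binding, TimeSpan skew)
    {
        // Convert to a CustomBinding so the security binding element can be adjusted.
        CustomBinding custom = new CustomBinding(binding);
        SecurityBindingElement security = custom.Elements.Find<SecurityBindingElement>();

        if (security != null)
        {
            // Widen the window in which security timestamps are considered valid.
            security.LocalClientSettings.MaxClockSkew = skew;
            security.LocalServiceSettings.MaxClockSkew = skew;
        }

        return custom;
    }
}

The returned binding would then be handed to the ChannelFactory or client in place of the original. Fixing the clocks still feels like the right call, though.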

Wednesday, January 28, 2009

Test Data

I’ve always been a big proponent of using the best test data possible. As a developer, I find that it’s very easy to get lost in the details of implementation, and tend to leave the generation of test data till a later stage of a project.

The problem I find is that by leaving the generation of test data till the end, more often than not, I end up only testing edge cases (which is extremely important), but I tend to not spend enough time generating large quantities of normal data. A result of this is that it’s easy to miss performance problems.

So, as of today, I’m making sure I have sufficient amounts of test data up front. Best of all, I’m focusing on automated data population from various sources. One of my favourite sources for this is Wikipedia. For any large body of text, I find just grabbing a random article is perfect.
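
As a rough illustration of what I mean, here is a minimal sketch of pulling a random Wikipedia article to use as filler text. The class and method names are made up, and the crude tag stripping is only for illustration:

using System;
using System.Net;
using System.Text.RegularExpressions;

static class TestDataSource
{
    // Fetches one random article's HTML and strips it down to rough plain text.
    static string RandomArticleText()
    {
        using (WebClient client = new WebClient())
        {
            // Special:Random redirects to a randomly chosen article.
            string html = client.DownloadString("http://en.wikipedia.org/wiki/Special:Random");

            // Crude tag strip; good enough for filling large text fields with realistic volume.
            return Regex.Replace(html, "<[^>]+>", " ");
        }
    }
}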

Wednesday, January 21, 2009

Annual Noise Cleanout

Well, it’s that time again, yep, I’m cleaning out my list of blogs, Twitter friends and every other bit of noise that I find is just not of any use.

Unfortunately, while examining what I’m currently subscribed to, I noticed something a little disturbing. Once, a long time ago, I had most of the MS Oz PDE team in my feed; as of today, I’m down to one.

I used to find the information coming out of this group extremely useful, in particular information about dev tools, events and new technologies. I’ve found that over time these blogs have reduced to general noise about people’s lives, or have just plain gone quiet. They no longer serve the purpose I subscribed to them for.

This isn’t the only group I’ve cleared out, but it’s certainly one of the most disappointing ones. On the bright side, over the last year, I have picked up a number of feeds from overseas that have replaced the ones I’m now removing.

Ed.